Layers (contrib)

Ops for building neural network layers, regularizers, summaries, etc.

Higher level ops for building neural network layers

This package provides several ops that take care of creating variables that are used internally in a consistent way and provide the building blocks for many common machine learning algorithms.
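For instance, a single call to fully_connected creates its own weight and bias variables and applies a default ReLU activation. A minimal sketch, assuming TensorFlow 1.x with tf.contrib available (the tensor shapes and names are illustrative):

```python
import tensorflow as tf

# Illustrative input: a batch of 784-dimensional vectors.
inputs = tf.placeholder(tf.float32, shape=[None, 784])

# fully_connected creates the weight and bias variables itself and
# applies the default ReLU activation.
hidden = tf.contrib.layers.fully_connected(inputs, num_outputs=256)

# Pass activation_fn=None for a purely linear output layer.
logits = tf.contrib.layers.fully_connected(hidden, num_outputs=10,
                                           activation_fn=None)
```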

Aliases for fully_connected, which set a default activation function, are available: relu, relu6, and linear.
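A sketch of the aliases, under the same TF 1.x assumption; each alias is fully_connected with the activation fixed:

```python
import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=[None, 784])

# relu(x, n) behaves like fully_connected(x, n, activation_fn=tf.nn.relu),
# relu6 uses tf.nn.relu6, and linear applies no activation at all.
h1 = tf.contrib.layers.relu(inputs, 256)
h2 = tf.contrib.layers.relu6(h1, 128)
out = tf.contrib.layers.linear(h2, 10)
```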

A stack operation is also available; it builds a stack of layers by applying a layer repeatedly.
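A minimal sketch of stack, again assuming TF 1.x; the unit counts are illustrative:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])

# Equivalent to chaining three fully_connected layers with 32, 64 and 128 units.
y = tf.contrib.layers.stack(x, tf.contrib.layers.fully_connected, [32, 64, 128])
```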

Regularizers

Regularization can help prevent overfitting. Regularizers are functions with the signature fn(weights); the loss they produce is typically added to the tf.GraphKeys.REGULARIZATION_LOSSES collection.
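A sketch of attaching an L2 regularizer to a layer and recovering the penalty from the collection, assuming TF 1.x; the data loss shown is only a stand-in:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])

# Attach an L2 penalty to the layer's weights; the layer adds the resulting
# loss term to the REGULARIZATION_LOSSES collection.
net = tf.contrib.layers.fully_connected(
    x, 128,
    weights_regularizer=tf.contrib.layers.l2_regularizer(scale=1e-4))

# Collect the penalties and fold them into the training loss.
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
data_loss = tf.reduce_mean(tf.square(net))  # stand-in loss for illustration
total_loss = data_loss + tf.add_n(reg_losses)
```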

Initializers

Initializers are used to initialize variables with sensible values given their size, data type, and purpose.
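A sketch of passing explicit initializers to a layer, assuming TF 1.x:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])

# Xavier/Glorot initialization scales the initial weights by the layer's
# fan-in and fan-out so activations keep a reasonable variance.
net = tf.contrib.layers.fully_connected(
    x, 128,
    weights_initializer=tf.contrib.layers.xavier_initializer(),
    biases_initializer=tf.zeros_initializer())
```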

Optimization

Optimize weights given a loss.
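A sketch using optimize_loss to build a training op, assuming TF 1.x; the model and loss here are placeholders for illustration:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])
y = tf.placeholder(tf.float32, shape=[None, 10])

logits = tf.contrib.layers.fully_connected(x, 10, activation_fn=None)
loss = tf.losses.softmax_cross_entropy(onehot_labels=y, logits=logits)

global_step = tf.train.get_or_create_global_step()

# optimize_loss wires up the optimizer, optional gradient clipping and the
# global-step increment in one call; "Adam" names a built-in optimizer.
train_op = tf.contrib.layers.optimize_loss(
    loss, global_step, learning_rate=0.001, optimizer="Adam",
    clip_gradients=5.0)
```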

Summaries

Helper functions to summarize specific variables or ops.

The layers module defines convenience functions summarize_variables, summarize_weights and summarize_biases, which set the collection argument of summarize_collection to VARIABLES, WEIGHTS and BIASES, respectively.
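A sketch of the summary helpers, assuming TF 1.x; note that summarize_collection only covers variables that were actually added to the named collection:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])
net = tf.contrib.layers.fully_connected(x, 128)

# Histogram summaries for every variable in the graph.
tf.contrib.layers.summarize_variables()

# summarize_weights() / summarize_biases() are shorthand for summarizing
# the WEIGHTS / BIASES collections directly, e.g.:
tf.contrib.layers.summarize_collection(tf.GraphKeys.WEIGHTS)

# The generated summaries are picked up by the usual merge op.
merged = tf.summary.merge_all()
```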

Feature columns

Feature columns provide a mechanism to map data to a model.
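A sketch of defining feature columns and mapping a batch of raw tensors into a dense input layer, assuming TF 1.x; the column names and values are illustrative:

```python
import tensorflow as tf

# Declarative descriptions of how raw columns feed the model.
age = tf.contrib.layers.real_valued_column("age")
country = tf.contrib.layers.sparse_column_with_keys(
    "country", keys=["US", "CA", "MX"])
country_emb = tf.contrib.layers.embedding_column(country, dimension=8)

# Map a batch of feature tensors into a single dense input layer.
features = {
    "age": tf.constant([[25.0], [40.0]]),
    "country": tf.SparseTensor(indices=[[0, 0], [1, 0]],
                               values=["US", "MX"],
                               dense_shape=[2, 1]),
}
net = tf.contrib.layers.input_from_feature_columns(
    features, feature_columns=[age, country_emb])
```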

© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_guides/python/contrib.layers
