tf.contrib.layers.fully_connected explained
Published: 2019-05-25



tf.contrib.layers.fully_connected(F, num_outputs, activation_fn)

F --- the input tensor, with shape [batch_size, images_pixels]

num_outputs --- the number of output units; the output has shape [batch_size, num_outputs]

activation_fn --- the nonlinear activation function to apply. The default is not None (it is tf.nn.relu); if no activation is wanted, it must be explicitly set to None.
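A minimal usage sketch, assuming a TensorFlow 1.x environment (tf.contrib was removed in TensorFlow 2.x); the placeholder name and layer sizes below are illustrative, not from the original post:

import tensorflow as tf

# A batch of flattened 28x28 images: [batch_size, 784] (illustrative shape).
F = tf.placeholder(tf.float32, [None, 784])

# Hidden layer: 256 units with the default ReLU activation.
hidden = tf.contrib.layers.fully_connected(F, 256)

# Output layer: 10 linear units. Because activation_fn defaults to
# tf.nn.relu, it must be set to None explicitly to get raw logits.
logits = tf.contrib.layers.fully_connected(hidden, 10, activation_fn=None)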


API reference

tf.contrib.layers.fully_connected

tf.contrib.layers.fully_connected(
    inputs,
    num_outputs,
    activation_fn=tf.nn.relu,
    normalizer_fn=None,
    normalizer_params=None,
    weights_initializer=initializers.xavier_initializer(),
    weights_regularizer=None,
    biases_initializer=tf.zeros_initializer(),
    biases_regularizer=None,
    reuse=None,
    variables_collections=None,
    outputs_collections=None,
    trainable=True,
    scope=None
)


Adds a fully connected layer.

fully_connected creates a variable called weights, representing a fully connected weight matrix, which is multiplied by the inputs to produce a Tensor of hidden units. If a normalizer_fn is provided (such as batch_norm), it is then applied. Otherwise, if normalizer_fn is None and a biases_initializer is provided, then a biases variable is created and added to the hidden units. Finally, if activation_fn is not None, it is applied to the hidden units as well.
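In rough pseudocode, that order of operations looks like the following. This is a conceptual sketch only, assuming TF 1.x; variable creation and rank handling are omitted, and fully_connected_sketch is a made-up name, not the real implementation:

import tensorflow as tf

def fully_connected_sketch(inputs, weights, biases,
                           normalizer_fn=None, normalizer_params=None,
                           activation_fn=tf.nn.relu):
    # Hidden units: inputs [batch_size, depth] x weights [depth, num_outputs].
    hidden = tf.matmul(inputs, weights)
    if normalizer_fn is not None:
        # A normalizer such as batch_norm replaces the biases entirely.
        hidden = normalizer_fn(hidden, **(normalizer_params or {}))
    elif biases is not None:
        # Biases [num_outputs] are only added when no normalizer is given.
        hidden = tf.nn.bias_add(hidden, biases)
    if activation_fn is not None:
        # The activation is applied last.
        hidden = activation_fn(hidden)
    return hidden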

Note: if inputs has a rank greater than 2, then inputs is flattened prior to the initial matrix multiply by weights.

Args:

  • inputs: A tensor of at least rank 2 with a static value for the last dimension, e.g. [batch_size, depth] or [None, None, None, channels].
  • num_outputs: Integer or long, the number of output units in the layer.
  • activation_fn: Activation function. The default value is a ReLU function. Explicitly set it to None to skip it and maintain a linear activation.
  • normalizer_fn: Normalization function to use instead of biases. If normalizer_fn is provided, then biases_initializer and biases_regularizer are ignored and biases are neither created nor added. Defaults to None, i.e. no normalizer function.
  • normalizer_params: Normalization function parameters.
  • weights_initializer: An initializer for the weights.
  • weights_regularizer: Optional regularizer for the weights.
  • biases_initializer: An initializer for the biases. If None skip biases.
  • biases_regularizer: Optional regularizer for the biases.
  • reuse: Whether or not the layer and its variables should be reused. To be able to reuse the layer, scope must be given (see the sketch after the Raises section).
  • variables_collections: Optional list of collections for all the variables or a dictionary containing a different list of collections per variable.
  • outputs_collections: Collection to add the outputs.
  • trainable: If True also add variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
  • scope: Optional scope for variable_scope.

Returns:

The tensor variable representing the result of the series of operations.

Raises:

  • ValueError: If inputs has rank less than 2, or if its last dimension is not set.
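To tie several of these arguments together, here is a hedged sketch (again assuming a TF 1.x environment; the layer size, regularizer scale, and scope name are arbitrary choices) that swaps biases for batch normalization and then reuses the layer's variables on a second input:

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 128])
is_training = tf.placeholder(tf.bool)

# batch_norm replaces the biases: biases_initializer and
# biases_regularizer are ignored when a normalizer_fn is given.
h = tf.contrib.layers.fully_connected(
    x, 64,
    normalizer_fn=tf.contrib.layers.batch_norm,
    normalizer_params={'is_training': is_training},
    weights_regularizer=tf.contrib.layers.l2_regularizer(1e-4),
    scope='fc1')

# A second call with reuse=True shares fc1's variables; reuse
# requires an explicit scope.
x2 = tf.placeholder(tf.float32, [None, 128])
h2 = tf.contrib.layers.fully_connected(
    x2, 64,
    normalizer_fn=tf.contrib.layers.batch_norm,
    normalizer_params={'is_training': is_training},
    scope='fc1', reuse=True)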


Reposted from: http://mifti.baihongyu.com/
