https://www.tensorflow.org/api_docs/python/tf/contrib/layers/fully_connected

fully_connected:

1. First compute the output from the weights (the matrix multiply by `weights`).

2. Apply the normalizer_fn to that output (e.g. batch_norm).

3. Apply the activation_fn to that output (e.g. relu).
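
A minimal usage sketch of these three steps (TF 1.x graph mode; the placeholder shapes and unit counts below are made up for illustration):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 128])   # [batch_size, depth]

# Step 1 + step 3: weights matmul, then the default ReLU activation.
h = tf.contrib.layers.fully_connected(x, num_outputs=64)

# Steps 1-3: matmul -> batch_norm -> ReLU. Because a normalizer_fn is given,
# no biases variable is created.
is_training = tf.placeholder(tf.bool)
h_bn = tf.contrib.layers.fully_connected(
    x,
    num_outputs=64,
    normalizer_fn=tf.contrib.layers.batch_norm,
    normalizer_params={'is_training': is_training},
    activation_fn=tf.nn.relu)

# Purely linear layer: skip both normalizer and activation.
logits = tf.contrib.layers.fully_connected(x, num_outputs=10, activation_fn=None)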

Defined in tensorflow/contrib/layers/python/layers/layers.py.

def fully_connected(inputs,
                    num_outputs,
                    activation_fn=nn.relu,
                    normalizer_fn=None,
                    normalizer_params=None,
                    weights_initializer=initializers.xavier_initializer(),
                    weights_regularizer=None,
                    biases_initializer=init_ops.zeros_initializer(),
                    biases_regularizer=None,
                    reuse=None,
                    variables_collections=None,
                    outputs_collections=None,
                    trainable=True,
                    scope=None):
  """Adds a fully connected layer.

  `fully_connected` creates a variable called `weights`, representing a fully
  connected weight matrix, which is multiplied by the `inputs` to produce a
  `Tensor` of hidden units. If a `normalizer_fn` is provided (such as
  `batch_norm`), it is then applied. Otherwise, if `normalizer_fn` is
  None and a `biases_initializer` is provided then a `biases` variable would be
  created and added the hidden units. Finally, if `activation_fn` is not `None`,
  it is applied to the hidden units as well.

  Note: that if `inputs` have a rank greater than 2, then `inputs` is flattened
  prior to the initial matrix multiply by `weights`.

  Args:
    inputs: A tensor of at least rank 2 and static value for the last dimension;
      i.e. `[batch_size, depth]`, `[None, None, None, channels]`.
    num_outputs: Integer or long, the number of output units in the layer.
    activation_fn: Activation function. The default value is a ReLU function.
      Explicitly set it to None to skip it and maintain a linear activation.
    normalizer_fn: Normalization function to use instead of `biases`. If
      `normalizer_fn` is provided then `biases_initializer` and
      `biases_regularizer` are ignored and `biases` are not created nor added.
      default set to None for no normalizer function
    normalizer_params: Normalization function parameters.
    weights_initializer: An initializer for the weights.
    weights_regularizer: Optional regularizer for the weights.
    biases_initializer: An initializer for the biases. If None skip biases.
    biases_regularizer: Optional regularizer for the biases.
    reuse: Whether or not the layer and its variables should be reused. To be
      able to reuse the layer scope must be given.
    variables_collections: Optional list of collections for all the variables or
      a dictionary containing a different list of collections per variable.
    outputs_collections: Collection to add the outputs.
    trainable: If `True` also add variables to the graph collection
      `GraphKeys.TRAINABLE_VARIABLES` (see tf.Variable).
    scope: Optional scope for variable_scope.

  Returns:
    The tensor variable representing the result of the series of operations.

  Raises:
    ValueError: If x has rank less than 2 or if its last dimension is not set.
  """
  if not isinstance(num_outputs, six.integer_types):
    raise ValueError('num_outputs should be int or long, got %s.' %
                     (num_outputs,))

  layer_variable_getter = _build_variable_getter({
      'bias': 'biases',
      'kernel': 'weights'
  })

  with variable_scope.variable_scope(
      scope,
      'fully_connected', [inputs],
      reuse=reuse,
      custom_getter=layer_variable_getter) as sc:
    inputs = ops.convert_to_tensor(inputs)
    layer = core_layers.Dense(
        units=num_outputs,
        activation=None,
        use_bias=not normalizer_fn and biases_initializer,
        kernel_initializer=weights_initializer,
        bias_initializer=biases_initializer,
        kernel_regularizer=weights_regularizer,
        bias_regularizer=biases_regularizer,
        activity_regularizer=None,
        trainable=trainable,
        name=sc.name,
        dtype=inputs.dtype.base_dtype,
        _scope=sc,
        _reuse=reuse)
    outputs = layer.apply(inputs)

    # Add variables to collections.
    _add_variable_to_collections(layer.kernel, variables_collections, 'weights')
    if layer.bias is not None:
      _add_variable_to_collections(layer.bias, variables_collections, 'biases')

    # Apply normalizer function / layer.
    if normalizer_fn is not None:
      if not normalizer_params:
        normalizer_params = {}
      outputs = normalizer_fn(outputs, **normalizer_params)

    if activation_fn is not None:
      outputs = activation_fn(outputs)

    return utils.collect_named_outputs(outputs_collections, sc.name, outputs)
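
Stripped of the internal helpers (`_build_variable_getter`, `_add_variable_to_collections`, `utils.collect_named_outputs`), the body above reduces to "Dense with no activation, then optional normalizer, then optional activation". A rough sketch of that flow against the public TF 1.x API, for illustration only (not the actual contrib implementation):

import tensorflow as tf

def fully_connected_sketch(inputs, num_outputs,
                           activation_fn=tf.nn.relu,
                           normalizer_fn=None,
                           normalizer_params=None):
  # Step 1: linear transform. Biases are created only when no normalizer_fn
  # is given, mirroring `use_bias=not normalizer_fn and biases_initializer`.
  outputs = tf.layers.dense(inputs, num_outputs,
                            activation=None,
                            use_bias=normalizer_fn is None)
  # Step 2: the normalizer (e.g. batch_norm) takes the place of the bias.
  if normalizer_fn is not None:
    outputs = normalizer_fn(outputs, **(normalizer_params or {}))
  # Step 3: the activation is applied last, after normalization.
  if activation_fn is not None:
    outputs = activation_fn(outputs)
  return outputs

Note that `activation=None` is passed to the Dense layer itself, so the activation always runs after the normalizer; this is what lets batch_norm see the pre-activation values.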
