Function prototype:
tf.layers.dense(
    inputs,
    units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    trainable=True,
    name=None,
    reuse=None
)
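The layer implements outputs = activation(inputs · kernel + bias): `units` sets the output dimensionality, `activation` applies an optional nonlinearity, and `kernel_initializer` / `bias_initializer` control how the weights start out. A minimal usage sketch, assuming TensorFlow 1.x; the input shape [None, 4], the 128 units, and the name "fc1" are arbitrary choices for illustration:

import tensorflow as tf

# Batch of 4-dimensional feature vectors; the batch size is left unspecified
x = tf.placeholder(tf.float32, shape=[None, 4])
# Fully connected layer with 128 output units and ReLU activation
hidden = tf.layers.dense(x, units=128, activation=tf.nn.relu, name="fc1")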
Example:
Suppose a 4-dimensional input vector (3, 4, 5, 6) is fed through a fully connected layer to produce a 2-dimensional output, with weight vectors (1, 1, 1, 1) and (2, 2, 2, 2); each output unit is the dot product of the input with one weight vector.
In matrix form:
$$
output = input \times W^T =
\left[ \begin{matrix} 3 & 4 & 5 & 6 \end{matrix} \right]
\left[ \begin{matrix} 1 & 2 \\ 1 & 2 \\ 1 & 2 \\ 1 & 2 \end{matrix} \right]
=
\left[ \begin{matrix} 18 & 36 \end{matrix} \right]
$$
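As a quick sanity check, the same product can be reproduced with plain NumPy (a minimal sketch; the variable names x and W are just for illustration):

import numpy as np

x = np.array([[3, 4, 5, 6]], dtype=np.float32)   # shape [1, 4]
W = np.array([[1, 1, 1, 1],
              [2, 2, 2, 2]], dtype=np.float32)   # shape [2, 4]
print(np.matmul(x, W.T))                         # [[18. 36.]]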
import tensorflow as tf
import numpy as np

# Input with shape [batch_size, input_dim] = [1, 4]
inputs = [[3, 4, 5, 6]]
# Weights with shape [units, input_dim] = [2, 4]; each row holds the weights of one output unit
weights = [[1, 1, 1, 1],
           [2, 2, 2, 2]]

input_tf = tf.constant(inputs, dtype=tf.float32)
# The dense kernel expects shape [input_dim, units], so transpose to [4, 2]
weights = np.transpose(weights, (1, 0))
weights_init = tf.constant_initializer(weights)
output_tf = tf.layers.dense(input_tf, 2, kernel_initializer=weights_init)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    output = sess.run(output_tf)
    print(output)
# Output:
# [[18. 36.]]
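Note that tf.layers.dense is deprecated in TensorFlow 2.x, where the same computation is expressed with tf.keras.layers.Dense. The following is a sketch assuming a TF 2.x eager environment, reusing the weight values from the example above:

import numpy as np
import tensorflow as tf

# Kernel with shape [input_dim, units] = [4, 2], same values as above
kernel = np.transpose([[1, 1, 1, 1],
                       [2, 2, 2, 2]], (1, 0)).astype(np.float32)
layer = tf.keras.layers.Dense(2, kernel_initializer=tf.constant_initializer(kernel))
print(layer(tf.constant([[3., 4., 5., 6.]])).numpy())
# Expected: [[18. 36.]]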