This is the last post on TF for now; hard to believe I finally made it to this one…

0x00. Preface

0x01. Reference

1.0 Understanding TensorFlow functions

1.1 tf.truncated_normal

truncated_normal(
    shape,
    mean=0.0,
    stddev=1.0,
    dtype=tf.float32,
    seed=None,
    name=None
)

Description:
Generates random numbers from a truncated normal distribution; values fall in [mean - 2 * stddev, mean + 2 * stddev] (samples more than two standard deviations from the mean are discarded and redrawn).
Parameters:

Parameter  Required  Type                         Description
shape      Yes       1-D integer tensor or array  Dimensions of the output tensor
mean       No        0-D tensor or number         Mean of the distribution
stddev     No        0-D tensor or number         Standard deviation
dtype      No        dtype                        Output type
seed       No        number                       Random seed; if set, the same random numbers are produced on each run
name       No        string                       Name of the operation

Now create the source file truncated_normal.py under /home/ubuntu:

#!/usr/bin/python

import tensorflow as tf

initial = tf.truncated_normal(shape=[3, 3], mean=0, stddev=1)
print(tf.Session().run(initial))

Then run:
python /home/ubuntu/truncated_normal.py
Result:
A 3 x 3 matrix whose values fall in [-2, 2].
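To see where the [-2, 2] range comes from, here is a hypothetical pure-Python sketch of the same rejection-sampling idea (not TensorFlow's actual implementation): any draw more than two standard deviations from the mean is discarded and redrawn.

```python
import random

def truncated_normal(shape, mean=0.0, stddev=1.0, seed=None):
    """Sample a shape[0] x shape[1] matrix; redraw values beyond 2 stddev."""
    rng = random.Random(seed)
    def draw():
        while True:
            v = rng.gauss(mean, stddev)
            if abs(v - mean) <= 2 * stddev:  # keep only values within 2 stddev
                return v
    return [[draw() for _ in range(shape[1])] for _ in range(shape[0])]

m = truncated_normal([3, 3], mean=0, stddev=1, seed=42)
print(all(-2 <= v <= 2 for row in m for v in row))  # True: every entry is in [-2, 2]
```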

1.2 tf.constant

constant(
    value,
    dtype=None,
    shape=None,
    name='Const',
    verify_shape=False
)

Description:
Builds a constant tensor of the given shape from value.
Parameters:

Parameter     Required  Type                         Description
value         Yes       number or list               Values of the output tensor
dtype         No        dtype                        Element type of the output tensor
shape         No        1-D integer tensor or array  Dimensions of the output tensor
name          No        string                       Name of the tensor
verify_shape  No        Boolean                      Whether to check that shape matches the shape of value; if False and they differ, the output is padded with the last element of value to fill shape

Now create the source file constant.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

a = tf.constant([1, 2, 3, 4, 5, 6], shape=[2, 3])
b = tf.constant(-1, shape=[3, 2])
c = tf.matmul(a, b)

e = tf.constant(np.arange(1, 13, dtype=np.int32), shape=[2, 2, 3])
f = tf.constant(np.arange(13, 25, dtype=np.int32), shape=[2, 3, 2])
g = tf.matmul(e, f)
with tf.Session() as sess:
    print(sess.run(a))
    print("##################################")
    print(sess.run(b))
    print("##################################")
    print(sess.run(c))
    print("##################################")
    print(sess.run(e))
    print("##################################")
    print(sess.run(f))
    print("##################################")
    print(sess.run(g))

Then run:
python /home/ubuntu/constant.py
Result:

a: 2x3 tensor
b: 3x2 tensor
c: 2x2 tensor
e: 2x2x3 tensor
f: 2x3x2 tensor
g: 2x2x2 tensor
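The padding behavior described above (with verify_shape=False, a value with too few elements is padded with its last element) can be mimicked in plain Python. This is only an illustrative 2-D sketch; the function name constant_fill is made up here, not a TensorFlow API:

```python
def constant_fill(value, shape):
    """Flatten value, then pad with its last element until it fills shape (2-D only)."""
    flat = value if isinstance(value, list) else [value]
    need = shape[0] * shape[1]
    flat = (flat + [flat[-1]] * need)[:need]   # pad with last element, then trim
    return [flat[i * shape[1]:(i + 1) * shape[1]] for i in range(shape[0])]

print(constant_fill([1, 2, 3, 4, 5, 6], [2, 3]))  # [[1, 2, 3], [4, 5, 6]]
print(constant_fill(-1, [3, 2]))                  # [[-1, -1], [-1, -1], [-1, -1]]
print(constant_fill([1, 2], [2, 3]))              # [[1, 2, 2], [2, 2, 2]]
```

The last call shows the padding rule: the trailing 2 is repeated until the 2x3 shape is full.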

1.3 tf.placeholder

placeholder(
    dtype,
    shape=None,
    name=None
)

Description:
Creates a placeholder: a value that must be fed with data at execution time.
Parameters:

Parameter  Required  Type                         Description
dtype      Yes       dtype                        Data type of the placeholder
shape      No        1-D integer tensor or array  Dimensions of the placeholder
name       No        string                       Name of the placeholder

Now create the source file placeholder.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

x = tf.placeholder(tf.float32, [None, 10])
y = tf.matmul(x, x)
with tf.Session() as sess:
    rand_array = np.random.rand(10, 10)
    print(sess.run(y, feed_dict={x: rand_array}))

Then run:
python /home/ubuntu/placeholder.py
Result:
A 10 x 10 tensor is printed.
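Conceptually, a placeholder is a named hole in a deferred computation that feed_dict fills at run time. A minimal stdlib sketch of that idea (not real TensorFlow code):

```python
def matmul(a, b):
    """Plain-Python matrix multiply for lists of lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

# The "graph" is just a function of the placeholder; nothing runs yet.
graph = lambda x: matmul(x, x)

# sess.run(y, feed_dict={x: ...}) amounts to calling the function with real data.
feed = [[1.0, 2.0], [3.0, 4.0]]
print(graph(feed))  # [[7.0, 10.0], [15.0, 22.0]]
```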

1.4 tf.nn.bias_add

bias_add(
    value,
    bias,
    data_format=None,
    name=None
)

Description:
Adds the bias term bias to value. It can be seen as a special case of tf.add in which bias must be 1-D, its length must equal the last dimension of value, and its dtype must match that of value.
Parameters:

Parameter    Required  Type        Description
value        Yes       tensor      Data type float, double, int64, int32, uint8, int16, int8, complex64, or complex128
bias         Yes       1-D tensor  Its length must equal the last dimension of value
data_format  No        string      Data format; NHWC and NCHW are supported
name         No        string      Name of the operation

Now create the source file bias_add.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

a = tf.constant([[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]])
b = tf.constant([2.0, 1.0])
c = tf.constant([1.0])
sess = tf.Session()
print(sess.run(tf.nn.bias_add(a, b)))
# print(sess.run(tf.nn.bias_add(a, c)))  # error: bias length 1 != last dimension 2
print("##################################")
print(sess.run(tf.add(a, b)))
print("##################################")
print(sess.run(tf.add(a, c)))

Then run:
python /home/ubuntu/bias_add.py
Result:
Three 3x2 tensors.
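The contrast with tf.add comes down to broadcasting: bias_add insists that the bias length equal the last dimension, while tf.add will broadcast a length-1 tensor across it. A hypothetical plain-Python sketch of that rule:

```python
def bias_add(value, bias):
    """Add a 1-D bias to each row; bias length must equal the last dimension."""
    if len(bias) != len(value[0]):
        raise ValueError("bias length must match the last dimension of value")
    return [[v + b for v, b in zip(row, bias)] for row in value]

a = [[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]]
print(bias_add(a, [2.0, 1.0]))   # [[3.0, 3.0], [3.0, 3.0], [3.0, 3.0]]
try:
    bias_add(a, [1.0])           # length 1 != 2: rejected, unlike tf.add
except ValueError as e:
    print(e)
```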

1.5 tf.reduce_mean

reduce_mean(
    input_tensor,
    axis=None,
    keep_dims=False,
    name=None,
    reduction_indices=None
)

Description:
Computes the mean of the tensor input_tensor.
Parameters:

Parameter          Required  Type           Description
input_tensor       Yes       tensor         Input tensor whose mean is to be computed
axis               No        None, 0, or 1  None: mean over all elements; 0: mean of each column; 1: mean of each row
keep_dims          No        Boolean        Keep the original dimensions, reduced to length 1
name               No        string         Name of the operation
reduction_indices  No        None           Equivalent to axis; deprecated

Now create the source file reduce_mean.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

initial = [[1., 1.], [2., 2.]]
x = tf.Variable(initial, dtype=tf.float32)
init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(tf.reduce_mean(x)))
    print(sess.run(tf.reduce_mean(x, 0)))  # column means
    print(sess.run(tf.reduce_mean(x, 1)))  # row means

Then run:
python /home/ubuntu/reduce_mean.py
Result:
1.5
[ 1.5  1.5]
[ 1.  2.]
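The three axis modes can be reproduced with stdlib Python. A minimal sketch, assuming a 2-D list input (the name reduce_mean is reused for illustration only):

```python
def reduce_mean(t, axis=None):
    """Mean of a 2-D list: over all elements (None), per column (0), or per row (1)."""
    if axis is None:
        flat = [v for row in t for v in row]
        return sum(flat) / len(flat)
    if axis == 0:                                # collapse rows: one mean per column
        return [sum(col) / len(col) for col in zip(*t)]
    return [sum(row) / len(row) for row in t]    # axis == 1: one mean per row

x = [[1.0, 1.0], [2.0, 2.0]]
print(reduce_mean(x))     # 1.5
print(reduce_mean(x, 0))  # [1.5, 1.5]
print(reduce_mean(x, 1))  # [1.0, 2.0]
```

These match the three lines of output above.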

1.6 tf.squared_difference

squared_difference(
    x,
    y,
    name=None
)

Description:
Computes the squared difference of corresponding elements of tensors x and y, i.e. (x - y)^2 element-wise.
Parameters:

Parameter  Required  Type    Description
x          Yes       tensor  One of half, float32, float64, int32, int64, complex64, complex128
y          Yes       tensor  One of half, float32, float64, int32, int64, complex64, complex128
name       No        string  Name of the operation

Now create the source file squared_difference.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

initial_x = [[1., 1.], [2., 2.]]
x = tf.Variable(initial_x, dtype=tf.float32)
initial_y = [[3., 3.], [4., 4.]]
y = tf.Variable(initial_y, dtype=tf.float32)
diff = tf.squared_difference(x, y)
init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(diff))

Then run:
python /home/ubuntu/squared_difference.py
Result:
[[ 4.  4.]
 [ 4.  4.]]
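Element-wise, the operation is just (x - y)^2 applied entry by entry; a minimal stdlib sketch:

```python
def squared_difference(x, y):
    """(x - y) ** 2 element-wise for 2-D lists of equal shape."""
    return [[(a - b) ** 2 for a, b in zip(rx, ry)] for rx, ry in zip(x, y)]

x = [[1.0, 1.0], [2.0, 2.0]]
y = [[3.0, 3.0], [4.0, 4.0]]
print(squared_difference(x, y))  # [[4.0, 4.0], [4.0, 4.0]]
```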

1.7 tf.square

square(
    x,
    name=None
)

Description:
Computes the element-wise square of a tensor.
Parameters:

Parameter  Required  Type    Description
x          Yes       tensor  One of half, float32, float64, int32, int64, complex64, complex128
name       No        string  Name of the operation

Now create the source file square.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf
import numpy as np

initial_x = [[1., 1.], [2., 2.]]
x = tf.Variable(initial_x, dtype=tf.float32)
x2 = tf.square(x)
init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(x2))

Then run:
python /home/ubuntu/square.py
Result:
[[ 1.  1.]
 [ 4.  4.]]
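Likewise, square is element-wise x^2; a minimal stdlib sketch:

```python
def square(x):
    """x ** 2 element-wise for a 2-D list."""
    return [[v ** 2 for v in row] for row in x]

print(square([[1.0, 1.0], [2.0, 2.0]]))  # [[1.0, 1.0], [4.0, 4.0]]
```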

2.0 Understanding TensorFlow classes

2.1 tf.Variable

__init__(
    initial_value=None,
    trainable=True,
    collections=None,
    validate_shape=True,
    caching_device=None,
    name=None,
    variable_def=None,
    dtype=None,
    expected_shape=None,
    import_scope=None
)

Description:
Maintains the state of the graph across executions, e.g. the changing weights of a neural network.
Parameters:

Parameter       Type               Description
initial_value   tensor             Initial value of the Variable; its shape must be specified, otherwise validate_shape must be set to False
trainable       Boolean            Whether to add the variable to the collection GraphKeys.TRAINABLE_VARIABLES (a collection is a kind of global store, unaffected by variable name scopes: saved in one place, retrievable everywhere)
collections     Graph collections  Global store; defaults to GraphKeys.GLOBAL_VARIABLES
validate_shape  Boolean            Whether the variable may be initialized from an initial_value of unknown shape
caching_device  string             Which device is used to cache the variable
name            string             Variable name
dtype           dtype              If set, the initial value is converted to this type
expected_shape  TensorShape        If set, the initial value is expected to have this shape

Now create the source file Variable.py under /home/ubuntu; its contents can be:

#!/usr/bin/python

import tensorflow as tf

initial = tf.truncated_normal(shape=[10, 10], mean=0, stddev=1)
W = tf.Variable(initial)
init_list = [[1., 1.], [2., 2.]]
X = tf.Variable(init_list, dtype=tf.float32)
init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    print("##################(1)################")
    print(sess.run(W))
    print("##################(2)################")
    print(sess.run(W[:2, :2]))
    op = W[:2, :2].assign(22. * tf.ones((2, 2)))
    print("###################(3)###############")
    print(sess.run(op))
    print("###################(4)###############")
    print(W.eval(sess))  # computes and returns the value of this variable
    print("####################(5)##############")
    print(W.eval())  # usage with the default session
    print("#####################(6)#############")
    print(W.dtype)
    print(sess.run(W.initial_value))
    print(sess.run(W.op))
    print(W.shape)
    print("###################(7)###############")
    print(sess.run(X))

Then run:
python /home/ubuntu/Variable.py
Result:

ubuntu@VM-45-55-ubuntu:~$ python /home/ubuntu/Variable.py
##################(1)################
[[-0.05158469  0.42488426 -1.06051874  0.05041981 -0.59257025  0.75912011
   0.13238901  1.4264127   0.3660301  -0.34660342]
 [-0.58076793 -0.34156471  1.80603182 -0.63527924 -1.37761962  0.23985045
  -0.9572925   0.5855329  -1.52534127  0.66485882]
 [ 0.95287526 -0.52085191 -0.6662432   0.92799437 -0.14051931  0.77191192
  -0.40517998  1.15190434 -0.67737275 -0.49324712]
 [ 0.13710392 -0.26966634 -0.31862086  0.62378079  0.99250805  1.79186082
   0.24381292 -0.65113115 -0.31242973  0.96655703]
 [ 1.51818967  1.4847064  -1.04498291 -1.19972205  1.12664723  0.45897952
   1.30146337 -0.07071129  1.28198421 -0.07462779]
 [ 0.06365386 -1.37174654 -0.45393857  0.44872424  0.30701965 -0.33525467
   1.23019528  0.2688064  -0.77721894  1.15218246]
 [ 0.5284161  -0.57362115 -1.31496811  0.557841    1.38116109  1.11097515
   1.79387271  1.03924    -0.43662316  1.2135427 ]
 [ 0.12842607  0.55358696  0.50601929  0.15238616  0.30852544 -0.07885797
  -0.18290153 -0.65053511  0.06731477 -1.81053722]
 [ 0.0353244  -0.61836213 -0.02346812  0.73654675  1.96743298 -1.1408062
   1.58433104 -0.50077403 -1.70408487 -0.78402525]
 [-0.3279908   0.34578505 -0.4665527   0.71424776  0.48050362 -0.6924966
   0.05213421 -0.02890863  1.6275624  -1.1187917 ]]
##################(2)################
[[-0.05158469  0.42488426]
 [-0.58076793 -0.34156471]]
###################(3)###############
[[ 22.          22.          -1.06051874   0.05041981  -0.59257025
    0.75912011   0.13238901   1.4264127    0.3660301   -0.34660342]
 [ 22.          22.           1.80603182  -0.63527924  -1.37761962
    0.23985045  -0.9572925    0.5855329   -1.52534127   0.66485882]
 [  0.95287526  -0.52085191  -0.6662432    0.92799437  -0.14051931
    0.77191192  -0.40517998   1.15190434  -0.67737275  -0.49324712]
 [  0.13710392  -0.26966634  -0.31862086   0.62378079   0.99250805
    1.79186082   0.24381292  -0.65113115  -0.31242973   0.96655703]
 [  1.51818967   1.4847064   -1.04498291  -1.19972205   1.12664723
    0.45897952   1.30146337  -0.07071129   1.28198421  -0.07462779]
 [  0.06365386  -1.37174654  -0.45393857   0.44872424   0.30701965
   -0.33525467   1.23019528   0.2688064   -0.77721894   1.15218246]
 [  0.5284161   -0.57362115  -1.31496811   0.557841     1.38116109
    1.11097515   1.79387271   1.03924     -0.43662316   1.2135427 ]
 [  0.12842607   0.55358696   0.50601929   0.15238616   0.30852544
   -0.07885797  -0.18290153  -0.65053511   0.06731477  -1.81053722]
 [  0.0353244   -0.61836213  -0.02346812   0.73654675   1.96743298
   -1.1408062    1.58433104  -0.50077403  -1.70408487  -0.78402525]
 [ -0.3279908    0.34578505  -0.4665527    0.71424776   0.48050362
   -0.6924966    0.05213421  -0.02890863   1.6275624   -1.1187917 ]]
###################(4)###############
[[ 22.          22.          -1.06051874   0.05041981  -0.59257025
    0.75912011   0.13238901   1.4264127    0.3660301   -0.34660342]
 [ 22.          22.           1.80603182  -0.63527924  -1.37761962
    0.23985045  -0.9572925    0.5855329   -1.52534127   0.66485882]
 [  0.95287526  -0.52085191  -0.6662432    0.92799437  -0.14051931
    0.77191192  -0.40517998   1.15190434  -0.67737275  -0.49324712]
 [  0.13710392  -0.26966634  -0.31862086   0.62378079   0.99250805
    1.79186082   0.24381292  -0.65113115  -0.31242973   0.96655703]
 [  1.51818967   1.4847064   -1.04498291  -1.19972205   1.12664723
    0.45897952   1.30146337  -0.07071129   1.28198421  -0.07462779]
 [  0.06365386  -1.37174654  -0.45393857   0.44872424   0.30701965
   -0.33525467   1.23019528   0.2688064   -0.77721894   1.15218246]
 [  0.5284161   -0.57362115  -1.31496811   0.557841     1.38116109
    1.11097515   1.79387271   1.03924     -0.43662316   1.2135427 ]
 [  0.12842607   0.55358696   0.50601929   0.15238616   0.30852544
   -0.07885797  -0.18290153  -0.65053511   0.06731477  -1.81053722]
 [  0.0353244   -0.61836213  -0.02346812   0.73654675   1.96743298
   -1.1408062    1.58433104  -0.50077403  -1.70408487  -0.78402525]
 [ -0.3279908    0.34578505  -0.4665527    0.71424776   0.48050362
   -0.6924966    0.05213421  -0.02890863   1.6275624   -1.1187917 ]]
####################(5)##############
[[ 22.          22.          -1.06051874   0.05041981  -0.59257025
    0.75912011   0.13238901   1.4264127    0.3660301   -0.34660342]
 [ 22.          22.           1.80603182  -0.63527924  -1.37761962
    0.23985045  -0.9572925    0.5855329   -1.52534127   0.66485882]
 [  0.95287526  -0.52085191  -0.6662432    0.92799437  -0.14051931
    0.77191192  -0.40517998   1.15190434  -0.67737275  -0.49324712]
 [  0.13710392  -0.26966634  -0.31862086   0.62378079   0.99250805
    1.79186082   0.24381292  -0.65113115  -0.31242973   0.96655703]
 [  1.51818967   1.4847064   -1.04498291  -1.19972205   1.12664723
    0.45897952   1.30146337  -0.07071129   1.28198421  -0.07462779]
 [  0.06365386  -1.37174654  -0.45393857   0.44872424   0.30701965
   -0.33525467   1.23019528   0.2688064   -0.77721894   1.15218246]
 [  0.5284161   -0.57362115  -1.31496811   0.557841     1.38116109
    1.11097515   1.79387271   1.03924     -0.43662316   1.2135427 ]
 [  0.12842607   0.55358696   0.50601929   0.15238616   0.30852544
   -0.07885797  -0.18290153  -0.65053511   0.06731477  -1.81053722]
 [  0.0353244   -0.61836213  -0.02346812   0.73654675   1.96743298
   -1.1408062    1.58433104  -0.50077403  -1.70408487  -0.78402525]
 [ -0.3279908    0.34578505  -0.4665527    0.71424776   0.48050362
   -0.6924966    0.05213421  -0.02890863   1.6275624   -1.1187917 ]]
#####################(6)#############
<dtype: 'float32_ref'>
[[ 0.35549659 -0.92845166  0.7202518  -1.08173835 -0.56052214 -1.79995739
  -1.23022497  1.78744531  0.26768067  1.44654143]
 [ 1.00125992  0.88891822 -0.83442372 -0.51755071  0.93480241 -0.62580359
  -0.42888054  0.60265911  0.23383677  0.25027233]
 [ 0.62767732  1.49130106  0.11455932  0.8136881   0.1653619  -0.03023815
  -0.81600904  0.21061133  0.77372617 -1.05311072]
 [ 0.37356022  0.80606896 -0.77602631  1.7510792   1.17032671 -1.59365809
   0.81380212 -0.80985826 -0.5826512  -0.68983918]
 [ 1.5539794  -0.82919389 -0.37634259 -0.04195082  0.00483348 -1.6610924
   1.61947238  0.44739676  0.96909785  0.30437273]
 [-1.67946744  0.13453422  1.16949022 -1.07361639  0.16278958  0.48993936
   0.79800332 -0.59556031  1.02015698  0.61534965]
 [ 1.91761112  0.57116741 -1.32458746 -0.83711451 -0.23092926  0.09989663
  -0.13043015  0.39024881 -0.39114812 -1.34013951]
 [ 0.42324749  1.76086545 -1.64871371 -0.25146225  0.56552815 -0.22099398
   0.3763651  -0.26513788  0.09395658 -0.51482815]
 [-1.58338928  0.34144643 -0.60781646 -0.3217389  -0.36381459 -0.09845187
  -0.86982977  0.56992447  0.35818082 -1.13524997]
 [-1.17181849  0.15299995 -0.94315332  0.3065263  -0.33332458  1.59554768
   0.27707765  0.4924351   1.13253677 -0.55417466]]
None
(10, 10)
###################(7)###############
[[ 1.  1.]
 [ 2.  2.]]
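What the assign at step (3) does (mutating a slice of stored state, which later reads then observe, as in steps (4) and (5)) can be sketched with a tiny stand-in class. This Variable class is hypothetical, not the real tf.Variable API:

```python
class Variable:
    """Minimal stand-in: holds a 2-D list and supports in-place slice assignment."""
    def __init__(self, initial_value):
        self.value = [row[:] for row in initial_value]   # defensive copy
    def assign_block(self, rows, cols, new):
        # Overwrite value[rows, cols] with new, like W[:2, :2].assign(...)
        for i, r in enumerate(range(*rows)):
            for j, c in enumerate(range(*cols)):
                self.value[r][c] = new[i][j]
        return self.value                                # assign returns the updated value

W = Variable([[1.0, 1.0], [2.0, 2.0]])
W.assign_block((0, 1), (0, 1), [[22.0]])
print(W.value)  # [[22.0, 1.0], [2.0, 2.0]]: later reads see the mutation
```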

0x02. Afterword

Emmm… at first this series felt like it was crawling along, and then before I knew it, it was over…

To be continued…