Exercise: train a neural network on a simulated dataset to solve a binary classification problem

# 2018-06-24, Sunday, week 25, day 175 of the year, SZ
'''
Train a neural network on a simulated dataset; the network solves a binary classification problem.

'''

import tensorflow as tf 
from numpy.random import RandomState  # NumPy is a scientific computing package; used here to generate random numbers
# Each training batch contains 8 samples
batch_size = 8
# Declare the weights: w1 is a random 2x3 matrix and w2 a random 3x1 matrix, both with standard deviation 1 and random seed 1 (the seed makes runs reproducible)
w1 = tf.Variable(tf.random_normal([2,3], stddev = 1, seed =1))
w2 = tf.Variable(tf.random_normal([3,1], stddev = 1, seed = 1))
#print('w1 is:', w1)
#print('w2 is:', w2)
'''
w1 is: <tf.Variable 'Variable:0' shape=(2, 3) dtype=float32_ref>
w2 is: <tf.Variable 'Variable_1:0' shape=(3, 1) dtype=float32_ref>
'''
# The placeholder mechanism feeds input data in at run time, so the program does not have to bake large numbers of input constants into the graph; data is passed into TensorFlow through placeholders
# The shape can partly be inferred, so it does not have to be given in full (None leaves the batch dimension open)
x = tf.placeholder(tf.float32, shape = (None,2), name= 'x-input') # x is the input
y_ = tf.placeholder(tf.float32, shape =(None, 1), name= 'y-input') # y_ holds the ground-truth labels; y will hold the predictions

# Forward propagation: input x times weights w1 gives the hidden layer a; a times weights w2 gives the output y
a = tf.matmul(x, w1)
y = tf.matmul(a, w2)
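# Note (an aside added for clarity, not in the original script): with no activation
# function between the layers, the two matmuls collapse into the single linear map
# x @ (w1 @ w2). A common variant would insert a nonlinearity, e.g.:
#   a = tf.nn.relu(tf.matmul(x, w1))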

# The cross-entropy loss measures the gap between the predictions and the ground-truth labels
cross_entropy = -tf.reduce_mean(y_* tf.log(tf.clip_by_value(y, 1e-10, 1.0))) # reduce_mean averages over the batch; clip_by_value keeps log's argument inside [1e-10, 1.0]
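# A hedged aside (not in the original code): the line above keeps only the
# y_ * log(y) term of binary cross-entropy. A sketch of the full symmetric form,
# assuming y has first been squashed into (0, 1) (e.g. y = tf.sigmoid(...)):
#   full_ce = -tf.reduce_mean(
#       y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0))
#       + (1 - y_) * tf.log(tf.clip_by_value(1 - y, 1e-10, 1.0)))
# TensorFlow 1.x also provides this computed from raw logits as
# tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=y).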
# Backpropagation optimization algorithm:
train_step = tf.train.AdamOptimizer(0.001).minimize(cross_entropy)

# Build a simulated dataset from random numbers
rdm = RandomState(1)
dataset_size = 128 
X = rdm.rand(dataset_size,2) # each row of X holds two feature values, e.g. [3.17362409e-01, 9.88616154e-01]
# Label rule: a sample is positive (label 1) when x1 + x2 < 1, otherwise negative (label 0); e.g. (0.3, 0.4) sums to 0.7 < 1, so it gets label 1
Y = [[int(x1+x2<1)] for (x1, x2) in X]
print('X is',X)
print('Y is',Y)

# Create a session and run the program
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)

    print(sess.run(w1))
    print(sess.run(w2))

    # Set the number of training steps
    STEPS = 5000
    for i in range(STEPS):
        # Select batch_size samples for each training step
        start = (i * batch_size) % dataset_size #dataset_size = 128 
        end = min(start + batch_size, dataset_size) #batch_size = 8
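        # Worked example (an illustrative addition, not in the original): with
        # dataset_size = 128 and batch_size = 8, the (start, end) window steps
        # through (0, 8), (8, 16), ..., (120, 128) and then wraps back to (0, 8),
        # so the same 16 batches cycle for all 5000 steps. A standalone check:
        #   for i in range(18):
        #       s = (i * 8) % 128
        #       print(s, min(s + 8, 128))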
        print('start, end are:',start, end)
        '''
        TensorFlow also supports placeholders. A placeholder has no initial value; it only reserves the memory it needs. Inside a session, data is fed to placeholders through feed_dict.
        feed_dict is a dictionary that must supply a value for every placeholder in use. Training a neural network needs a fresh batch of samples on every iteration; if each batch were expressed as constants, the computation graph would grow enormous, because every added constant adds a node. A network trained for millions of iterations would therefore have a gigantic graph, whereas a placeholder stays a single node no matter how many batches are fed through it.
        '''
        # Train on the selected samples and update the parameters; X[start:end] slices the batch out of the list, and x: X[start:end] is a key-value pair in feed_dict
        sess.run(train_step, feed_dict = {x:X[start:end], y_: Y[start:end]})
        if i % 1000 == 0:
            # Every so often, evaluate the cross-entropy on the full dataset and print it
            total_cross_entropy  = sess.run(cross_entropy, feed_dict= {x:X, y_:Y})
            print('After %d training steps, cross_entropy on all data is %g'%(i, total_cross_entropy))
    print(sess.run(w1))
    print(sess.run(w2))
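    # A minimal inference sketch (an addition, not part of the original script):
    # feed a new point through the trained net to get its raw score. The point
    # (0.3, 0.4) is a hypothetical example; it is kept commented out so the
    # logged output below still matches a plain run.
    # print(sess.run(y, feed_dict={x: [[0.3, 0.4]]}))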

'''
If you hit this error:
RuntimeError: Attempted to use a closed Session.
indent everything from the print(sess.run(w1)) line onward four spaces to the right, so those lines stay inside the with block.

Correct output:
initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
Instructions for updating:
Use `tf.global_variables_initializer` instead.
[[-0.81131822  1.48459876  0.06532937]
 [-2.4427042   0.0992484   0.59122431]]
[[-0.81131822]
 [ 1.48459876]
 [ 0.06532937]]
After 0 training steps, cross_entropy on all data is 0.0674925
After 1000 training steps, cross_entropy on all data is 0.0163385
After 2000 training steps, cross_entropy on all data is 0.00907547
After 3000 training steps, cross_entropy on all data is 0.00714436
After 4000 training steps, cross_entropy on all data is 0.00578471
[[-1.96182752  2.58235407  1.68203771]
 [-3.46817183  1.06982315  2.11788988]]
[[-1.82471502]
 [ 2.68546653]
 [ 1.41819501]]
[Finished in 10.9s]
'''


'''
After replacing tf.initialize_all_variables with tf.global_variables_initializer as the warning instructs, the problem goes away:

X is [[  4.17022005e-01   7.20324493e-01]
 [  1.14374817e-04   3.02332573e-01]
 [  1.46755891e-01   9.23385948e-02]
 [  1.86260211e-01   3.45560727e-01]
 [  3.96767474e-01   5.38816734e-01]
 [  4.19194514e-01   6.85219500e-01]
 [  2.04452250e-01   8.78117436e-01]
 [  2.73875932e-02   6.70467510e-01]
 [  4.17304802e-01   5.58689828e-01]
 [  1.40386939e-01   1.98101489e-01]
 [  8.00744569e-01   9.68261576e-01]
 [  3.13424178e-01   6.92322616e-01]
 [  8.76389152e-01   8.94606664e-01]
 [  8.50442114e-02   3.90547832e-02]
 [  1.69830420e-01   8.78142503e-01]
 [  9.83468338e-02   4.21107625e-01]
 [  9.57889530e-01   5.33165285e-01]
 [  6.91877114e-01   3.15515631e-01]
 [  6.86500928e-01   8.34625672e-01]
 [  1.82882773e-02   7.50144315e-01]
 [  9.88861089e-01   7.48165654e-01]
 [  2.80443992e-01   7.89279328e-01]
 [  1.03226007e-01   4.47893526e-01]
 [  9.08595503e-01   2.93614148e-01]
 [  2.87775339e-01   1.30028572e-01]
 [  1.93669579e-02   6.78835533e-01]
 [  2.11628116e-01   2.65546659e-01]
 [  4.91573159e-01   5.33625451e-02]
 [  5.74117605e-01   1.46728575e-01]
 [  5.89305537e-01   6.99758360e-01]
 [  1.02334429e-01   4.14055988e-01]
 [  6.94400158e-01   4.14179270e-01]
 [  4.99534589e-02   5.35896406e-01]
 [  6.63794645e-01   5.14889112e-01]
 [  9.44594756e-01   5.86555041e-01]
 [  9.03401915e-01   1.37474704e-01]
 [  1.39276347e-01   8.07391289e-01]
 [  3.97676837e-01   1.65354197e-01]
 [  9.27508580e-01   3.47765860e-01]
 [  7.50812103e-01   7.25997985e-01]
 [  8.83306091e-01   6.23672207e-01]
 [  7.50942434e-01   3.48898342e-01]
 [  2.69927892e-01   8.95886218e-01]
 [  4.28091190e-01   9.64840047e-01]
 [  6.63441498e-01   6.21695720e-01]
 [  1.14745973e-01   9.49489259e-01]
 [  4.49912133e-01   5.78389614e-01]
 [  4.08136803e-01   2.37026980e-01]
 [  9.03379521e-01   5.73679487e-01]
 [  2.87032703e-03   6.17144914e-01]
 [  3.26644902e-01   5.27058102e-01]
 [  8.85942099e-01   3.57269760e-01]
 [  9.08535151e-01   6.23360116e-01]
 [  1.58212428e-02   9.29437234e-01]
 [  6.90896918e-01   9.97322850e-01]
 [  1.72340508e-01   1.37135750e-01]
 [  9.32595463e-01   6.96818161e-01]
 [  6.60001727e-02   7.55463053e-01]
 [  7.53876188e-01   9.23024536e-01]
 [  7.11524759e-01   1.24270962e-01]
 [  1.98801338e-02   2.62109869e-02]
 [  2.83064880e-02   2.46211068e-01]
 [  8.60027949e-01   5.38831064e-01]
 [  5.52821979e-01   8.42030892e-01]
 [  1.24173315e-01   2.79183679e-01]
 [  5.85759271e-01   9.69595748e-01]
 [  5.61030219e-01   1.86472894e-02]
 [  8.00632673e-01   2.32974274e-01]
 [  8.07105196e-01   3.87860644e-01]
 [  8.63541855e-01   7.47121643e-01]
 [  5.56240234e-01   1.36455226e-01]
 [  5.99176895e-02   1.21343456e-01]
 [  4.45518785e-02   1.07494129e-01]
 [  2.25709339e-01   7.12988980e-01]
 [  5.59716982e-01   1.25559802e-02]
 [  7.19742797e-02   9.67276330e-01]
 [  5.68100462e-01   2.03293235e-01]
 [  2.52325745e-01   7.43825854e-01]
 [  1.95429481e-01   5.81358927e-01]
 [  9.70019989e-01   8.46828801e-01]
 [  2.39847759e-01   4.93769714e-01]
 [  6.19955718e-01   8.28980900e-01]
 [  1.56791395e-01   1.85762022e-02]
 [  7.00221437e-02   4.86345111e-01]
 [  6.06329462e-01   5.68851437e-01]
 [  3.17362409e-01   9.88616154e-01]
 [  5.79745219e-01   3.80141173e-01]
 [  5.50948219e-01   7.45334431e-01]
 [  6.69232893e-01   2.64919558e-01]
 [  6.63348344e-02   3.70084198e-01]
 [  6.29717507e-01   2.10174010e-01]
 [  7.52755554e-01   6.65364814e-02]
 [  2.60315099e-01   8.04754564e-01]
 [  1.93434283e-01   6.39460881e-01]
 [  5.24670309e-01   9.24807970e-01]
 [  2.63296770e-01   6.59610907e-02]
 [  7.35065963e-01   7.72178030e-01]
 [  9.07815853e-01   9.31972069e-01]
 [  1.39515730e-02   2.34362086e-01]
 [  6.16778357e-01   9.49016321e-01]
 [  9.50176119e-01   5.56653188e-01]
 [  9.15606350e-01   6.41566209e-01]
 [  3.90007714e-01   4.85990667e-01]
 [  6.04310483e-01   5.49547922e-01]
 [  9.26181427e-01   9.18733436e-01]
 [  3.94875613e-01   9.63262528e-01]
 [  1.73955667e-01   1.26329519e-01]
 [  1.35079158e-01   5.05662166e-01]
 [  2.15248053e-02   9.47970211e-01]
 [  8.27115471e-01   1.50189807e-02]
 [  1.76196256e-01   3.32063574e-01]
 [  1.30996845e-01   8.09490692e-01]
 [  3.44736653e-01   9.40107482e-01]
 [  5.82014180e-01   8.78831984e-01]
 [  8.44734445e-01   9.05392319e-01]
 [  4.59880266e-01   5.46346816e-01]
 [  7.98603591e-01   2.85718852e-01]
 [  4.90253523e-01   5.99110308e-01]
 [  1.55332756e-02   5.93481408e-01]
 [  4.33676349e-01   8.07360529e-01]
 [  3.15244803e-01   8.92888709e-01]
 [  5.77857215e-01   1.84010202e-01]
 [  7.87929234e-01   6.12031177e-01]
 [  5.39092721e-02   4.20193680e-01]
 [  6.79068837e-01   9.18601778e-01]
 [  4.02024891e-04   9.76759149e-01]
 [  3.76580315e-01   9.73783538e-01]
 [  6.04716101e-01   8.28845808e-01]]
Y is [[0], [1], [1], [1], [1], [0], [0], [1], [1], [1], [0], [0], [0], [1], [0], [1], [0], [0], [0], [1], [0], [0], [1], [0], [1], [1], [1], [1], [1], [0], [1], [0], [1], [0], [0], [0], [1], [1], [0], [0], [0], [0], [0], [0], [0], [0], [0], [1], [0], [1], [1], [0], [0], [1], [0], [1], [0], [1], [0], [1], [1], [1], [0], [0], [1], [0], [1], [0], [0], [0], [1], [1], [1], [1], [1], [0], [1], [1], [1], [0], [1], [0], [1], [1], [0], [0], [1], [0], [1], [1], [1], [1], [0], [1], [0], [1], [0], [0], [1], [0], [0], [0], [1], [0], [0], [0], [1], [1], [1], [1], [1], [1], [0], [0], [0], [0], [0], [0], [1], [0], [0], [1], [0], [1], [0], [1], [0], [0]]
2018-06-25 14:51:29.591003: I C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\35\tensorflow\core\platform\cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2
[[-0.81131822  1.48459876  0.06532937]
 [-2.4427042   0.0992484   0.59122431]]
[[-0.81131822]
 [ 1.48459876]
 [ 0.06532937]]
After 0 training steps, cross_entropy on all data is 0.0674925
After 1000 training steps, cross_entropy on all data is 0.0163385
After 2000 training steps, cross_entropy on all data is 0.00907547
After 3000 training steps, cross_entropy on all data is 0.00714436
After 4000 training steps, cross_entropy on all data is 0.00578471
[[-1.96182752  2.58235407  1.68203771]
 [-3.46817183  1.06982315  2.11788988]]
[[-1.82471502]
 [ 2.68546653]
 [ 1.41819501]]
[Finished in 16.4s]
'''


'''
A later run with the per-batch print enabled produced the same X, Y, initial weights, and warning output as above, plus one "start, end are" line per training step. The window walks through the dataset in 16 batches and then wraps around:
start, end are: 0 8
After 0 training steps, cross_entropy on all data is 0.0674925
start, end are: 8 16
start, end are: 16 24
...
start, end are: 112 120
start, end are: 120 128
start, end are: 0 8
(this 16-batch cycle repeats for all 5000 steps; the captured log is truncated)
'''