
How to Train an MNIST Handwritten Digit Recognition Model with TensorFlow

This article looks at how to train an MNIST handwritten digit recognition model with TensorFlow. The topic may be unfamiliar to many readers, so the complete training script and its output are given below; hopefully you will find it useful.


import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# NOTE: this script uses the TensorFlow 1.x API
# (tf.placeholder, tf.Session, tf.contrib, tensorflow.examples.tutorials.mnist).

INPUT_NODE = 784    # input layer nodes = image pixels = 28x28 = 784
OUTPUT_NODE = 10    # output layer nodes = number of classes
LAYER1_NODE = 500   # nodes in the single hidden layer

BATCH_SIZE = 100    # examples per training batch; the smaller it is, the closer
                    # to stochastic gradient descent, the larger, the closer to
                    # full-batch gradient descent
LEARNING_RATE_BASE = 0.8       # base learning rate
LEARNING_RATE_DECAY = 0.99     # learning rate decay rate
REGULARIZATION_RATE = 0.0001   # regularization coefficient
TRAINING_STEPS = 30000         # number of training steps
MOVING_AVG_DECAY = 0.99        # moving average decay rate


# Helper function: given the network input and all parameters,
# compute the forward pass of the network.
def inference(input_tensor, avg_class, weights1, biases1, weights2, biases2):
    # When no moving-average class is provided, use the current parameter values.
    if avg_class is None:
        # Forward pass through the hidden layer
        layer1 = tf.nn.relu(tf.matmul(input_tensor, weights1) + biases1)
        # Forward pass through the output layer
        return tf.matmul(layer1, weights2) + biases2
    else:
        # First take the moving averages of the variables, then compute the forward pass.
        layer1 = tf.nn.relu(
            tf.matmul(input_tensor, avg_class.average(weights1)) +
            avg_class.average(biases1))
        return tf.matmul(
            layer1, avg_class.average(weights2)) + avg_class.average(biases2)


# Training process
def train(mnist):
    x = tf.placeholder(tf.float32, [None, INPUT_NODE], name='x-input')
    y_ = tf.placeholder(tf.float32, [None, OUTPUT_NODE], name='y-input')

    # Hidden layer parameters
    weights1 = tf.Variable(
        tf.truncated_normal([INPUT_NODE, LAYER1_NODE], stddev=0.1))
    biases1 = tf.Variable(tf.constant(0.1, shape=[LAYER1_NODE]))

    # Output layer parameters
    weights2 = tf.Variable(
        tf.truncated_normal([LAYER1_NODE, OUTPUT_NODE], stddev=0.1))
    biases2 = tf.Variable(tf.constant(0.1, shape=[OUTPUT_NODE]))

    # Forward pass using the current parameter values (avg_class=None)
    y = inference(x, None, weights1, biases1, weights2, biases2)

    # Variable for the number of training steps; marked non-trainable
    global_step = tf.Variable(0, trainable=False)

    # Initialize the moving-average class with the decay rate and the step variable
    variable_avgs = tf.train.ExponentialMovingAverage(
        MOVING_AVG_DECAY, global_step)
    # Apply the moving average to all trainable variables (the network parameters)
    variables_avgs_op = variable_avgs.apply(tf.trainable_variables())

    # Forward pass using the moving-averaged parameters
    avg_y = inference(x, variable_avgs, weights1, biases1, weights2, biases2)

    # Cross-entropy as the loss function
    cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
        logits=y, labels=tf.argmax(y_, 1))
    cross_entropy_mean = tf.reduce_mean(cross_entropy)

    # L2 regularization loss
    regularizer = tf.contrib.layers.l2_regularizer(REGULARIZATION_RATE)
    regularization = regularizer(weights1) + regularizer(weights2)
    loss = cross_entropy_mean + regularization

    # Exponentially decaying learning rate
    learning_rate = tf.train.exponential_decay(
        LEARNING_RATE_BASE,
        global_step,                              # current step
        mnist.train.num_examples / BATCH_SIZE,    # steps needed to see all training data once
        LEARNING_RATE_DECAY)

    # Minimize the loss
    train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(
        loss, global_step=global_step)

    # Update the network parameters and their moving averages in a single op
    with tf.control_dependencies([train_step, variables_avgs_op]):
        train_op = tf.no_op(name='train')

    # Check the predictions of the forward pass that uses the moving-average model
    correct_prediction = tf.equal(tf.argmax(avg_y, 1), tf.argmax(y_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

    # Initialize the session and start training
    with tf.Session() as sess:
        tf.global_variables_initializer().run()

        # Validation data, used to judge the stopping condition and training progress
        validate_feed = {x: mnist.validation.images,
                         y_: mnist.validation.labels}
        # Test data, used as the final evaluation of the model
        test_feed = {x: mnist.test.images, y_: mnist.test.labels}

        # Iteratively train the network
        for i in range(TRAINING_STEPS):
            if i % 1000 == 0:
                validate_acc = sess.run(accuracy, feed_dict=validate_feed)
                print("After %d training step(s), validation accuracy using average "
                      "model is %g " % (i, validate_acc))
            xs, ys = mnist.train.next_batch(BATCH_SIZE)
            sess.run(train_op, feed_dict={x: xs, y_: ys})

        # After training, report the final accuracy on the test set
        test_acc = sess.run(accuracy, feed_dict=test_feed)
        print("After %d training steps, test accuracy using average model "
              "is %g " % (TRAINING_STEPS, test_acc))


# Program entry point
def main(argv=None):
    mnist = input_data.read_data_sets("/tmp/data", one_hot=True)
    train(mnist)


if __name__ == '__main__':
    tf.app.run()

輸出結(jié)果如下:

Extracting /tmp/data/train-images-idx3-ubyte.gz
Extracting /tmp/data/train-labels-idx1-ubyte.gz
Extracting /tmp/data/t10k-images-idx3-ubyte.gz
Extracting /tmp/data/t10k-labels-idx1-ubyte.gz
After 0 training step(s), validation accuracy using average model is 0.0462
After 1000 training step(s), validation accuracy using average model is 0.9784
After 2000 training step(s), validation accuracy using average model is 0.9806
After 3000 training step(s), validation accuracy using average model is 0.9798
After 4000 training step(s), validation accuracy using average model is 0.9814
After 5000 training step(s), validation accuracy using average model is 0.9826
After 6000 training step(s), validation accuracy using average model is 0.9828
After 7000 training step(s), validation accuracy using average model is 0.9832
After 8000 training step(s), validation accuracy using average model is 0.9838
After 9000 training step(s), validation accuracy using average model is 0.983
After 10000 training step(s), validation accuracy using average model is 0.9836
After 11000 training step(s), validation accuracy using average model is 0.9822
After 12000 training step(s), validation accuracy using average model is 0.983
After 13000 training step(s), validation accuracy using average model is 0.983
After 14000 training step(s), validation accuracy using average model is 0.9844
After 15000 training step(s), validation accuracy using average model is 0.9832
After 16000 training step(s), validation accuracy using average model is 0.9844
After 17000 training step(s), validation accuracy using average model is 0.9842
After 18000 training step(s), validation accuracy using average model is 0.9842
After 19000 training step(s), validation accuracy using average model is 0.9838
After 20000 training step(s), validation accuracy using average model is 0.9834
After 21000 training step(s), validation accuracy using average model is 0.9828
After 22000 training step(s), validation accuracy using average model is 0.9834
After 23000 training step(s), validation accuracy using average model is 0.9844
After 24000 training step(s), validation accuracy using average model is 0.9838
After 25000 training step(s), validation accuracy using average model is 0.9834
After 26000 training step(s), validation accuracy using average model is 0.984
After 27000 training step(s), validation accuracy using average model is 0.984
After 28000 training step(s), validation accuracy using average model is 0.9836
After 29000 training step(s), validation accuracy using average model is 0.9842
After 30000 training steps, test accuracy using average model is 0.9839
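The Extracting lines at the top come from input_data.read_data_sets, which downloads the raw MNIST archives into /tmp/data. That module (tensorflow.examples.tutorials.mnist) only exists in TensorFlow 1.x; if you are on TensorFlow 2.x, a rough equivalent for obtaining the same data is the Keras loader sketched below (a minimal sketch under that assumption, not part of the article's code):

import numpy as np
import tensorflow as tf

# MNIST as NumPy arrays: 28x28 uint8 images, integer labels 0-9.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Flatten to 784-dimensional float vectors in [0, 1] and one-hot encode the labels,
# matching the layout the script above expects (INPUT_NODE = 784, OUTPUT_NODE = 10).
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0
y_train = np.eye(10, dtype="float32")[y_train]
y_test = np.eye(10, dtype="float32")[y_test]

Note that, unlike read_data_sets, this loader returns the full 60,000-image training set without carving out the 5,000-image validation split, so you would slice one off x_train yourself; the TF 1.x graph code above would also need the tf.compat.v1 API to run on TensorFlow 2.x.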

Having read the above, do you now have a better understanding of how to train an MNIST handwritten digit recognition model with TensorFlow? For more related content, follow the 创新互联 industry news channel. Thanks for your support.

