010-lec-003. dropout, ensemble
# @
# Adding many layers can drive the error on the train dataset lower
# while the error on the test dataset stays high.
# That is, the more layers you have, the higher the chance of overfitting.
# As mentioned before, to resolve overfitting:
# 1. Get more training data
# 1. Reduce the number of features
# 1. Perform regularization
$$$cost + \lambda\sum W^{2}$$$
l2regularization=0.001*tf.reduce_sum(tf.square(W))
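The penalty above can be sketched without TensorFlow. A minimal NumPy version, where the weight matrix, the base cost value, and the strength lambda are all assumed toy values for illustration:

```python
import numpy as np

# Assumed toy values: W is a weight matrix, base_cost stands in for
# the unregularized cost (e.g. cross-entropy), lambda_ is the strength.
W = np.array([[1.0, -2.0], [0.5, 3.0]])
base_cost = 0.8
lambda_ = 0.001

# cost + lambda * sum(W^2): large weights get penalized
l2_penalty = lambda_ * np.sum(np.square(W))
total_cost = base_cost + l2_penalty

print(total_cost)  # 0.8 + 0.001 * 14.25 = 0.81425
```

The penalty grows with the squared magnitude of the weights, so minimizing the total cost pushes the network toward smaller weights and a smoother decision boundary.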
# @
# dropout
# When training, randomly drop (disconnect) several nodes
# When serving and testing, connect all nodes
dropout_rate_placeholder_node=tf.placeholder("float")
_L1_hypothesis_f=tf.nn.relu(tf.add(tf.matmul(X,W1),B1))
L1_hypothesis_f=tf.nn.dropout(_L1_hypothesis_f,dropout_rate_placeholder_node)
# When training, dropout_rate_placeholder_node:0.7 means keep 70% of nodes (it is a keep probability)
sess.run(optimizer,feed_dict={X:batch_xs,Y:batch_ys,dropout_rate_placeholder_node:0.7})
# When testing, dropout_rate_placeholder_node:1 means keep all nodes
print('Accuracy:',accuracy.eval({X:mnist.test.images,Y:mnist.test.labels,dropout_rate_placeholder_node:1}))
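The mechanism behind `tf.nn.dropout` can be sketched in plain NumPy. This is an "inverted dropout" sketch under assumed shapes: survivors are scaled by 1/keep_prob so the expected activation matches test time, and keep_prob=1 becomes the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, keep_prob):
    # Zero out each node with probability (1 - keep_prob), then scale
    # the survivors by 1/keep_prob so the expected sum is unchanged.
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

a = np.ones((4, 5))                       # assumed toy layer output
train_out = dropout(a, keep_prob=0.7)     # ~70% of nodes survive, scaled
test_out = dropout(a, keep_prob=1.0)      # all nodes kept: identity
```

Because the scaling is done at train time, the test-time forward pass needs no correction; feeding keep_prob=1 simply passes the activations through.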
# @
# Ensembling several independently trained models typically improves accuracy by about 2% to 4%
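One common way to ensemble classifiers is to average their predicted probability distributions and take the argmax. A minimal sketch with hard-coded toy predictions standing in for the softmax outputs of three trained models:

```python
import numpy as np

# Assumed toy softmax outputs of 3 models over 2 examples, 2 classes.
preds = [
    np.array([[0.6, 0.4], [0.3, 0.7]]),  # model 1
    np.array([[0.5, 0.5], [0.2, 0.8]]),  # model 2
    np.array([[0.7, 0.3], [0.6, 0.4]]),  # model 3
]

avg = np.mean(preds, axis=0)          # average the probability vectors
labels = np.argmax(avg, axis=1)       # final prediction per example

print(labels)  # [0 1]
```

Averaging smooths out the individual models' mistakes: a model that is wrong on an example is often outvoted by the others, which is where the typical few-percent accuracy gain comes from.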