1. TF Core: the lowest-level TensorFlow API, intended mainly for researchers who need fine-grained control.
2. Higher-level library: tf.contrib.learn
3. Computational graph: i. Each node takes zero or more tensors as inputs and produces a tensor as an output. ii. One type of node is a constant, which takes no inputs and outputs a value it stores internally. node = tf.constant(value, tf.float32) (the dtype argument is optional and is inferred by default)
4. Session object with a run method: sess = tf.Session(); sess.run(node)
5. We can build more complicated computations by combining Tensor nodes with operations. Operations are also nodes.
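A minimal sketch of notes 3-5, assuming the TF 1.x graph-and-session API (available in TF 2.x through the compat.v1 shim):

```python
# Assumes TensorFlow with the compat.v1 shim; restores the TF 1.x
# build-the-graph-then-run model.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Constant nodes: take no inputs, output the value stored internally.
node1 = tf.constant(3.0, tf.float32)  # dtype given explicitly
node2 = tf.constant(4.0)              # dtype inferred as tf.float32

# Operations are also nodes; combining nodes builds a larger graph.
node3 = tf.add(node1, node2)

# Nothing is computed until a Session runs the graph.
sess = tf.Session()
total = sess.run(node3)
print(sess.run([node1, node2]))  # [3.0, 4.0]
print(total)                     # 7.0
```

Printing `node3` itself only shows a tensor description; the value 7.0 exists only after `sess.run`.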
6. A placeholder is a promise to supply a value later; it can be thought of as a kind of lambda or function parameter: a = tf.placeholder(tf.float32)
Running it is like calling the lambda: sess.run(some_operation, feed_dict={...}) supplies the parameter values at run time.
Notably, operators are overloaded, so operations compose directly: adder_node = a + b; add_and_triple = adder_node * 3
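A sketch of the placeholder-as-lambda idea, again assuming the TF 1.x API via the compat.v1 shim:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Placeholders: "parameters" whose values arrive at run time.
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)

adder_node = a + b              # shortcut for tf.add(a, b)
add_and_triple = adder_node * 3

sess = tf.Session()
# feed_dict plays the role of the argument list in a lambda call.
summed = sess.run(adder_node, feed_dict={a: 3, b: 4.5})
tripled = sess.run(add_and_triple, feed_dict={a: 3, b: 4.5})
print(summed)   # 7.5
print(tripled)  # 22.5
```

The same graph can be run with different feeds, which is exactly the "parameterized value" behavior the note describes.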
7. Variables allow us to add trainable parameters to a graph. They are constructed with a type and an initial value: W = tf.Variable([.3], tf.float32). HOWEVER, calling tf.Variable does not initialize the variable immediately.
8. init = tf.global_variables_initializer()
sess.run(init)
init is a handle to the sub-graph that initializes all global variables; until we call sess.run(init), the variables remain uninitialized.
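A sketch tying notes 7 and 8 together (TF 1.x API assumed; the linear model here is just an illustrative choice):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Trainable parameters: type and initial value, but not yet initialized.
W = tf.Variable([.3], tf.float32)
b = tf.Variable([-.3], tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W * x + b

sess = tf.Session()
init = tf.global_variables_initializer()  # just a handle; nothing runs yet
sess.run(init)                            # now W and b actually hold values

preds = sess.run(linear_model, {x: [1, 2, 3, 4]})
print(preds)  # approximately [0. 0.3 0.6 0.9]
```

Running `sess.run(linear_model, ...)` before `sess.run(init)` would raise a FailedPreconditionError, which is the "uninitialized" state the note warns about.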
9. A loss function is the same thing as a cost function.
reduce_sum and the other reduce_* operations are listed under "Reduction" in the official documentation.
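A sketch of a squared-error loss built with reduce_sum, reusing the illustrative linear model from the variables note (TF 1.x API assumed):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

W = tf.Variable([.3], tf.float32)
b = tf.Variable([-.3], tf.float32)
x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)  # target values
linear_model = W * x + b

# Squared-error loss: square the per-example errors, then reduce_sum
# (a "Reduction" op) collapses them into a single scalar.
loss = tf.reduce_sum(tf.square(linear_model - y))

sess = tf.Session()
sess.run(tf.global_variables_initializer())
loss_value = sess.run(loss, {x: [1, 2, 3, 4], y: [0, -1, -2, -3]})
print(loss_value)  # approximately 23.66
```

A scalar loss like this is what an optimizer (e.g. tf.train.GradientDescentOptimizer) would minimize by adjusting W and b.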