Date: March 4, 2016
Venue: Trondheim, Norway. Doctoral Seminar at NTNU
Please cite, link to or credit this presentation when using it or part of it in your work.
5. From ML to Deep Learning
Nonlinearity: Neural Network
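The step from linear models to neural networks hinges on the nonlinearity named above. A minimal NumPy sketch (the layer sizes and the tanh activation are illustrative choices, not from the slides): a hidden layer applies a nonlinear function to an affine transform, which is what keeps two stacked layers from collapsing into a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input: 4 examples with 3 features each (sizes are illustrative).
x = rng.standard_normal((4, 3))

# One hidden layer with a nonlinearity (tanh), then a linear output.
W1 = rng.standard_normal((3, 5))
b1 = np.zeros(5)
W2 = rng.standard_normal((5, 1))
b2 = np.zeros(1)

hidden = np.tanh(x @ W1 + b1)  # the nonlinearity: without it, the
out = hidden @ W2 + b2         # two layers reduce to one linear map

print(out.shape)  # (4, 1)
```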
6. TensorFlow
• Data: tensors
• Computations represented as a dataflow graph
• Nodes: operators
• State held in Variables
• Execution in Sessions
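The points above (build a graph of operator nodes first, compute only when asked) can be illustrated without TensorFlow by a minimal plain-Python sketch; the `Node`, `constant`, `add`, and `run` names here are made up for illustration, not TensorFlow API:

```python
# Minimal sketch of TensorFlow's deferred-execution model.
# 'Node', 'constant', 'add', and 'run' are illustrative names only.

class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op = op          # operator name, e.g. "const" or "add"
        self.inputs = inputs  # upstream nodes in the graph
        self.value = value    # only constants carry a value up front

def constant(value):
    return Node("const", value=value)

def add(a, b):
    # Building the node does NOT compute a + b yet.
    return Node("add", inputs=(a, b))

def run(node):
    # Execution phase: recursively evaluate the graph.
    if node.op == "const":
        return node.value
    if node.op == "add":
        x, y = (run(n) for n in node.inputs)
        return x + y
    raise ValueError(node.op)

a = constant(3.0)
b = constant(4.0)
c = add(a, b)   # construction: c is a graph node, not a number
print(run(c))   # execution: 7.0
```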
7. TensorFlow
Advanced features
• Construction phase
• Execution phase
import tensorflow as tf
# Create a Constant op that produces a 1x2 matrix. The op is
# added as a node to the default graph.
#
# The value returned by the constructor represents the output
# of the Constant op.
matrix1 = tf.constant([[3., 3.]])
# Create another Constant that produces a 2x1 matrix.
matrix2 = tf.constant([[2.],[2.]])
# Create a Matmul op that takes 'matrix1' and 'matrix2' as inputs.
# The returned 'product' represents the result of the multiplication.
product = tf.matmul(matrix1, matrix2)
with tf.Session() as sess:
    result = sess.run([product])
    print(result)
8. TensorFlow
Advanced features
• Working with Variables
# Create two variables.
weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35),
                      name="weights")
biases = tf.Variable(tf.zeros([200]), name="biases")
...
# Add an op to initialize the variables.
init_op = tf.initialize_all_variables()
# Later, when launching the model
with tf.Session() as sess:
    # Run the init operation.
    sess.run(init_op)
    ...
    # Use the model
    ...
9. TensorFlow
Advanced features
• Graph Visualization
• Using GPUs
• Sharing variables
https://www.tensorflow.org/
with tf.Session() as sess:
    with tf.device("/gpu:1"):
        matrix1 = tf.constant([[3., 3.]])
        matrix2 = tf.constant([[2.],[2.]])