A workshop introducing the TensorFlow Machine Learning framework. Presented by Brenton Chu, Vice President of Machine Learning at Berkeley.
This presentation covers how to construct, train, evaluate, and visualize neural networks in TensorFlow 1.0.
http://ml.berkeley.edu
2. Agenda
Our goals for tonight:
1. Neural Network Review
2. What is TensorFlow?
3. Building Neural Nets
4. TensorBoard Visualization
3. Neural Networks Review
● Layers that combine previous features to form new features
● Each layer applies a linear transformation, “weighting” the previous features
● A nonlinear activation function follows each linear transformation
● Predicts through feedforward propagation
● Learns through gradient descent and backpropagation
(Diagram: feedforward pass and backpropagation pass)
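The bullet points above can be sketched in plain NumPy (a minimal one-hidden-layer network; the layer sizes, learning rate, and random data are all illustrative):

```python
import numpy as np

np.random.seed(0)

# Toy data: 4 examples with 3 features each, one target per example
X = np.random.randn(4, 3)
y = np.random.randn(4, 1)

# Weights for one hidden layer and one output layer
W1 = np.random.randn(3, 5)
W2 = np.random.randn(5, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(100):
    # Feedforward: each layer combines previous features into new features
    h = sigmoid(X @ W1)               # linear transformation + nonlinearity
    pred = h @ W2                     # output layer
    losses.append(((pred - y) ** 2).mean())

    # Backpropagation: chain rule applied layer by layer
    d_pred = 2 * (pred - y) / len(y)
    d_W2 = h.T @ d_pred
    d_h = d_pred @ W2.T
    d_W1 = X.T @ (d_h * h * (1 - h))  # derivative of sigmoid is h * (1 - h)

    # Gradient descent update
    W1 -= 0.1 * d_W1
    W2 -= 0.1 * d_W2
```

After the loop, the loss has dropped from its initial value, which is all gradient descent promises here.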
4. Neural Network Libraries
Caffe
Pros:
- Strength in CNNs
- Image processing
- Python interface
Cons:
- Inflexible
- C++
Theano
Pros:
- Widely used
- High performance
- Python
Cons:
- Somewhat bulky
- Can get low-level
Torch
Pros:
- Slimmer
- High performance
- Modular
Cons:
- Mostly academic use
- Lua
TensorFlow
Pros:
- Gaining support
- TensorBoard
- Python
Cons:
- Performance still improving
- Less example content available
5. Why Learn TensorFlow?
➢ Backed by Google
○ Constant development and frequent updates
○ DeepMind moving from Torch to TensorFlow
➢ Growing Community
○ Amount of example code and tutorials growing
○ Most commonly mentioned ML library on Stack Overflow
➢ Long term support
○ Recent TensorFlow 1.0 update; all code will remain compatible with 1.x updates
➢ Performance is not very good, but getting better.
○ About an order of magnitude slower than Theano
6. Tensors (side note)
➢ For Programmers: Tensors generalize multidimensional arrays.
➢ For Mathematicians: Tensors generalize scalars, vectors,
matrices and linear operators!
➢ TensorFlow describes data as tensors and passes them through its computation graph.
➢ Tensors flow through the network
8. TensorFlow Basics
Variables
➢ Stores parameters in graph
➢ Can be trainable (optimized
during backprop) or untrainable
➢ Variety of initializers (e.g.
constant, normal, etc)
Operations
➢ Take in variables and/or outputs from other operations
➢ Can be fed into other ops and linked in the graph
(Diagram: graph nodes tf.constant(5.0) and tf.constant(3.0) feed tf.add(); its output and tf.random_normal(mean=1, stddev=2) feed tf.multiply())
9. TensorFlow Basics
Sessions
➢ Handles post-construction interactions with the graph
➢ Call the run method to evaluate tensors
(Diagram: running the graph evaluates each node; tf.constant(5.0) → 5.0, tf.constant(3.0) → 3.0, tf.add() → 8.0, tf.random_normal(mean=1, stddev=2) → 1.68, tf.multiply() → 13.44)

sess = tf.Session()
sess.run(tf.global_variables_initializer())
sess.run(mult_op)
10. TensorFlow Basics
Optimizers
➢ Subclasses of tf.train.Optimizer
➢ Main functions: compute_gradients, apply_gradients, and minimize
def minimize(self, loss_fn):
    grads = self.compute_gradients(loss_fn)  # backpropagation on ops
    self.apply_gradients(grads)              # update trainable variables
Some loss functions are built into TensorFlow
➢ For example, tf.losses.mean_squared_error
➢ You can also define your own loss functions by combining ops
11. TensorFlow Basics
Placeholders
➢ A placeholder is a graph node whose value is filled in during execution
➢ On evaluation, specify a dictionary with placeholder key-value pairs
(Diagram: placeholders P1 and P2 feed tf.add(); its output and tf.random_normal(mean=1, stddev=2) feed tf.multiply(); with P1 = 1.0 and P2 = 2.0: tf.add() → 3.0, tf.random_normal → 0.94, tf.multiply() → 2.82)

sess.run(mul, feed_dict={P1: 1.0, P2: 2.0})
12. TensorBoard
Graph Visualization
➢ See visual representation of the graph
➢ Check and debug construction
Scoping
➢ Used to create abstractions
➢ Without scoping, the graph can become a convoluted mess
➢ Created by using a with statement
➢ The scope name gets prepended to variable and operation names
14. Review
➢ TensorFlow is one of several ML libraries, each with pros and cons
➢ Expected long term support for TensorFlow
➢ Two stages: construction and execution
➢ Tensors are passed through chained operations
➢ Operations are evaluated at the execution stage with a session object
➢ Use optimizers to find and apply gradients for the training step
➢ TensorBoard used for graph visualization and visualizing learning
15. Thank You for Coming!
Please fill out this feedback form:
https://mlab.typeform.com/to/t51Y09
Like our page on Facebook: www.facebook.com/berkeleyml
Email us: ml.at.berkeley@gmail.com
See our website: ml.berkeley.edu
Check out our blog: ml.berkeley.edu/blog