The document outlines the schedule for a full-day machine learning course. The morning sessions introduce deep learning and cover a "hello world" machine learning exercise using MNIST data. Feedforward neural networks are also discussed. The afternoon focuses on computer vision with convolutional neural networks, natural language processing, generative models, time series prediction, and deploying machine learning models. Live coding exercises accompany many of the talks to provide hands-on learning.
2. Full Day of Applied AI
Morning
Session 1 Intro to Artificial Intelligence
09:00-09:45 Introduction to Applied AI
09:45-10:00 Coffee and break
Session 2 Live Coding a machine learning app
10:00-10:10 Getting your machine ready for machine learning
10:10-10:20 Training and evaluating the model
10:20-10:50 Improving the model
10:50-11:00 Coffee and break
Session 3 Machine learning in the wild - deployment
11:00-11:15 Coding exercise continued
11:15-11:45 Serving your own machine learning model | Code
11:45-11:55 How to solve problems | interactive exercise
11:55-12:00 Q and A
Lunch
12:00-13:00 Lunch
Afternoon
Session 4 Hello World Deep Learning (MNIST)
13:00-13:15 Deep learning intro, image recognition and CNNs | Talk |
13:15-13:45 Building your own convolutional neural network | Code |
13:45-14:00 Coffee and break
Session 5 Natural Language Processing
14:00-14:30 Natural language processing | Talk |
14:30-14:45 Working with language | Code |
14:45-15:00 Coffee and break
Session 6 Conversational interfaces and Time Series
15:00-15:20 Conversational interfaces
15:20-15:45 Time Series prediction
15:45-16:00 Coffee and break
Session 7 Generative models and style transfer
16:00-16:30 Generative models | Talk |
16:30-16:45 Trying out GANs and style transfer | Code |
16:45-17:00 Coffee and break
Anton Osika AI Research Engineer Sana Labs AB
anton.osika@gmail.com
Birger Moëll Machine Learning Engineer
birger.moell@gmail.com
3. All images can be represented as a matrix of pixel values
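For example, a made-up 3x3 grayscale "image" in NumPy, where each entry is one pixel's intensity:

```python
import numpy as np

# A tiny hypothetical 3x3 grayscale image:
# each entry is a pixel intensity (0 = black, 255 = white).
image = np.array([
    [  0, 128, 255],
    [ 64, 192,  32],
    [255,   0, 128],
])

print(image.shape)  # (3, 3): height x width
print(image.max())  # 255, the brightest pixel
```

An MNIST digit is exactly this, just 28x28 instead of 3x3.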
7. Do I need to learn math in order to work in AI?
It depends. The more math you know, the better.
If you know the math, you can build your own
neural networks and implement machine learning
algorithms published in AI research papers.
But you can start out by making use of powerful
AI libraries (Keras, Tensorflow) without a full
mathematical understanding of how they work.
The best advice is to start doing stuff and learn
the math along the way.
ReLU activation function
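The ReLU shown on the slide is one line of code. A minimal NumPy sketch:

```python
import numpy as np

def relu(x):
    # ReLU keeps positive values and zeroes out negatives: max(0, x).
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```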
8. Math of deep learning
AI draws on math from three main sources: linear algebra, calculus, and statistics.
Linear algebra
http://machinelearningmastery.com/linear-algebra-machine-learning/
Calculus
https://www.umiacs.umd.edu/~hal/courses/2013S_ML/math4ml.pdf
Statistics
http://machinelearningmastery.com/crash-course-statistics-machine-learning/
Neural network from scratch
Building a simple neural network from scratch. Walkthrough and experiments.
http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/
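As a taste of what "from scratch" means, here is a minimal sketch (not the wildml walkthrough linked above): a tiny two-layer network trained on XOR with plain NumPy and hand-written backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the XOR function, which a single linear layer cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A tiny 2-4-1 network: one hidden layer with 4 sigmoid units.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)       # hidden layer
    return h, sigmoid(h @ W2 + b2)  # output layer

_, out = forward(X)
initial_loss = ((out - y) ** 2).mean()

for step in range(5000):
    h, out = forward(X)
    # Backpropagation: gradients of the mean squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = d_out @ W2.T * h * (1 - h)
    # Plain gradient descent updates.
    W2 -= h.T @ d_out; b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h;   b1 -= d_h.sum(axis=0)

_, out = forward(X)
final_loss = ((out - y) ** 2).mean()
print(initial_loss, final_loss)  # the loss shrinks as the network learns XOR
```

Every line here corresponds to one of the math topics below: the matrix products are linear algebra, the gradients are calculus, and the loss is a statistical quantity.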
9. Linear algebra
Linear algebra is the branch of mathematics
concerning vector spaces and linear mappings
between such spaces. It includes the study of
lines, planes, and subspaces, but is also
concerned with properties common to all vector
spaces.
The set of points with coordinates that satisfy a
linear equation forms a hyperplane in an
n-dimensional space. The conditions under which
a set of n hyperplanes intersect in a single point
is an important focus of study in linear algebra.
Such an investigation is initially motivated by a
system of linear equations containing several
unknowns. Such equations are naturally
represented using the formalism of matrices and
vectors.[1][2][3]
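The hyperplane-intersection idea above can be tried directly: NumPy's `np.linalg.solve` finds the single point where two lines (hyperplanes in 2-dimensional space) meet. A small made-up example:

```python
import numpy as np

# Two lines in 2D:  x + 2y = 5  and  3x - y = 1,
# written in matrix-vector form as A @ [x, y] = b.
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])
b = np.array([5.0, 1.0])

point = np.linalg.solve(A, b)  # their single intersection point
print(point)  # [1. 2.]
```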
10. Calculus
Calculus (from Latin calculus, literally 'small pebble', used for counting and calculations, like
on an abacus)[1] is the mathematical study of continuous change, in the same way that
geometry is the study of shape and algebra is the study of generalizations of arithmetic
operations. It has two major branches, differential calculus (concerning rates of change and
slopes of curves),[2] and integral calculus (concerning accumulation of quantities and the
areas under and between curves).[3] These two branches are related to each other by the
fundamental theorem of calculus. Both branches make use of the fundamental notions of
convergence of infinite sequences and infinite series to a well-defined limit. Generally,
modern calculus is considered to have been developed in the 17th century by Isaac Newton
and Gottfried Leibniz. Today, calculus has widespread uses in science, engineering, and
economics.[4]
Calculus is a part of modern mathematics education. A course in calculus is a gateway to
other, more advanced courses in mathematics devoted to the study of functions and limits,
broadly called mathematical analysis. Calculus has historically been called "the calculus of
infinitesimals", or "infinitesimal calculus". The term calculus (plural calculi) is also used for
naming specific methods of calculation or notation, and even some theories; such as, e.g.,
propositional calculus, Ricci calculus, calculus of variations, lambda calculus, and process
calculus.
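The "rates of change and slopes of curves" branch is the one deep learning leans on: training computes slopes of the loss. A small sketch (not from the course material) of a numerical derivative:

```python
def derivative(f, x, h=1e-6):
    # Central-difference approximation of the slope f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# The slope of f(x) = x**2 at x = 3 is 2*x = 6.
print(derivative(lambda x: x ** 2, 3.0))  # approximately 6.0
```

Backpropagation computes these same slopes, but exactly and for millions of parameters at once.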
11. Statistics
Statistics is a branch of mathematics dealing with the
collection, analysis, interpretation, presentation, and
organization of data.[1][2] In applying statistics to, e.g., a
scientific, industrial, or social problem, it is conventional to
begin with a statistical population or a statistical model
process to be studied. Populations can be diverse topics
such as "all people living in a country" or "every atom
composing a crystal." Statistics deals with all aspects of data
including the planning of data collection in terms of the
design of surveys and experiments.[1]
Two main statistical methods are used in data analysis:
descriptive statistics, which summarize data from a sample
using indexes such as the mean or standard deviation, and
inferential statistics, which draw conclusions from data that
are subject to random variation (e.g., observational errors,
sampling variation).[3] Descriptive statistics are most often
concerned with two sets of properties of a distribution
(sample or population): central tendency (or location) seeks
to characterize the distribution's central or typical value,
while dispersion (or variability) characterizes the extent to
which members of the distribution depart from its center and
each other. Inferences on mathematical statistics are made
under the framework of probability theory, which deals with
the analysis of random phenomena.
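The descriptive statistics named above (mean for central tendency, standard deviation for dispersion) are available in Python's standard library. A sketch on a small made-up sample:

```python
import statistics

sample = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# Central tendency: the typical value of the sample.
print(statistics.mean(sample))    # 5.0
# Dispersion: how far members depart from the center.
print(statistics.pstdev(sample))  # 2.0 (population standard deviation)
```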
12. Technology to use to work with artificial neural networks
Python
The biggest machine learning libraries are built in Python (Tensorflow, Keras). Just use
Python. It’s fantastic.
Here is a notebook with an introduction to python.
13. Python libraries to use for machine learning
Numpy
NumPy is the fundamental package for scientific computing with Python. It contains among other things:
● a powerful N-dimensional array object
● sophisticated (broadcasting) functions
● tools for integrating C/C++ and Fortran code
● useful linear algebra, Fourier transform, and random number capabilities
Besides its obvious scientific uses, NumPy can also be used as an efficient multi-dimensional container of generic data. Arbitrary data-types
can be defined. This allows NumPy to seamlessly and speedily integrate with a wide variety of databases.
NumPy is licensed under the BSD license, enabling reuse with few restrictions.
Among other things, Numpy is used in AI for building neural networks from scratch.
Here is a notebook with an introduction to numpy
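A quick taste of the features listed above (N-dimensional arrays, broadcasting, linear algebra):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)  # a 2x3 N-dimensional array

print(a + 10)     # broadcasting: the scalar 10 is added to every element
print(a.T @ a)    # linear algebra: a 3x3 matrix product
print(a.mean())   # 2.5, the mean of 0..5
```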
14. Python libraries for machine learning
Pandas
● Pandas is a library for working with tabular data
● Provides DataFrames for working with labelled, high-dimensional data
● Great for working with CSV files
● Here is a notebook for working with Pandas
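A minimal sketch of the CSV workflow above, reading from a string instead of a file (the column names are made up):

```python
import io
import pandas as pd

# A tiny hypothetical CSV, loaded into a DataFrame.
csv = io.StringIO("name,score\nada,0.9\ngrace,0.8\n")
df = pd.read_csv(csv)

print(df)
print(df["score"].mean())  # the average score column value
```

Swapping the string for a path (`pd.read_csv("data.csv")`) is the usual one-liner for real files.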
Scikit-learn
● Simple and efficient tools for data mining and data analysis
● Built on NumPy, SciPy, and matplotlib
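A minimal sketch of those tools on a made-up toy dataset: fit a classifier, then predict on new points.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical data: the label is 1 when the single feature is above 2.
X = [[0.0], [1.0], [3.0], [4.0]]
y = [0, 0, 1, 1]

clf = LogisticRegression().fit(X, y)  # fit/predict is the standard sklearn API
print(clf.predict([[0.5], [3.5]]))    # [0 1]
```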
Matplotlib
● Visualisation library for statistics and plotting
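A minimal Matplotlib sketch (the data and filename are arbitrary):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no window needed
import matplotlib.pyplot as plt

xs = [0, 1, 2, 3]
ys = [x ** 2 for x in xs]

fig, ax = plt.subplots()
ax.plot(xs, ys, marker="o")  # a simple line plot with point markers
ax.set_xlabel("x")
ax.set_ylabel("x squared")
fig.savefig("squares.png")   # write the figure to an image file
```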
15. Python libraries for machine learning
Jupyter notebooks
● Awesome interactive python notebooks running on your local machine
Installation
http://jupyter.org/
Here is a huge list of Jupyter Notebooks used in data science.
https://github.com/jupyter/jupyter/wiki/A-gallery-of-interesting-Jupyter-Notebooks
All our examples are stored in Jupyter Notebooks.
17. Tensorflow
What is Tensorflow?
Tensorflow is Google's machine learning library. It is open source, and an incredible share of the progress made in AI in
recent years has been done using Tensorflow. It is the de facto standard in machine learning. Google is betting heavily
on AI becoming the future, and open sourcing Tensorflow is Google's attempt to become the leader in the AI field.
Currently it is working really well, and Google is seen by many as the leader in AI.
Installation
https://www.tensorflow.org/install/
Training your first model in Tensorflow
MNIST for beginners
https://www.tensorflow.org/get_started/mnist/beginners
MNIST for experts
https://www.tensorflow.org/get_started/mnist/pros
18. Keras
What is Keras?
Keras is an API on top of Tensorflow that makes it much easier to get started with machine learning.
How do I get started with Keras?
The best way to get started is either with a tutorial or by exploring your own datasets.
Keras
https://keras.io/
Keras machine learning examples
https://github.com/fchollet/keras/tree/master/examples
Keras tutorials
https://github.com/bcarlyle/Momentum-AI-machine-learning-course/blob/master/lesson1/Getting%20started%20with%20AI.ipynb
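To show how little code Keras needs, here is a minimal sketch of an MNIST-shaped model. This is a hypothetical example, not the course notebook or the linked tutorials, and it only checks shapes on a random batch instead of training on the real digits.

```python
import numpy as np
from tensorflow import keras

# A minimal dense network for 28x28 digit images (MNIST-shaped input).
model = keras.Sequential([
    keras.layers.Input(shape=(28, 28)),
    keras.layers.Flatten(),                          # 28x28 image -> 784 values
    keras.layers.Dense(128, activation="relu"),      # hidden layer
    keras.layers.Dense(10, activation="softmax"),    # one probability per digit 0-9
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Sanity check on a random batch: 4 "images" in, 4 probability vectors out.
fake_batch = np.random.rand(4, 28, 28).astype("float32")
print(model.predict(fake_batch, verbose=0).shape)  # (4, 10)
```

With real data, a single `model.fit(x_train, y_train, epochs=5)` call does the training.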
19. Pytorch
PyTorch is an open-source machine learning library for Python, based on
Torch,[1][2][3] used for applications such as natural language
processing.[4] It is primarily developed by Facebook's artificial-intelligence
research group,[5][6][7] and Uber's "Pyro" software for probabilistic
programming is built on it.[8]
PyTorch provides two high-level features:[9]
● Tensor computation (like NumPy) with strong GPU acceleration
● Deep neural networks built on a tape-based autodiff system
Learn more
https://pytorch.org/
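Both high-level features listed above fit in a few lines. A minimal sketch, assuming PyTorch is installed:

```python
import torch

# Tensor computation with automatic differentiation (autograd).
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = 1 + 4 + 9 = 14
y.backward()        # autodiff computes dy/dx = 2x

print(y.item())     # 14.0
print(x.grad)       # tensor([2., 4., 6.])
```

The same `.backward()` call is what drives training of deep neural networks in PyTorch.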
22. The most mature part of deep learning
Convolutional neural networks perform at superhuman levels on many image classification tasks
23. Hello world in machine learning
Open up the notebook Code for MNIST. Explanation of MNIST.ipynb to get started training your first
neural network for classifying digits.
28. Improved Hello world in machine learning
Open up the notebook Code for MNIST. Explanation of MNIST.ipynb to continue with an improved version of your
digit-classifying neural network.
29. Let's get started coding
Open up the folder called Computer Vision
to train your own deep neural network for
image recognition.
30. ML / Deep Learning lecture
Birger Moëll Machine Learning Engineer Ayond AB
birger.moell@ayond.se
Morning
Session 1 Intro to deep learning
09:00-09:30 Introduction to Machine learning / Deep Learning | Talk |
09:30-09:45 Getting your machines ready for machine learning | Code |
09:45-10:00 Coffee and break
Session 2 Hello world
10:00-10:15 Hello World in Machine Learning (MNIST) | Talk |
10:15-10:45 Running your own MNIST | Code |
10:45-11:00 Coffee and break
Session 3 Feedforward networks
11:00-11:15 Feedforward Neural Networks | Talk |
11:15-11:45 Building your own feedforward neural network | Code |
11:45-12:00 Q and A | Interactive
Lunch
12:00-13:00 Lunch
Afternoon
Session 4 Image recognition
13:00-13:15 Image recognition and CNNs | Talk |
13:15-13:45 Building your own convolutional neural network | Code |
13:45-14:00 Coffee and break
Session 5 Natural Language Processing
14:00-14:15 Natural language processing | Talk |
14:15-14:45 Working with language | Code |
14:45-15:00 Coffee and break
Session 6-7 Generative models and time series
15:00-15:15 Generative models and LSTMs | Talk |
15:15-15:45 Trying out GANs and time series | Code |
15:45-16:00 Coffee and break
Session 8 Machine learning in the wild / Deployment
16:00-16:15 Machine learning in the wild | Talk |
16:15-16:45 Serving your own machine learning model | Code |
16:45-17:00 Q and A | Interactive
Editor's Notes
Birger
Solutions, split into test and training and validation.
Dropout.