CNN Models
Convolutional Neural Network Models
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 Convolutional Neural Network (CNN) is a multi-layer neural network.
 A CNN is composed of one or more convolutional layers (often with pooling layers), followed by one or more fully connected layers.
CNN Models
 The convolutional layer acts as a feature extractor that extracts features of the input such as edges, corners, and endpoints.
CNN Models
 The pooling layer reduces the resolution of the feature maps, which makes the network less sensitive to small translations (shifts and distortions).
CNN Models
 The fully connected layer has full connections to all activations in the previous layer.
 The fully connected layers act as the classifier.
CNN Models
Output Image Size = ((ImageSize + 2*Padding - KernelSize) / Stride) + 1
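A minimal Python sketch of this formula (the helper name output_size is ours, for illustration only); it reproduces the conv and pooling examples on the next slides:

def output_size(image_size, kernel_size, stride=1, padding=0):
    # Spatial size of a convolution/pooling output, per the formula above.
    return (image_size + 2 * padding - kernel_size) // stride + 1

print(output_size(6, 3, stride=1, padding=0))     # 4  (Conv 3x3 on a 6x6 image)
print(output_size(4, 3, stride=1, padding=1))     # 4  (padding keeps the size)
print(output_size(7, 3, stride=2, padding=0))     # 3
print(output_size(4, 2, stride=2, padding=0))     # 2  (MaxPooling 2x2)
print(output_size(227, 11, stride=4, padding=0))  # 55 (AlexNet Conv1, shown later)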
CNN Models
 Conv 3x3 with stride=1, padding=0: 6x6 image → 4x4 output
CNN Models
 Conv 3x3 with stride=1, padding=1: 4x4 image → 4x4 output
CNN Models
 Conv 3x3 with stride=2, padding=0: 7x7 image → 3x3 output
CNN Models
 Conv 3x3 with stride=2, padding=1: 5x5 image → 3x3 output
CNN Models
 MaxPooling 2x2 with stride=2: 4x4 image → 2x2 output
CNN Models
 MaxPooling 3x3 with stride=2: 7x7 image → 3x3 output
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 The ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is an image classification challenge: create a model that can correctly classify an input image into one of 1,000 separate object categories.
 Models are trained on 1.2 million training images, with another 50,000 images for validation and 150,000 images for testing.
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 AlexNet achieved a 15.3% Top-5 error rate in the ILSVRC 2012 competition, compared to 26.2% achieved by the second-best entry.
 AlexNet was trained with batch stochastic gradient descent, with specific values for momentum and weight decay.
 AlexNet implements dropout layers in order to combat the problem of overfitting to the training data.
CNN Models
Image → Conv1 → Pool1 → Conv2 → Pool2 → Conv3 → Conv4 → Conv5 → Pool3 → FC1 → FC2 → FC3
 AlexNet has 8 layers (not counting the pooling layers).
 AlexNet uses ReLU for the nonlinearity functions.
 AlexNet was trained on two GTX 580 GPUs for five to six days.
CNN Models
Image 227x227x3 → Conv11-96 → Maxpool → Conv5-256 → Maxpool → Conv3-384 → Conv3-384 → Conv3-256 → Maxpool → FC-4096 → FC-4096 → FC-1000
CNN Models
 AlexNet Model
CNN Models
 Layer 0: Input image
 Size: 227 x 227 x 3
 Memory: 227 x 227 x 3
CNN Models
 Layer 0: 227 x 227 x 3
 Layer 1: Convolution with 96 filters, size 11×11, stride 4, padding 0
 Outcome Size= 55 x 55 x 96
 (227-11)/4 + 1 = 55 is size of outcome
 Memory: 55 x 55 x 96 x 3 (because of ReLU & LRN(Local Response Normalization))
 Weights (parameters) : 11 x 11 x 3 x 96
CNN Models
 Layer 1: 55 x 55 x 96
 Layer 2: Max-Pooling with 3×3 filter, stride 2
 Outcome Size= 27 x 27 x 96
 (55 – 3)/2 + 1 = 27 is size of outcome
 Memory: 27 x 27 x 96
CNN Models
 Layer 2: 27 x 27 x 96
 Layer 3: Convolution with 256 filters, size 5×5, stride 1, padding 2
 Outcome Size = 27 x 27 x 256
 original size is restored because of padding
 Memory: 27 x 27 x 256 x 3 (because of ReLU and LRN)
 Weights: 5 x 5 x 96 x 256
CNN Models
 Layer 3: 27 x 27 x 256
 Layer 4: Max-Pooling with 3×3 filter, stride 2
 Outcome Size = 13 x 13 x 256
 (27 – 3)/2 + 1 = 13 is size of outcome
 Memory: 13 x 13 x 256
CNN Models
 Layer 4: 13 x 13 x 256
 Layer 5: Convolution with 384 filters, size 3×3, stride 1, padding 1
 Outcome Size = 13 x 13 x 384
 the original size is restored because of padding (13+2 -3)/1 +1 =13
 Memory: 13 x 13 x 384 x 2 (because of ReLU)
 Weights: 3 x 3 x 256 x 384
CNN Models
 Layer 5: 13 x 13 x 384
 Layer 6: Convolution with 384 filters, size 3×3, stride 1, padding 1
 Outcome Size = 13 x 13 x 384
 the original size is restored because of padding
 Memory: 13 x 13 x 384 x 2 (because of ReLU)
 Weights: 3 x 3 x 384 x 384
CNN Models
 Layer 6: 13 x 13 x 384
 Layer 7: Convolution with 256 filters, size 3×3, stride 1, padding 1
 Outcome Size = 13 x 13 x 256
 the original size is restored because of padding
 Memory: 13 x 13 x 256 x 2 (because of ReLU)
 Weights: 3 x 3 x 384 x 256
CNN Models
 Layer 7: 13 x 13 x 256
 Layer 8: Max-Pooling with 3×3 filter, stride 2
 Outcome Size = 6 x 6 x 256
 (13 – 3)/2 + 1 = 6 is size of outcome
 Memory: 6 x 6 x 256
CNN Models
 Layer 8: 6x6x256 = 9216 values are fed to the fully connected layers
 Layer 9: Fully Connected with 4096 neurons
 Memory: 4096 x 3 (because of ReLU and Dropout)
 Weights: 4096 x (6 x 6 x 256)
CNN Models
 Layer 9: Fully Connected with 4096 neurons
 Layer 10: Fully Connected with 4096 neurons
 Memory: 4096 x 3 (because of ReLU and Dropout)
 Weights: 4096 x 4096
CNN Models
 Layer 10: Fully Connected with 4096 neurons
 Layer 11: Fully Connected with 1000 neurons
 Memory: 1000
 Weights: 4096 x 1000
CNN Models
 Total (label and softmax not included)
 Memory: 2.24 million
 Weights: 62.37 million
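As a quick check of the totals above, a small Python snippet summing the per-layer weight counts from the previous slides (biases not included):

weights = {
    'conv1': 11 * 11 * 3 * 96,      #    34,848
    'conv2': 5 * 5 * 96 * 256,      #   614,400
    'conv3': 3 * 3 * 256 * 384,     #   884,736
    'conv4': 3 * 3 * 384 * 384,     # 1,327,104
    'conv5': 3 * 3 * 384 * 256,     #   884,736
    'fc1':   (6 * 6 * 256) * 4096,  # 37,748,736
    'fc2':   4096 * 4096,           # 16,777,216
    'fc3':   4096 * 1000,           #  4,096,000
}
print(sum(weights.values()))        # 62,367,776 ≈ 62.37 million weights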
CNN Models
 First use of ReLU
 AlexNet used Norm (LRN) layers
 AlexNet used heavy data augmentation
 AlexNet used dropout of 0.5
 AlexNet used a batch size of 128
 AlexNet used SGD with momentum 0.9
 AlexNet used a learning rate of 1e-2, reduced by a factor of 10 when accuracy plateaued
CNN Models
[227x227x3] INPUT
[55x55x96] CONV1: 96 11x11 filters at stride 4, pad 0
[27x27x96] MAX POOL1: 3x3 filters at stride 2
[27x27x96] NORM1: Normalization layer
[27x27x256] CONV2: 256 5x5 filters at stride 1, pad 2
[13x13x256] MAX POOL2: 3x3 filters at stride 2
[13x13x256] NORM2: Normalization layer
CNN Models
[13x13x384] CONV3: 384 3x3 filters at stride 1, pad 1
[13x13x384] CONV4: 384 3x3 filters at stride 1, pad 1
[13x13x256] CONV5: 256 3x3 filters at stride 1, pad 1
[6x6x256] MAX POOL3: 3x3 filters at stride 2
[4096] FC6: 4096 neurons
[4096] FC7: 4096 neurons
[1000] FC8: 1000 neurons
CNN Models
 Implement AlexNet using TFLearn
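A minimal TFLearn sketch of the AlexNet stack described above (it follows the layer list from the previous slides; the optimizer settings are illustrative, not the exact training recipe from the paper):

import tflearn
from tflearn.layers.core import input_data, dropout, fully_connected
from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.normalization import local_response_normalization
from tflearn.layers.estimator import regression

net = input_data(shape=[None, 227, 227, 3])                # Layer 0: input image
net = conv_2d(net, 96, 11, strides=4, activation='relu')   # Conv1
net = max_pool_2d(net, 3, strides=2)                       # Pool1
net = local_response_normalization(net)
net = conv_2d(net, 256, 5, activation='relu')              # Conv2
net = max_pool_2d(net, 3, strides=2)                       # Pool2
net = local_response_normalization(net)
net = conv_2d(net, 384, 3, activation='relu')              # Conv3
net = conv_2d(net, 384, 3, activation='relu')              # Conv4
net = conv_2d(net, 256, 3, activation='relu')              # Conv5
net = max_pool_2d(net, 3, strides=2)                       # Pool3
net = fully_connected(net, 4096, activation='relu')        # FC1
net = dropout(net, 0.5)
net = fully_connected(net, 4096, activation='relu')        # FC2
net = dropout(net, 0.5)
net = fully_connected(net, 1000, activation='softmax')     # FC3 (1000 classes)
net = regression(net, optimizer='momentum',
                 loss='categorical_crossentropy', learning_rate=0.01)
model = tflearn.DNN(net)
# model.fit(X, Y, n_epoch=10, batch_size=128) on ImageNet-style data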
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 ZFNet was the winner of the ILSVRC 2013 competition with a 14.8% Top-5 error rate.
 ZFNet was built by Matthew Zeiler and Rob Fergus.
 ZFNet has the same global architecture as AlexNet: five convolutional layers, two fully connected layers, and a softmax output layer. The differences are, for example, better-sized convolutional kernels.
CNN Models
 ZFNet used filters of size 7x7 and a decreased stride value in the first layer, instead of the 11x11 filters that AlexNet used.
 ZFNet was trained on a GTX 580 GPU for twelve days.
 Zeiler and Fergus developed a visualization technique named the Deconvolutional Network ("deconvnet") because it maps features back to pixels.
CNN Models
AlexNet but:
• CONV1: change from (11x11 stride 4) to (7x7 stride 2)
• CONV3,4,5: instead of 384, 384, 256 filters, use 512, 1024, 512
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 Keep it deep. Keep it simple.
 VGGNet was the runner-up of the ILSVRC 2014 competition with a 7.3% Top-5 error rate.
 VGGNet's use of only 3x3 filters is quite different from AlexNet's 11x11 filters in the first layer and ZFNet's 7x7 filters.
 Two stacked 3x3 conv layers have an effective receptive field of 5x5 (checked in the snippet below).
 Three stacked 3x3 conv layers have an effective receptive field of 7x7.
 VGGNet was trained on 4 Nvidia Titan Black GPUs for two to three weeks.
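A quick check of the receptive-field arithmetic for stacked stride-1 convolutions (the helper name is ours; each extra 3x3 layer grows the receptive field by 2):

def stacked_receptive_field(num_layers, kernel=3):
    # Effective receptive field of num_layers stacked stride-1 kernel x kernel convs.
    return 1 + num_layers * (kernel - 1)

print(stacked_receptive_field(2))  # 5  -> two 3x3 convs see a 5x5 region
print(stacked_receptive_field(3))  # 7  -> three 3x3 convs see a 7x7 region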
CNN Models
 It is interesting to notice that the number of filters doubles after each maxpool layer. This reinforces the idea of shrinking spatial dimensions but growing depth.
 VGGNet used scale jittering as one data augmentation technique during training.
 VGGNet used ReLU layers after each conv layer and was trained with batch gradient descent.
CNN Models
Image → Conv, Conv, Pool → Conv, Conv, Pool → Conv, Conv, Conv, Pool → Conv, Conv, Conv, Pool → Conv, Conv, Conv, Pool → FC → FC → FC
The early conv blocks extract low-level features, the middle blocks mid-level features, the later blocks high-level features, and the FC layers act as the classifier.
CNN Models
Input 224x224x3 → Conv3-64, Conv3-64, Maxpool → Conv3-128, Conv3-128, Maxpool → Conv3-256, Conv3-256, Conv3-256, Maxpool → Conv3-512, Conv3-512, Conv3-512, Maxpool → Conv3-512, Conv3-512, Conv3-512, Maxpool → FC-4096, FC-4096, FC-1000
VGGNet 16
CNN Models
VGGNet 16
CNN Models
Input 224x224x3 → Conv3-64, Conv3-64, Maxpool → Conv3-128, Conv3-128, Maxpool → Conv3-256, Conv3-256, Conv3-256, Conv3-256, Maxpool → Conv3-512, Conv3-512, Conv3-512, Conv3-512, Maxpool → Conv3-512, Conv3-512, Conv3-512, Conv3-512, Maxpool → FC-4096, FC-4096, FC-1000
VGGNet 19
CNN Models
 Implement VGGNet16 using TFLearn
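A minimal TFLearn sketch of VGGNet 16 following the Conv3/Maxpool stacks above (a sketch assuming 1,000 ILSVRC classes; the training hyperparameters are illustrative):

import tflearn
from tflearn.layers.core import input_data, dropout, fully_connected
from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.estimator import regression

net = input_data(shape=[None, 224, 224, 3])
# Five blocks: 2x Conv3-64, 2x Conv3-128, 3x Conv3-256, 3x Conv3-512, 3x Conv3-512
for n_filters, n_convs in [(64, 2), (128, 2), (256, 3), (512, 3), (512, 3)]:
    for _ in range(n_convs):
        net = conv_2d(net, n_filters, 3, activation='relu')
    net = max_pool_2d(net, 2, strides=2)        # Maxpool after each block
net = fully_connected(net, 4096, activation='relu')
net = dropout(net, 0.5)
net = fully_connected(net, 4096, activation='relu')
net = dropout(net, 0.5)
net = fully_connected(net, 1000, activation='softmax')
net = regression(net, optimizer='momentum',
                 loss='categorical_crossentropy', learning_rate=0.01)
model = tflearn.DNN(net)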
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 GoogleNet is the winner of the ILSVRC 2014 competition with a 6.7% Top-5 error rate.
 GoogleNet was trained on "a few high-end GPUs within a week".
 GoogleNet uses 12x fewer parameters than AlexNet.
 GoogleNet uses average pooling instead of fully connected layers, going from a 7x7x1024 volume to a 1x1x1024 volume. This saves a huge number of parameters.
CNN Models
 GoogleNet uses 9 Inception modules in the whole architecture.
 The 1x1 convolutions (bottleneck convolutions) allow control/reduction of the depth dimension, which greatly reduces the number of parameters by removing the redundancy of correlated filters.
 GoogleNet is a 22-layer deep network.
CNN Models
 GoogleNet uses average pooling instead of an FC layer, going from a 7x7x1024 volume to a 1x1x1024 volume. This saves a huge number of parameters.
 GoogleNet uses inexpensive Conv1 (1x1) layers to compute reductions before the expensive Conv3 and Conv5 layers.
 Each Conv1 is followed by ReLU, which helps reduce overfitting.
CNN Models
 Inception module
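A sketch of a single Inception module in TFLearn (the function inception_module and its argument names are ours; the filter counts in the usage comment are the Inception 3a values from the table on the following slides):

from tflearn.layers.conv import conv_2d, max_pool_2d
from tflearn.layers.merge_ops import merge

def inception_module(net, n1x1, n3x3_reduce, n3x3, n5x5_reduce, n5x5, pool_proj):
    b1 = conv_2d(net, n1x1, 1, activation='relu')          # 1x1 branch
    b2 = conv_2d(net, n3x3_reduce, 1, activation='relu')   # 1x1 bottleneck
    b2 = conv_2d(b2, n3x3, 3, activation='relu')           # then 3x3
    b3 = conv_2d(net, n5x5_reduce, 1, activation='relu')   # 1x1 bottleneck
    b3 = conv_2d(b3, n5x5, 5, activation='relu')           # then 5x5
    b4 = max_pool_2d(net, 3, strides=1)                    # 3x3 pooling
    b4 = conv_2d(b4, pool_proj, 1, activation='relu')      # 1x1 projection
    return merge([b1, b2, b3, b4], mode='concat', axis=3)  # stack along depth

# Example: Inception 3a from the table: 64 + 128 + 32 + 32 -> output depth 256
# net = inception_module(net, 64, 96, 128, 16, 32, 32)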
CNN Models
Input 224x224x3 → Conv7/2-64 → Maxpool3/2 → Conv1 → Conv3/1-192 → Maxpool3/2 → Inception3a (256) → Inception3b (480) → Maxpool3/2 → Inception4a (512) → Inception4b (512) → Inception4c (512) → Inception4d (528) → Inception4e (832) → Maxpool3/2 → Inception5a (832) → Inception5b (1024) → Avgpool7/1 → Dropout 40% → FC-1000 → Softmax-1000
GoogleNet
CNN Models
Type  Size/Stride  Output  Depth  Conv1  Conv3-reduce  Conv3  Conv5-reduce  Conv5  Pool-proj  Params  Ops
Conv 7x7/2 112x112x64 1 - - - - - - 2.7K 34M
Maxpool 3x3/2 56x56x64 0 - - - - - - - -
Conv 3x3/1 56x56x192 2 - 64 192 - - - 112K 360M
Maxpool 3x3/2 28x28x192 0 - - - - - - - -
Inception 3a - 28x28x256 2 64 96 128 16 32 32 159K 128M
Inception 3b - 28x28x480 2 128 128 192 32 96 64 380K 304M
Maxpool 3x3/2 14x14x480 0 - - - - - - - -
Inception 4a - 14x14x512 2 192 96 208 16 48 64 364K 73M
Inception 4b - 14x14x512 2 160 112 224 24 64 64 437K 88M
Inception 4c - 14x14x512 2 128 128 256 24 64 64 463K 100M
Inception 4d - 14x14x528 2 112 144 288 32 64 64 580K 119M
CNN Models
Type  Size/Stride  Output  Depth  Conv1  Conv3-reduce  Conv3  Conv5-reduce  Conv5  Pool-proj  Params  Ops
Inception 4e - 14x14x832 2 256 160 320 32 128 128 840K 170M
Maxpool 3x3/2 7x7x832 0 - - - - - - - -
Inception 5a - 7x7x832 2 256 160 320 32 128 128 1072K 54M
Inception 5b - 7x7x1024 2 384 192 384 48 128 128 1388K 71M
Avgpool 7x7/1 1x1x1024 0 - - - - - - - -
Dropout .4 - 1x1x1024 0 - - - - - - - -
Linear - 1x1x1024 1 - - - - - - 1000K 1M
Softmax - 1x1x1024 0 - - - - - - - -
Total Layers 22
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
 ResNet is the winner of the ILSVRC 2015 competition with a 3.6% Top-5 error rate.
 ResNet is mainly inspired by the philosophy of VGGNet.
 ResNet proposed a residual learning approach to ease the difficulty of training deeper networks, based on the design ideas of Batch Normalization (BN) and small convolutional kernels.
 ResNet is a new 152-layer network architecture.
 ResNet was trained on an 8-GPU machine for two to three weeks.
CNN Models
 Residual network
 Keys:
 No max pooling
 No hidden fc
 No dropout
 Basic design (VGG-style)
 All 3x3 conv (almost)
 Batch normalization
CNN Models
The shortcut connection preserves the base information, so the stacked conv layers only need to model the residual perturbation.
CNN Models
 Residual block
CNN Models
 The Residual Bottleneck consists of a 1×1 layer for reducing dimension, a 3×3 layer, and a 1×1 layer for restoring dimension.
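A minimal TFLearn sketch of such a bottleneck block (the names are ours; batch normalization and the projection shortcut needed when the input depth changes are omitted for brevity):

import tflearn
from tflearn.layers.conv import conv_2d
from tflearn.layers.merge_ops import merge

def bottleneck_block(net, channels):
    shortcut = net                                             # identity path
    net = conv_2d(net, channels, 1, activation='relu')         # 1x1: reduce depth
    net = conv_2d(net, channels, 3, activation='relu')         # 3x3
    net = conv_2d(net, channels * 4, 1, activation='linear')   # 1x1: restore depth
    net = merge([net, shortcut], mode='elemwise_sum')          # add the shortcut
    return tflearn.activations.relu(net)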
CNN Models
Image → Conv7/2-64 → Pool/2 → Conv3-64 (x6) → Conv3/2-128 → Conv3-128 (x7) → Conv3/2-256 → Conv3-256 (x11) → Conv3/2-512 → Conv3-512 (x5) → Avg pool → FC-1000
ResNet 34
CNN Models
Image → Conv7/2-64 → Pool/2 → 2Conv3-64 → 2Conv3-64 → 2Conv3-64 → 2Conv3/2-128 → 2Conv3-128 → 2Conv3-128 → 2Conv3-128 → 2Conv3/2-256 → 2Conv3-256 → 2Conv3-256 → 2Conv3-256 → 2Conv3-256 → 2Conv3-256 → 2Conv3/2-512 → 2Conv3-512 → 2Conv3-512 → Avg pool → FC-1000
ResNet 34 (grouped into residual blocks of two 3x3 convs each)
CNN Models
 ResNet Model
CNN Models
Layer  | Output  | 18-Layer                | 34-Layer                | 50-Layer                          | 101-Layer                          | 152-Layer
Conv-1 | 112x112 | 7x7/2-64 (all variants)
Conv-2 | 56x56   | 3x3 Maxpooling/2 (all variants), then:
       |         | 2x [3x3,64; 3x3,64]     | 3x [3x3,64; 3x3,64]     | 3x [1x1,64; 3x3,64; 1x1,256]      | 3x [1x1,64; 3x3,64; 1x1,256]       | 3x [1x1,64; 3x3,64; 1x1,256]
Conv-3 | 28x28   | 2x [3x3,128; 3x3,128]   | 4x [3x3,128; 3x3,128]   | 4x [1x1,128; 3x3,128; 1x1,512]    | 4x [1x1,128; 3x3,128; 1x1,512]     | 8x [1x1,128; 3x3,128; 1x1,512]
Conv-4 | 14x14   | 2x [3x3,256; 3x3,256]   | 6x [3x3,256; 3x3,256]   | 6x [1x1,256; 3x3,256; 1x1,1024]   | 23x [1x1,256; 3x3,256; 1x1,1024]   | 36x [1x1,256; 3x3,256; 1x1,1024]
Conv-5 | 7x7     | 2x [3x3,512; 3x3,512]   | 3x [3x3,512; 3x3,512]   | 3x [1x1,512; 3x3,512; 1x1,2048]   | 3x [1x1,512; 3x3,512; 1x1,2048]    | 3x [1x1,512; 3x3,512; 1x1,2048]
       | 1x1     | Avgpool, FC-1000, Softmax (all variants)
FLOPs  |         | 1.8x10^9                | 3.6x10^9                | 3.8x10^9                          | 7.6x10^9                           | 11.3x10^9
CNN Models
 Implement ResNet using TFLearn
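A minimal ResNet-34-style sketch using TFLearn's residual_block helper (a sketch only: the block counts follow the 34-layer column of the table above, and the training settings are illustrative, not the paper's exact recipe):

import tflearn
from tflearn.layers.core import input_data, fully_connected
from tflearn.layers.conv import conv_2d, max_pool_2d, residual_block, global_avg_pool
from tflearn.layers.normalization import batch_normalization
from tflearn.layers.estimator import regression

net = input_data(shape=[None, 224, 224, 3])
net = conv_2d(net, 64, 7, strides=2, activation='relu', bias=False)  # Conv7/2-64
net = max_pool_2d(net, 3, strides=2)                                  # Pool/2
net = residual_block(net, 3, 64)                        # 3 blocks of 64
net = residual_block(net, 1, 128, downsample=True)      # downsample, depth 128
net = residual_block(net, 3, 128)                        # 4 blocks of 128 total
net = residual_block(net, 1, 256, downsample=True)
net = residual_block(net, 5, 256)                        # 6 blocks of 256 total
net = residual_block(net, 1, 512, downsample=True)
net = residual_block(net, 2, 512)                        # 3 blocks of 512 total
net = batch_normalization(net)
net = tflearn.activations.relu(net)
net = global_avg_pool(net)                               # Avg pool
net = fully_connected(net, 1000, activation='softmax')   # FC-1000
net = regression(net, optimizer='momentum',
                 loss='categorical_crossentropy', learning_rate=0.1)
model = tflearn.DNN(net)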
CNN Models
Convolutional Neural Network
ILSVRC
AlexNet (2012)
ZFNet (2013)
VGGNet (2014)
GoogleNet (2014)
ResNet (2015)
Conclusion
CNN Models
Top-5 error rate (%):
Before 2012: 26.2 | AlexNet 2012: 15.3 | ZFNet 2013: 14.8 | VGGNet 2014: 7.3 | GoogleNet 2014: 6.7 | ResNet 2015: 3.6
CNN Models
facebook.com/mloey
mohamedloey@gmail.com
twitter.com/mloey
linkedin.com/in/mloey
mloey@fci.bu.edu.eg
mloey.github.io
CNN Models
THANKS FOR
YOUR TIME