1. Jason Tsai (蔡志順) 2019.11.19
Taipei City 3A Center teacher workshop
Convolutional Neural Networks (CNN)
The Past and Present of Convolutional Neural Networks (卷積神經網路的前世今生)
*Picture adopted from https://bit.ly/36yLPJL
2. Copyright Notice:
All figures in this presentation are taken from miscellaneous sources, and their copyrights belong to the original authors. The presentation itself is released under a Creative Commons license.
26. Convolutional layer (卷積層)
Depth (D): number of filters (also called kernels)
Stride (s): the step by which the kernel moves at each slide
Zero padding (p): width of the zero border added around the input
With i the input width and k the kernel width, the width o of each feature map after convolution is:
o = ⌊(i − k + 2p) / s⌋ + 1, and the layer produces D such feature maps.
Example: a 28×28 input with a 5×5 kernel, stride 1, and padding 2 still yields a 28×28 feature map, since (28 − 5 + 2·2)/1 + 1 = 28.
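The output-size formula can be checked directly with PyTorch (used later in these slides); the layer sizes below simply reproduce the 28×28 example:

```python
# Quick check of o = floor((i - k + 2p) / s) + 1 against nn.Conv2d.
import torch
import torch.nn as nn

i, k, p, s = 28, 5, 2, 1
o = (i - k + 2 * p) // s + 1          # floor division matches the formula

conv = nn.Conv2d(in_channels=1, out_channels=6,
                 kernel_size=k, stride=s, padding=p)
x = torch.randn(1, 1, i, i)           # a batch of one 28x28 grayscale image
y = conv(x)
print(o, tuple(y.shape))              # D=6 feature maps, each o x o
```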
31. Local receptive field (區域感受野)
Sparse connectivity (稀疏連結)
[Figure: a fully connected network vs. a convolutional neural network, comparing how each connects to the input layer]
34. Transposed convolution (反卷積), sometimes loosely called deconvolution
stride s = 1 (equivalent padding p′ = k − p − 1):
o′ = (i′ − 1) + k − 2p
stride s ≥ 2 (the input is dilated by s):
o′ = s(i′ − 1) + k − 2p
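The general formula o′ = s(i′ − 1) + k − 2p can be verified with PyTorch's nn.ConvTranspose2d; the sizes below are illustrative, chosen to upsample 14×14 back to 28×28:

```python
# Sketch checking the transposed-convolution size formula with PyTorch.
import torch
import torch.nn as nn

i_, k, s, p = 14, 4, 2, 1
o_ = s * (i_ - 1) + k - 2 * p         # expected output width: 28

tconv = nn.ConvTranspose2d(in_channels=3, out_channels=3,
                           kernel_size=k, stride=s, padding=p)
y = tconv(torch.randn(1, 3, i_, i_))  # 14x14 input, 3 channels
print(o_, tuple(y.shape))             # 28 and (1, 3, 28, 28)
```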
43. Loss function (損失函數) / Objective function (目標函數)
P(x) is the target probability; Q(x) is the predicted probability.
Training minimizes the loss function L: θ* = arg min_θ L(θ)
Mean square error (均方差): L = (1/n) Σ_i (y_i − ŷ_i)²
Cross entropy (交叉熵): H(P, Q) = −Σ_x P(x) log Q(x)
Video tutorial: https://youtu.be/ErfnhcEV1O8
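A minimal sketch of the two losses above, computed by hand and cross-checked against PyTorch's built-ins (the tensors are illustrative):

```python
# Mean square error and cross entropy, by hand vs. torch.nn.functional.
import torch
import torch.nn.functional as F

pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
mse = ((pred - target) ** 2).mean()          # mean square error
assert torch.isclose(mse, F.mse_loss(pred, target))

# Cross entropy: P(x) is the (one-hot) target, Q(x) the softmax prediction.
logits = torch.tensor([[1.0, 2.0, 0.5]])     # raw scores for 3 classes
label = torch.tensor([1])                    # target class index
q = F.softmax(logits, dim=1)
ce = -torch.log(q[0, label[0]])              # -sum_x P(x) log Q(x)
assert torch.isclose(ce, F.cross_entropy(logits, label))
print(mse.item(), ce.item())
```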
50. Implementing the LeNet-5 model in PyTorch

import torch.nn as nn
import torch.nn.functional as F

class LeNet5(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels=1, out_channels=6, kernel_size=5)
        self.conv2 = nn.Conv2d(in_channels=6, out_channels=16, kernel_size=5)
        self.mxpol = nn.MaxPool2d(kernel_size=2, stride=2)
        self.fc1 = nn.Linear(16*5*5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = self.mxpol(x)
        x = F.relu(self.conv2(x))
        x = self.mxpol(x)
        x = x.view(x.size(0), -1)   # flatten to (batch, 16*5*5)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

net = LeNet5()

The activation function here is replaced with ReLU, which is now widely adopted.
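The shape trace below (a self-contained sketch using the same layers) shows why fc1 expects 16*5*5 features: starting from the classic 32×32 LeNet-5 input, each 5×5 conv without padding shrinks the width by 4 and each max pool halves it:

```python
# Shape trace for the LeNet-5 layers above, from a 32x32 input.
import torch
import torch.nn as nn

x = torch.randn(1, 1, 32, 32)
x = nn.Conv2d(1, 6, 5)(x)        # -> (1, 6, 28, 28)
x = nn.MaxPool2d(2, 2)(x)        # -> (1, 6, 14, 14)
x = nn.Conv2d(6, 16, 5)(x)       # -> (1, 16, 10, 10)
x = nn.MaxPool2d(2, 2)(x)        # -> (1, 16, 5, 5)
flat = x.view(1, -1)             # flattened to 16*5*5 = 400 features
print(tuple(x.shape), flat.shape[1])
```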
54. Deep residual networks: ResNet (2015)
Residual block: y = F(x) + x
Paper: https://arxiv.org/abs/1512.03385
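A minimal sketch of the residual block y = F(x) + x, assuming matching input/output shapes (the paper uses batch normalization and a projection shortcut when shapes differ; both are omitted here for brevity):

```python
# A bare-bones residual block: two 3x3 convs plus an identity shortcut.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        out = self.conv2(F.relu(self.conv1(x)))
        return F.relu(out + x)    # the shortcut: y = F(x) + x

y = ResidualBlock(8)(torch.randn(2, 8, 16, 16))
print(tuple(y.shape))             # output shape matches the input
```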