The Perceptron and its Learning Rule
Carlo U. Nicola, SGI FH Aargau
With extracts from publications of:
M. Minsky, MIT; Demuth, U. of Colorado;
D.J.C. MacKay, Cambridge University
WBS WS06-07 2
Perceptron
(i) Single layer ANN
(ii) It works with continuous or binary inputs
(iii) It stores pattern pairs (Ak, Ck), where Ak = (a1^k, …, an^k)
and Ck = (c1^k, …, cn^k) are bipolar valued {-1, +1}.
(iv) It applies the perceptron error-correction procedure, which
always converges when a solution exists.
(v) A perceptron is a classifier.
Bias b is sometimes called θ
Perceptron convergence procedure (1)
Step 1: Initialize weights and thresholds: set the wij and the bj to small
random values in the range [-1, +1].
Step 2: Present a new continuous-valued input p0, p1, …, pn along
with the desired output vector T = (t0, …, tn).
Step 3: Calculate the actual output: aj = f(Σ wij pi + bj), where f(·) = sgn(·)
(the symmetric hard limit).
Step 4: When an error occurs, adapt the weights with:
wij^new = wij^old + η (tj - aj) × pi, where 0 < η ≤ 1 (learning rate)
and: bj^new = bj^old + e, where e = η (tj - aj)
Step 5: Go back to Step 2 and repeat until no errors occur.
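Steps 1-5 can be sketched in Python as follows (a minimal sketch: the helper names `train_perceptron` and `sgn` are mine, not the slide's, and sgn is taken as +1 for positive net input and -1 otherwise):

```python
import numpy as np

def sgn(x):
    # symmetric hard limit: +1 if x > 0, else -1
    return 1 if x > 0 else -1

def train_perceptron(patterns, targets, eta=0.5, max_epochs=100, seed=0):
    """Steps 1-5 of the perceptron convergence procedure, one output unit."""
    rng = np.random.default_rng(seed)
    n = len(patterns[0])
    w = rng.uniform(-1, 1, n)                 # Step 1: small random weights
    b = rng.uniform(-1, 1)                    # ... and bias
    for _ in range(max_epochs):
        errors = 0
        for p, t in zip(patterns, targets):   # Step 2: present an input
            a = sgn(np.dot(w, p) + b)         # Step 3: actual output
            if a != t:                        # Step 4: error-correction
                w = w + eta * (t - a) * np.asarray(p)
                b = b + eta * (t - a)
                errors += 1
        if errors == 0:                       # Step 5: stop when no error
            break
    return w, b

# A linearly separable toy problem (AND in bipolar coding):
X = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
T = [-1, -1, -1, 1]
w, b = train_perceptron(X, T)
assert all(sgn(np.dot(w, p) + b) == t for p, t in zip(X, T))
```

Because AND is linearly separable, the loop always exits through the no-error branch, as the convergence theorem on the next slide promises.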
The important property of the perceptron rule
The algorithm outlined on the preceding slide always
converges to weights that solve the desired classification
problem, provided that such weights exist.
We will see shortly that such weights exist as long as the
classification problem is linearly separable.
Example
The discrimination line between the two classes of a two-input
perceptron is determined by: w1x1 + w2x2 + θ = 0.
We plot this line in the (x1, x2) plane as: x2 = -(w1/w2)x1 - θ/w2
[Figure: Class 1 and Class 2 on either side of the discrimination line.]
In this case the classification problem is cleanly linearly
separable and the perceptron rule works very well.
Example of perceptron learning rule
A perceptron is initialized with the weights w1 = 1, w2 = 2, θ = -2. The
first point, A: x = (0.5, 1.5) with desired output t(x) = +1, is presented to the
network. From Step 3 the network output is computed as +1, so no
weights are adjusted. The same holds for point B, with x = (-0.5, 0.5)
and target t(x) = -1. Point C: x = (0.5, 0.5) gives output -1, while
the expected target is t(x) = +1. According to the learning rule we change the
weights to: w1 = 1.5, w2 = 2.5, θ = -1 (with η = 0.5 and pC = (0.5, 0.5):
Δw1 = 0.5, Δw2 = 0.5, Δθ = 1). Now point C, too, is correctly classified.
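The arithmetic of this example can be checked step by step (sign convention sgn(0) = -1 assumed):

```python
import numpy as np

def sgn(x):
    return 1 if x > 0 else -1

# Initial weights and bias (theta) from the slide:
w = np.array([1.0, 2.0])
theta = -2.0
eta = 0.5

def output(x):
    return sgn(np.dot(w, x) + theta)

# Points A and B are already classified correctly:
assert output((0.5, 1.5)) == +1    # A, target +1
assert output((-0.5, 0.5)) == -1   # B, target -1

# Point C is misclassified, so the rule fires:
xC, tC = np.array([0.5, 0.5]), +1
aC = output(xC)
assert aC == -1
w = w + eta * (tC - aC) * xC       # delta-w = (0.5, 0.5)
theta = theta + eta * (tC - aC)    # delta-theta = 1
assert np.allclose(w, [1.5, 2.5]) and theta == -1.0
assert output(xC) == +1            # C is now classified correctly
```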
Failure of the perceptron ANN to solve the XOR problem
[Figure: a single output unit 3 with inputs i1 (unit 1) and i2 (unit 2) and weights w13, w23.]
The XOR problem (see table):

i1   i2   i1 xor i2
-1   -1      -1
 1   -1       1
-1    1       1
 1    1      -1

Outputi = 1 if Σ wij ii > hi, and -1 if Σ wij ii ≤ hi.
(i) i1 = 1, i2 = 1: w13 + w23 ≤ h3
(ii) i1 = -1, i2 = -1: -(w13 + w23) ≤ h3
(iii) i1 = 1, i2 = -1: w13 - w23 > h3
(iv) i1 = -1, i2 = 1: w23 - w13 > h3
(i), (ii), (iii) and (iv) lead to a contradiction: adding (iii) and (iv) gives
0 > 2h3, so h3 < 0, while adding (i) and (ii) gives 0 ≤ 2h3, so h3 ≥ 0.
The XOR problem is not linearly separable
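The contradiction can also be checked numerically: a brute-force search over a grid of weights and thresholds finds no single unit that reproduces the XOR table (a sketch only; the grid is of course just a finite sample of weight space, while the algebra above rules out all of it):

```python
import numpy as np

# Bipolar XOR truth table: (i1, i2) -> target
xor = {(-1, -1): -1, (1, -1): 1, (-1, 1): 1, (1, 1): -1}

def solves_xor(w13, w23, h3):
    # the unit outputs +1 iff w13*i1 + w23*i2 > h3
    return all((w13 * i1 + w23 * i2 > h3) == (t == 1)
               for (i1, i2), t in xor.items())

# Exhaustive search over a weight grid: no single unit works.
grid = np.linspace(-2, 2, 41)
assert not any(solves_xor(a, b, h) for a in grid for b in grid for h in grid)
```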
Linearly separable and non-separable problems
[Figure: XOR is not a linearly separable classification problem.]
Perceptron net with hidden layer
[Figure: inputs 0 and 1 feed hidden units 2 and 3 through weights v02, v12, v03, v13; the hidden units feed output unit 4 through weights w24, w34.]
Feedforward hidden layer
Now this perceptron network can solve the XOR problem!
Show this.
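One way to show it (the weight values below are one hand-picked solution, not necessarily those of the figure): let hidden unit 2 compute OR, hidden unit 3 compute AND, and the output unit fire when OR is true but AND is not:

```python
def sgn(x):
    return 1 if x > 0 else -1

def xor_net(i1, i2):
    # Hidden layer (hand-picked weights; one possible solution):
    h2 = sgn(i1 + i2 + 1)    # unit 2: OR  (v02 = v12 = 1, bias +1)
    h3 = sgn(i1 + i2 - 1)    # unit 3: AND (v03 = v13 = 1, bias -1)
    # Output unit 4 fires when OR is true but AND is false:
    return sgn(h2 - h3 - 1)  # w24 = 1, w34 = -1, bias -1

for i1, i2, t in [(-1, -1, -1), (1, -1, 1), (-1, 1, 1), (1, 1, -1)]:
    assert xor_net(i1, i2) == t
```

The hidden layer re-maps the four input points so that the two classes become linearly separable for the output unit.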
Transfer function for perceptron
A more complex example
Prototype vectors
We take three parameters to differentiate between a banana and an apple:
shape, texture, and weight.
Shape: {1: round; -1: elliptical}
Texture: {1: smooth; -1: rough}
Weight: {1: > 100 g; -1: < 100 g}
We can now define a prototype banana and apple:
Apple/banana example
Training set:
Initial weights:
First iteration:
η = 1 in this example
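The slide's numeric vectors live in the figures, so the sketch below assumes plausible prototypes from the feature coding above (banana: elliptical, smooth, under 100 g, target -1; apple: round, smooth, over 100 g, target +1). These values and the zero initial weights are assumptions, not read from the slide; only η = 1 is the slide's.

```python
import numpy as np

def sgn(x):
    return 1 if x > 0 else -1

# Assumed prototypes (shape, texture, weight), bipolar coding as above:
banana = np.array([-1, 1, -1])   # elliptical, smooth, < 100 g -> target -1
apple  = np.array([ 1, 1,  1])   # round, smooth, > 100 g      -> target +1
patterns, targets = [banana, apple], [-1, 1]

w, b, eta = np.zeros(3), 0.0, 1.0   # assumed initial weights; eta = 1
for _ in range(10):                 # a couple of iterations suffice here
    errors = 0
    for p, t in zip(patterns, targets):
        a = sgn(np.dot(w, p) + b)
        if a != t:                  # error-correction step
            w = w + eta * (t - a) * p
            b = b + eta * (t - a)
            errors += 1
    if errors == 0:
        break

assert sgn(np.dot(w, banana) + b) == -1
assert sgn(np.dot(w, apple) + b) == +1
```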
Second iteration
Checking the solution (test vectors)
Checking the solution (testing the network)
The net recovers the correct answer from noisy information:
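For instance (a sketch with the same assumed fruit coding and a hypothetical trained weight vector, not the slide's numbers), a banana with the texture bit flipped by noise, or an apple whose weight measurement is missing, is still classified correctly:

```python
import numpy as np

def sgn(x):
    return 1 if x > 0 else -1

# A hypothetical trained solution for the assumed banana/apple coding:
w, b = np.array([2.0, 0.0, 2.0]), 0.0   # shape and weight decide the class

def classify(x):
    return 'apple' if sgn(np.dot(w, x) + b) == 1 else 'banana'

# Clean prototypes:
assert classify(np.array([-1, 1, -1])) == 'banana'
assert classify(np.array([ 1, 1,  1])) == 'apple'

# Noisy inputs: texture bit flipped, or weight measurement missing (0):
assert classify(np.array([-1, -1, -1])) == 'banana'  # rough banana
assert classify(np.array([ 1, 1,  0])) == 'apple'    # apple, unknown weight
```

The decision boundary leaves a margin around each prototype, so small corruptions of the input do not change the sign of the net input.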
Summary
A perceptron net is:
– a feedforward network
– which builds linear decision boundaries
– and uses one artificial neuron for each decision.
