Machine Learning: Lecture 7
Instance-Based Learning (IBL)
(Based on Chapter 8 of Mitchell, T., Machine Learning, 1997)
General Description
- IBL methods learn by simply storing the presented training data.
- When a new query instance is encountered, a set of similar instances is retrieved from memory and used to classify it.
- IBL approaches can construct a different approximation to the target function for each distinct query: they build local rather than global approximations.
- IBL methods can also use complex symbolic representations for instances. This variant is called Case-Based Reasoning (CBR).
Advantages and Disadvantages of IBL Methods
- Advantage: IBL methods are particularly well suited to problems in which the target function is very complex overall, but can still be described by a collection of less complex local approximations.
- Disadvantage I: the cost of classifying new instances can be high, since most of the computation takes place at classification time.
- Disadvantage II: many IBL approaches consider all attributes of the instances, which makes them very sensitive to the curse of dimensionality.
k-Nearest Neighbour Learning
- Assumption: all instances x correspond to points in the n-dimensional space R^n, with x = <a1(x), a2(x), ..., an(x)>.
- Measure used: Euclidean distance:
  d(xi, xj) = sqrt( Σ_{r=1..n} (a_r(xi) - a_r(xj))^2 )
- Training algorithm:
  - For each training example <x, f(x)>, add the example to the list training_examples.
- Classification algorithm: given a query instance xq to be classified:
  - Let x1, ..., xk be the k instances from training_examples that are nearest to xq.
  - Return f^(xq) <- argmax_{v in V} Σ_{i=1..k} δ(v, f(xi)),
    where δ(a, b) = 1 if a = b and δ(a, b) = 0 otherwise.
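The training and classification steps above can be sketched in Python as follows (the dataset and the choice of k are illustrative):

```python
import math
from collections import Counter

def euclidean(xi, xj):
    """Euclidean distance d(xi, xj) = sqrt(sum_r (a_r(xi) - a_r(xj))^2)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_classify(training_examples, xq, k=3):
    """Return the majority label among the k training instances nearest to xq.

    training_examples is a list of (x, f_x) pairs, where x is a tuple of
    real-valued attributes and f_x is the class label. "Training" is just
    storing this list; all the work happens here, at classification time.
    """
    # Sort by distance to the query and keep the k nearest neighbours.
    neighbours = sorted(training_examples, key=lambda ex: euclidean(ex[0], xq))[:k]
    # argmax over labels v of sum_i delta(v, f(x_i)) is a majority vote.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

examples = [((1.0, 1.0), "+"), ((1.2, 0.8), "+"), ((5.0, 5.0), "-"),
            ((5.5, 4.5), "-"), ((6.0, 5.2), "-")]
print(knn_classify(examples, (1.1, 0.9), k=3))  # prints "+"
```

Note that no model is built up front: deleting or adding a training example takes effect immediately at the next query.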
Example
[Figure: a query point xq surrounded by + and - training instances. 1-NN classifies xq as +, while 5-NN classifies it as -. Also shown: the decision surface for 1-NN.]
Distance-Weighted Nearest Neighbour
- k-NN can be refined by weighting the contribution of the k neighbours according to their distance to the query point xq, giving greater weight to closer neighbours.
- To do so, replace the last line of the algorithm with
  f^(xq) <- argmax_{v in V} Σ_{i=1..k} w_i δ(v, f(xi)),
  where w_i = 1 / d(xq, xi)^2.
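A minimal sketch of this refinement, with the conventional special case that an exact match (distance zero) decides the label outright:

```python
from collections import defaultdict

def weighted_knn_classify(training_examples, xq, k=3):
    """Distance-weighted k-NN: each of the k nearest neighbours votes for
    its label with weight w_i = 1 / d(xq, x_i)^2."""
    def dist(xi, xj):
        return sum((a - b) ** 2 for a, b in zip(xi, xj)) ** 0.5

    neighbours = sorted(training_examples, key=lambda ex: dist(ex[0], xq))[:k]
    votes = defaultdict(float)
    for x, label in neighbours:
        d = dist(x, xq)
        if d == 0.0:                  # query coincides with a training point
            return label
        votes[label] += 1.0 / d ** 2  # w_i = 1 / d(xq, x_i)^2
    return max(votes, key=votes.get)
```

With this weighting, a single very close neighbour can outvote several distant ones, so the choice of k matters less than in unweighted k-NN.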
Remarks on k-NN
- k-NN can be used for regression as well as classification.
- k-NN is robust to noise and is generally quite a good classifier.
- k-NN's disadvantage is that it uses all attributes to classify instances.
  - Solution 1: weight the attributes differently (use cross-validation to determine the weights).
  - Solution 2: eliminate the least relevant attributes (again, use cross-validation to determine which attributes to eliminate).
Locally Weighted Regression
- Locally weighted regression generalizes nearest-neighbour approaches by constructing an explicit approximation to f over a local region surrounding xq.
- In such approaches, the contribution of each training example is weighted by its distance to the query point.
An Example: Locally Weighted Linear Regression
- f is approximated by f^(x) = w0 + w1 a1(x) + ... + wn an(x).
- Gradient descent can be used to find the coefficients w0, w1, ..., wn that minimize some error function.
- The error function, however, should differ from the one used for neural networks, since we want a local solution. Different possibilities:
  1. Minimize the squared error over just the k nearest neighbours.
  2. Minimize the squared error over the entire training set, but weight the contribution of each example by some decreasing function K of its distance from xq.
  3. Combine 1 and 2.
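Option 2 above can be sketched as follows: every training example contributes to the squared error, weighted by a Gaussian kernel K of its distance to the query, and the linear coefficients are fitted by gradient descent. The bandwidth tau, learning rate and step count are illustrative choices, not part of the slide's algorithm:

```python
import math

def lwr_predict(X, y, xq, tau=1.0, lr=0.01, steps=2000):
    """Locally weighted linear regression (sketch).

    Minimizes sum_i K(d(xq, x_i)) * (f(x_i) - f^(x_i))^2 by gradient
    descent, where K is a Gaussian kernel of the distance to the query
    and f^(x) = w0 + w1*a1(x) + ... + wn*an(x). A fresh local model is
    fitted for every query point.
    """
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    n = len(X[0])
    w = [0.0] * (n + 1)  # w[0] is the intercept w0
    # Kernel weights: examples near xq dominate the error function.
    kw = [math.exp(-dist(x, xq) ** 2 / (2 * tau ** 2)) for x in X]

    for _ in range(steps):
        grad = [0.0] * (n + 1)
        for x, target, kwi in zip(X, y, kw):
            pred = w[0] + sum(wj * aj for wj, aj in zip(w[1:], x))
            err = kwi * (pred - target)   # distance-weighted residual
            grad[0] += err
            for j, aj in enumerate(x):
                grad[j + 1] += err * aj
        for j in range(n + 1):
            w[j] -= lr * grad[j]
    return w[0] + sum(wj * aj for wj, aj in zip(w[1:], xq))
```

Because the kernel weights depend on xq, the whole fit must be redone per query, which is exactly the high classification-time cost noted for IBL methods earlier.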
Radial Basis Functions (RBF)
- Approximating function:
  f^(x) = w0 + Σ_{u=1..k} w_u K_u(d(x_u, x))
- K_u(d(x_u, x)) is a kernel function that decreases as the distance d(x_u, x) increases (e.g., a Gaussian); k is a user-defined constant that specifies the number of kernel functions to be included.
- Although f^(x) is a global approximation to f(x), the contribution of each kernel function is localized.
- RBF can be implemented as a neural network and trained with a very efficient two-step algorithm:
  - Find the parameters of the kernel functions (e.g., using the EM algorithm).
  - Learn the linear weights of the kernel functions.
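The two-step scheme can be sketched as below. For simplicity the first step is trivialized: the Gaussian centers and a shared width sigma are supplied by the caller rather than found by EM, and the second step solves the resulting linear least-squares problem by gradient descent (lr and steps are illustrative):

```python
import math

def rbf_fit_predict(X, y, centers, xq, sigma=1.0, lr=0.05, steps=3000):
    """Fit f^(x) = w0 + sum_u w_u * K_u(d(x_u, x)) and evaluate it at xq.

    Step 1 (kernel parameters) is assumed done: 'centers' and 'sigma'
    are given. Step 2 learns the linear weights w0..wk, which is an
    ordinary least-squares problem in the kernel features.
    """
    def phi(x):
        # Gaussian kernels: K_u decreases as the distance to center u grows.
        return [math.exp(-sum((a - c) ** 2 for a, c in zip(x, cu))
                         / (2 * sigma ** 2))
                for cu in centers]

    feats = [phi(x) for x in X]
    w0, w = 0.0, [0.0] * len(centers)
    for _ in range(steps):
        g0, g = 0.0, [0.0] * len(centers)
        for f, target in zip(feats, y):
            err = w0 + sum(wu * fu for wu, fu in zip(w, f)) - target
            g0 += err
            for u, fu in enumerate(f):
                g[u] += err * fu
        w0 -= lr * g0
        for u in range(len(w)):
            w[u] -= lr * g[u]
    fq = phi(xq)
    return w0 + sum(wu * fu for wu, fu in zip(w, fq))
```

Unlike k-NN, all the expensive work happens before any query arrives; evaluating f^(xq) only requires k kernel evaluations, which is why RBF sits on the eager side of the lazy/eager divide below.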
Case-Based Reasoning (CBR)
- CBR is similar to k-NN methods in that:
  - Both are lazy learning methods that defer generalization until a query arrives.
  - Both classify new query instances by analyzing similar instances while ignoring instances that are very different from the query.
- However, CBR differs from k-NN methods in that:
  - Instances are not represented as real-valued points; instead, a rich symbolic representation is used.
  - CBR can thus be applied to complex conceptual problems such as the design of mechanical devices or legal reasoning.
Lazy versus Eager Learning
- Lazy methods: k-NN, locally weighted regression, CBR.
- Eager methods: RBF, plus all the methods we studied earlier in the course.
- Differences in computation time:
  - Lazy methods learn quickly but classify slowly.
  - Eager methods learn slowly but classify quickly.
- Differences in classification approach:
  - Lazy methods effectively search a larger hypothesis space than eager methods, because they use many different local functions to form their implicit global approximation to the target function. Eager methods commit at training time to a single global approximation.