INFERENCE IN HMM AND BAYESIAN NETWORKS
DR MINAKSHI PRADEEP ATRE
PVG’S COET, PUNE
REFERENCES
 Journal paper: Zoubin Ghahramani, “An Introduction to Hidden Markov Models and Bayesian Networks” (2001)
 https://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html
 https://www.youtube.com/watch?v=kqSzLo9fenk (Bayes Theorem and HMM)
 https://www.youtube.com/watch?v=YlL0YARYK-o (HMM)
 https://www.youtube.com/watch?v=EqUfuT3CC8s (Markov models)
 https://www.youtube.com/watch?v=5araDjcBHMQ (maths for Markov Model & time series)
 https://aimacode.github.io/aima-exercises/
 https://aimacode.github.io/aima-exercises/bayesian-learning-exercises/
 15-381: Artificial Intelligence (ppt)
WHY WE NEED PROBABILISTIC MODELS?
First-order logic (FOL) supports inference, but it is unable to handle: 1) uncertain knowledge and 2) probabilistic reasoning. Probabilistic models fill this gap.
ONTOLOGY
 An ontology is a set of concepts and categories in a subject area or domain that shows their properties and the relations between them.
WHY WE NEED PROBABILISTIC MODELS?
 FOL is used for inference
 Its automated inference methods include:
 Forward chaining
 Backward chaining
 Resolution refutation
Propositional logic has a very limited ontology, making only the commitment that the world consists of facts.
First-order logic overcomes the limitation of this ontology (its ontology includes objects, properties, and relations) and has different inference methods.
Both are still unable to handle uncertain knowledge and probabilistic reasoning.
CONTENTS : PROBABILISTIC MODELS FOR INFERENCE
 Learning and inference in the hidden Markov model (HMM) in the context of Bayesian networks
 Uncertainty and methods
 Bayesian probability and belief networks
 Probabilistic reasoning
 Generative models: Bayesian networks
 Inference in Bayesian networks
 Temporal models: hidden Markov models
A probabilistic model represents knowledge and data as a fixed set of random variables with a joint probability distribution.
UNCERTAINTY AND METHODS
BAYESIAN PROBABILITY
 Solved examples on Bayesian Probability
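The solved examples themselves are not reproduced in this export. As a minimal sketch, Bayes' rule can be applied numerically; the diagnostic-test numbers below are purely illustrative, not from the slides:

```python
# Bayes' rule: P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]
# Hypothetical test: 1% prevalence, 95% sensitivity, 10% false-positive rate.
def bayes(prior, likelihood, false_positive_rate):
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

posterior = bayes(prior=0.01, likelihood=0.95, false_positive_rate=0.10)
print(round(posterior, 4))  # 0.0876: a positive test still leaves ~9% probability
```

Note how the low prior dominates: even a fairly accurate test yields a small posterior, which is the usual lesson of such worked examples.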
SUMMARY OF BASIC PROBABILITY FORMULAE
LIMITATIONS OF BAYESIAN NETWORKS
 What’s wrong with Bayesian networks?
 Bayesian networks are very useful for modeling joint distributions
 But they have their limitations:
 They cannot account for temporal / sequence models
 They must be DAGs (no self-loops or any other cycles)
CONCLUSION (BAYESIAN LEARNING METHODS)
 Bayesian learning methods are firmly based on probability theory
and exploit advanced methods developed in statistics.
 Naïve Bayes is a simple generative model that works fairly well in
practice.
 A Bayesian network allows specifying a limited set of dependencies
using a directed graph.
 Inference algorithms allow determining the probability of values
for query variables given values for evidence variables.
WHAT ARE BELIEF NETWORKS?
 Definition: efficient reasoning with probability is so new that there is essentially one main approach: belief networks
 Conditional independence information is a vital and robust way to structure information about an
uncertain domain
 Belief networks are a natural way to represent conditional independence information
 The links between nodes represent the qualitative aspects of the domain, and the conditional probability
tables represent the quantitative aspects
 A belief network is a complete representation for the joint probability distribution for the domain, but is
often exponentially smaller in size
 Inference in belief networks means computing the probability distribution of a set of query variables,
given a set of evidence variables.
 Belief networks can reason causally, diagnostically, in mixed mode, or intercausally.
 No other uncertain reasoning mechanism can handle all these modes
 The complexity of belief network inference depends on the network structure
 In polytrees (singly connected networks), the computation time is linear in the size of the network
 There are various inference techniques for general belief networks, all of which have exponential complexity in
the worst case.
 In real domains, the local structure tends to make things more feasible, but care is needed to construct a
tractable network with more than a hundred nodes
 It is also possible to use approximation techniques, including stochastic simulation, to get an estimate of the
true probabilities with less computation
 Various alternative systems for reasoning with uncertainty have been suggested. All the truth-functional
systems have serious problems with mixed or intercausal reasoning
 In the context of using Bayes' rule, conditional independence relationships among variables can simplify
the computation of query results and greatly reduce the number of conditional probabilities that need
to be specified.
 We use a data structure called a belief network to represent the dependence between variables and to
give a concise specification of the joint probability distribution.
 A belief network is a graph in which the following holds:
 1. A set of random variables makes up the nodes of the network.
 2. A set of directed links or arrows connects pairs of nodes. The intuitive meaning of an arrow from node X to
node Y is that X has a direct influence on Y.
 3. Each node has a conditional probability table that quantifies the effects that the parents have on the node. The
parents of a node are all those nodes that have arrows pointing to it.
 4. The graph has no directed cycles (hence is a directed, acyclic graph, or DAG).
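Points 1-4 imply that the joint distribution factorizes as the product of each node's CPT entry given its parents. A minimal sketch with a hypothetical two-node network (the CPT values are made up for illustration):

```python
# Two-node network Rain -> WetGrass; joint factorizes as P(R, W) = P(R) P(W | R).
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {True:  {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    # product of each node's CPT entry given its parents
    return P_rain[rain] * P_wet_given_rain[rain][wet]

# Sanity check: the joint sums to 1 over all assignments.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
print(total)  # 1.0
```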
A TYPICAL BELIEF NETWORK
 Consider the following situation.
 You have a new burglar alarm installed at home.
 It is fairly reliable at detecting a burglary, but also responds
on occasion to minor earthquakes. (This example is due to
Judea Pearl, a resident of Los Angeles; hence the acute
interest in earthquakes.)
 You also have two neighbors, John and Mary, who have
promised to call you at work when they hear the alarm.
 John always calls when he hears the alarm, but sometimes
confuses the telephone ringing with the alarm and calls
then, too.
 Mary, on the other hand, likes rather loud music and
sometimes misses the alarm altogether.
 Given the evidence of who has or has not called, we would
like to estimate the probability of a burglary.
 This simple domain is described by the belief network in
Figure
DISCUSSION (PAGE 456 RUSSELL & NORVIG)
 The topology of the network can be thought of as an abstract knowledge
base that holds in a wide variety of different settings, because it represents
the general structure of the causal processes in the domain rather than any
details of the population of individuals.
 In the case of the burglary network, the topology shows that burglary and
earthquakes directly affect the probability of the alarm going off, but whether
or not John and Mary call depends only on the alarm—the network thus
represents our assumption that they do not perceive any burglaries directly,
and they do not feel the minor earthquakes.
BELIEF NETWORKS : LEARNING IN BELIEF NETWORKS
 There are four kinds of belief networks, depending upon whether the structure of the network is known or
unknown, and whether the variables in the network are observable or hidden.
 known structure, fully observable -- In this case the only learnable part is the conditional probability tables.
These can be estimated directly using the statistics of the sample data set.
 unknown structure, fully observable -- Here the problem is to reconstruct the network topology. The
problem can be thought of as a search through structure space, and fitting data to each structure reduces to
the fixed-structure problem, so the MAP or ML probability value can be used as a heuristic in hill-climbing or
SA search.
 known structure, hidden variables -- This is analogous to neural network learning.
 unknown structure, hidden variables -- When some variables are unobservable, it becomes difficult to apply
prior techniques for recovering structure, because they require averaging over all possible values of the unknown
variables. No good general algorithms are known for handling this case.
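The first case (known structure, fully observable) can be sketched as simple counting: each CPT entry is estimated directly from the statistics of the sample data set. The network (Rain -> Wet) and the data below are hypothetical:

```python
from collections import Counter

# Fully observed samples of (rain, wet); made-up data for illustration.
data = [
    (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, True),
]

joint_counts = Counter(data)
rain_counts = Counter(r for r, _ in data)

def p_wet_given_rain(rain):
    # maximum-likelihood CPT entry: count(rain, wet) / count(rain)
    return joint_counts[(rain, True)] / rain_counts[rain]

print(p_wet_given_rain(True))   # 2/3
print(p_wet_given_rain(False))  # 1/3
```

With hidden variables the counts themselves are unavailable, which is why that case needs iterative methods instead of direct estimation.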
HMM: TEMPORAL MODELS
Inference has 3 factors: filtering, prediction, and smoothing.
HMM: TEMPORAL MODELS
 Inference in temporal models:
 filtering
 prediction
 smoothing
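Filtering, the first of these, can be sketched with the forward algorithm on a two-state HMM. The numbers below are the umbrella-world values commonly used to illustrate this material; they are assumptions, not taken from the slides:

```python
# Two hidden states: 0 = Rain, 1 = NoRain; observation 0 = umbrella seen.
T = [[0.7, 0.3],        # transition model P(state_t | state_{t-1})
     [0.3, 0.7]]
E = [[0.9, 0.1],        # sensor model P(obs | state)
     [0.2, 0.8]]
prior = [0.5, 0.5]

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

def filter_step(belief, obs):
    # predict one step ahead, then weight by the evidence likelihood
    predicted = [sum(belief[i] * T[i][j] for i in range(2)) for j in range(2)]
    return normalize([E[j][obs] * predicted[j] for j in range(2)])

belief = prior
for obs in [0, 0]:       # umbrella observed two days in a row
    belief = filter_step(belief, obs)
print([round(b, 3) for b in belief])  # [0.883, 0.117]
```

Prediction is the same recursion without the evidence-weighting step, and smoothing combines this forward pass with a symmetric backward pass.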
INFERENCE: MATHEMATICAL REPRESENTATION
MARKOV CHAIN
MARKOV CHAIN & EXAMPLE
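The example slide is not reproduced in this export; a minimal sketch with made-up transition probabilities illustrates the key property of a Markov chain, namely that repeatedly applying the transition matrix converges to a stationary distribution regardless of the starting state:

```python
# Two-state weather chain; rows of P sum to 1 (illustrative numbers).
P = [[0.9, 0.1],    # P(next state | current state)
     [0.5, 0.5]]

def step(dist):
    # one step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

dist = [1.0, 0.0]           # start in state 0 with certainty
for _ in range(50):
    dist = step(dist)
print([round(d, 4) for d in dist])  # [0.8333, 0.1667], the stationary distribution
```

The HMM above is exactly such a chain over hidden states, with observations layered on top through the sensor model.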
WHAT’S NEXT: UNIT 4 LEARNING
LEARNING ALGORITHMS
1. Bayesian (naïve Bayes)
2. Decision tree
3. Neural networks
THANK YOU
