Implementation of Variational Inference for Non-Parametric Hidden Markov Models
1. Probabilistic Programming of Non-Parametric Bayesian Dynamic Discrete State Models
University of Southampton
Southampton, UK
14th Oct 2013
James McInerney
jem1c10@ecs.soton.ac.uk
2. Applications
Across domains:
– Exploration (visual summary of large amounts of data; answer “what if?” questions by modifying parameters)
– Inference (understand the hidden structure of the data; fill in missing data)
– Prediction
3. Discrete State Models
– Each observation explained by a latent state
-e.g., GPS observation at (50.931157, -1.401897) explained by me
being at “home”
– Mixture model
-e.g., my next location is independent of previous locations
– Hidden Markov model
-e.g., my next location depends on previous location (first-order)
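To make the distinction concrete, here is a minimal sketch (not from the talk's codebase) of sampling a latent state sequence from a mixture model versus a first-order HMM. The state labels, weights, and transition matrix are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 3  # number of discrete states, e.g. "home", "work", "gym"

# Mixture model: each state is drawn independently from mixing weights pi,
# so the next location does not depend on the previous one.
pi = np.array([0.5, 0.3, 0.2])
z_mix = rng.choice(K, size=10, p=pi)

# First-order HMM: each state depends on the previous state through a
# row-stochastic transition matrix A.
A = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.2, 0.5]])
z_hmm = np.empty(10, dtype=int)
z_hmm[0] = rng.choice(K, p=pi)
for t in range(1, 10):
    z_hmm[t] = rng.choice(K, p=A[z_hmm[t - 1]])
```

Each latent state would then generate an observation (e.g., a GPS reading) through an emission distribution.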
4. Non-parametric Bayes
What number of states to specify?
– Traditional solution: model selection/averaging
– Have to consider many different numbers of states: slow
5. Non-parametric Bayes
What number of states to specify?
– Traditional solution: model selection/averaging
– Have to consider many different numbers of states: slow
– More elegant: Dirichlet process (non-parametric Bayes)
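A minimal sketch of why the Dirichlet process avoids fixing the number of states: its mixing weights can be drawn by the standard stick-breaking (GEM) construction, where weight k is v_k times the stick left over after the first k-1 breaks, with v_k ~ Beta(1, alpha). The function name and truncation level below are illustrative, not from the talk:

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking draw of K Dirichlet-process mixing weights."""
    v = rng.beta(1.0, alpha, size=K)          # break proportions v_k
    stick_left = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * stick_left                      # weight_k = v_k * prod_{j<k}(1 - v_j)

rng = np.random.default_rng(0)
weights = stick_breaking(alpha=2.0, K=20, rng=rng)
# Most of the mass concentrates on the first few components, so the
# effective number of states is inferred from data rather than fixed.
```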
6. Example Discrete State Models
Example problems potentially solvable with NP-HMMs:
- You have appliance usage data for a home, and want to
predict if and when appliances will be used the next day
- You have location data of teams in AtomicOrchid and want
to find team assignments based on proximity (dealing with
noise of GPS and ephemeral proximity)
- You have user activity on a website or application and want
to infer their state of mind and predict future actions
7. Probabilistic Programming
– Specify model using domain specific language
– Benefit: run inference over unknown model parameters at the click of a button
– E.g., Infer.NET, Church, Stan, Alchemy
8. Probabilistic Programming
Limitations (of Infer.NET):
– Does not handle Dirichlet process models (non-parametric Bayes for discrete states)
– Very limited handling of HMMs
9. My Limited, Small Scale Answer
– Probabilistic programming for non-parametric discrete state
models (HMMs, mixture models) in Python
– User can specify any number of sensors on data:
- Multivariate Gaussian sensor (any # dimensions)
- Discrete sensor
- von Mises (periodic data)
- mixture of Gaussians
– Implemented using variational approximation (= fast)
10. Define Your Own Sensor
Implement two methods:
class Sensor(object):
    def __init__(self, K, hyperparams):
        self._K = K  # truncation parameter
        self._hyperparams = hyperparams

    def loglik(self, X):
        """Given data set X, return the unnormalised log likelihood of
        each of the N data points under each of the K components,
        as an (N, K) matrix."""
        raise NotImplementedError

    def m(self, X, exp_z):
        """Given the expected value of z, update the variational
        parameters of each component (w.r.t. this sensor)."""
        raise NotImplementedError
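For illustration, here is one hypothetical sensor that fills in the two methods: a discrete sensor with a symmetric Dirichlet prior, using the standard mean-field updates for discrete emissions. The class name, the `V` and `alpha0` parameters, and the update equations are assumptions for this sketch, not the talk's actual implementation:

```python
import numpy as np
from scipy.special import digamma

class ExampleDiscreteSensor(object):
    def __init__(self, K, V, alpha0=1.0):
        self._K = K      # truncation parameter (max number of components)
        self._V = V      # number of discrete symbols the sensor can emit
        self._alpha0 = alpha0
        self._alpha = np.full((K, V), alpha0)  # variational Dirichlet params

    def loglik(self, X):
        # X: (N,) array of integer symbols in [0, V).
        # Returns an (N, K) matrix of E_q[log p(x_n | z_n = k)],
        # the expected log likelihood under the variational posterior.
        elog = digamma(self._alpha) - digamma(self._alpha.sum(1, keepdims=True))
        return elog[:, X].T  # (K, N) -> (N, K)

    def m(self, X, exp_z):
        # Variational M-step: add expected counts for each symbol v,
        # weighted by the responsibilities exp_z (an (N, K) matrix),
        # to the symmetric prior alpha0.
        counts = np.zeros((self._K, self._V))
        for v in range(self._V):
            counts[:, v] = exp_z[X == v].sum(axis=0)
        self._alpha = self._alpha0 + counts
```

The same pattern (expected sufficient statistics in `m`, expected log likelihoods in `loglik`) would apply to the Gaussian and von Mises sensors, with conjugate updates in place of the Dirichlet ones.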
11. Live Demo
4 lines of code to run inference on a custom, multi-modal,
non-parametric HMM:
K = <truncation parameter>
gSensor = sensors.MVGaussianSensor(K, XDim)
dSensor = sensors.DiscreteSensor(K)
exp_z, _, exp_a, Zmax = general_inf.infer(N, [X, Y], K, [gSensor, dSensor])