@nanmeng
2016-06-22T04:35:48.000000Z
Probabilistic_Graphical_Models
CMU
notes
Class link: Probabilistic Graphical Models (Spring 2014) - Eric Xing
Notice: we need both Markov Random Fields and Bayesian Networks, because each formalism can represent families of distributions that the other cannot.
Also called chain graphs:
- Nodes can be disjointly partitioned into several chain components
- An edge within the same chain component must be undirected
- An edge between two nodes in different chain components must be directed
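The two structural rules above can be checked mechanically. Below is a minimal sketch (the function name and graph representation are my own invention, not from the course): a chain graph is given as a partition into chain components plus sets of undirected and directed edges, and both rules are verified.

```python
def is_valid_chain_graph(components, undirected, directed):
    """components: list of disjoint sets of nodes (the chain components);
    undirected: set of frozensets {u, v}; directed: set of (u, v) pairs."""
    comp_of = {}
    for i, comp in enumerate(components):
        for node in comp:
            if node in comp_of:        # the partition must be disjoint
                return False
            comp_of[node] = i
    # Rule: an edge within the same chain component must be undirected.
    for u, v in directed:
        if comp_of[u] == comp_of[v]:
            return False
    # Rule: an edge between different chain components must be directed.
    for edge in undirected:
        u, v = tuple(edge)
        if comp_of[u] != comp_of[v]:
            return False
    return True

# Example: one chain component {A, B} joined by an undirected edge,
# another component {C}, and a directed edge B -> C between them.
print(is_valid_chain_graph(
    [{"A", "B"}, {"C"}],
    undirected={frozenset({"A", "B"})},
    directed={("B", "C")},
))  # True
```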
Typical tasks: computing the likelihood of observed evidence, the posterior marginals of hidden variables, and the most probable assignment of the hidden variables.
The basic algorithm for exact inference is the elimination algorithm. We would also like to know whether this idea extends to other cases ......
First, the Hidden Markov Model. To compute the likelihood $P(x_{1:T})$, we eliminate the hidden states in order $z_1, z_2, \dots$, starting from

$$\alpha_1(z_1) = P(z_1)\,P(x_1 \mid z_1),$$

where $\alpha_t(z_t)$ denotes the intermediate factor produced by elimination. Likewise, we further eliminate $z_t$ by

$$\alpha_{t+1}(z_{t+1}) = P(x_{t+1} \mid z_{t+1}) \sum_{z_t} \alpha_t(z_t)\, P(z_{t+1} \mid z_t).$$
And amazingly, this is exactly the idea behind the most famous algorithm for HMMs: the forward-backward algorithm (it can be derived from the elimination algorithm!).
Running the elimination in the reverse order $z_T, z_{T-1}, \dots$ gives the backward messages $\beta_t(z_t)$ — this is the backward information...
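The two elimination orders can be run together to recover posterior marginals. A self-contained sketch on a tiny HMM (all the numbers here are invented for illustration): eliminating left-to-right gives the forward messages $\alpha$, eliminating right-to-left gives the backward messages $\beta$, and their product gives $P(z_t \mid x_{1:T})$.

```python
import numpy as np

pi = np.array([0.6, 0.4])                 # initial state distribution P(z_1)
A = np.array([[0.7, 0.3], [0.2, 0.8]])    # A[i, j] = P(z_{t+1}=j | z_t=i)
B = np.array([[0.9, 0.1], [0.3, 0.7]])    # B[i, k] = P(x_t=k | z_t=i)
obs = [0, 1, 1]                           # an observed sequence

T = len(obs)
alpha = np.zeros((T, 2))
alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):                     # forward elimination: sum out z_{t-1}
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

beta = np.ones((T, 2))
for t in range(T - 2, -1, -1):            # backward elimination: sum out z_{t+1}
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

evidence = alpha[-1].sum()                # the likelihood P(x_{1:T})
posterior = alpha * beta / evidence       # P(z_t | x_{1:T}) for every t
print(posterior)
```

Note that either message alone suffices for the likelihood, but the posterior at an interior time step needs both — which is exactly why the combined algorithm is called forward-backward.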
And this elimination algorithm also applies to Conditional Random Fields:
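For a linear-chain CRF the same forward-backward recursions go through unchanged, except that the factors are arbitrary non-negative potentials rather than probabilities, so the final sum gives the partition function $Z$ instead of a likelihood. A minimal sketch with invented random potentials (the variable names are my own):

```python
import numpy as np

K, T = 2, 4                                    # number of labels, chain length
rng = np.random.default_rng(0)
unary = rng.uniform(0.5, 2.0, size=(T, K))     # node potentials psi_t(z_t)
pair = rng.uniform(0.5, 2.0, size=(K, K))      # edge potentials psi(z_t, z_{t+1})

alpha = np.zeros((T, K))
alpha[0] = unary[0]
for t in range(1, T):                          # forward elimination
    alpha[t] = (alpha[t - 1] @ pair) * unary[t]

beta = np.ones((T, K))
for t in range(T - 2, -1, -1):                 # backward elimination
    beta[t] = pair @ (unary[t + 1] * beta[t + 1])

Z = alpha[-1].sum()                            # partition function
marginals = alpha * beta / Z                   # marginal P(z_t) under the CRF
print(Z)
```

In a conditional model the potentials would themselves be functions of the observed input; that only changes how `unary` and `pair` are built, not the elimination itself.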
In general, we can view the task at hand as that of computing the value of an expression of the form

$$\sum_{\mathbf{z}} \prod_{\phi \in \mathcal{F}} \phi\big(\mathbf{z}_{\mathrm{Scope}[\phi]}\big),$$

where $\mathcal{F}$ is a set of factors — the sum-product inference task.
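Such an expression can always be evaluated naively by enumerating every joint assignment and summing the products — which is what elimination improves upon. A toy sketch (the three factors and variable names are invented; here they happen to be the CPDs of a small chain, so the sum is 1):

```python
import itertools

variables = ["a", "b", "c"]               # each binary
factors = [                               # (scope, table) pairs
    (("a",), {(0,): 0.4, (1,): 0.6}),
    (("a", "b"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}),
    (("b", "c"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.3, (1, 1): 0.7}),
]

def evaluate(variables, factors):
    """Sum over all joint assignments of the product of all factors."""
    total = 0.0
    for values in itertools.product((0, 1), repeat=len(variables)):
        assignment = dict(zip(variables, values))
        prod = 1.0
        for scope, table in factors:
            prod *= table[tuple(assignment[v] for v in scope)]
        total += prod
    return total

print(evaluate(variables, factors))  # ~1.0, since these factors are CPDs
```

The loop visits all $2^n$ assignments, which is exactly the exponential blow-up that a good elimination order avoids.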
General idea:
- Write the query in the form $P(X_1, \mathbf{e}) = \sum_{x_n} \cdots \sum_{x_2} \prod_i P(x_i \mid \mathbf{pa}_i)$
- This suggests an "elimination order" of the latent variables to be marginalized out
- Iteratively: move all irrelevant terms outside of the innermost sum, perform the innermost sum to get a new term, and insert the new term into the product
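The iterative step above can be sketched directly in code (the helper names and the toy factors are my own, not the course's): for each variable in the elimination order, multiply only the factors that mention it, sum it out, and put the resulting factor back.

```python
import itertools

def multiply(f, g):
    """Pointwise product of two factors (scope, table) over binary variables."""
    fs, ft = f
    gs, gt = g
    scope = tuple(dict.fromkeys(fs + gs))          # union of scopes, order kept
    table = {}
    for values in itertools.product((0, 1), repeat=len(scope)):
        a = dict(zip(scope, values))
        table[values] = ft[tuple(a[v] for v in fs)] * gt[tuple(a[v] for v in gs)]
    return scope, table

def sum_out(f, var):
    """Marginalize one variable out of a factor."""
    scope, table = f
    i = scope.index(var)
    new_table = {}
    for values, p in table.items():
        key = values[:i] + values[i + 1:]
        new_table[key] = new_table.get(key, 0.0) + p
    return scope[:i] + scope[i + 1:], new_table

def eliminate(factors, order):
    for var in order:                              # the "elimination order"
        relevant = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]  # irrelevant terms
        prod = relevant[0]
        for f in relevant[1:]:                     # product of relevant terms
            prod = multiply(prod, f)
        factors.append(sum_out(prod, var))         # insert the new term
    return factors

# Toy query P(a): eliminate c, then b, from P(a) P(b|a) P(c|b).
factors = [
    (("a",), {(0,): 0.4, (1,): 0.6}),
    (("a", "b"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}),
    (("b", "c"), {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.3, (1, 1): 0.7}),
]
remaining = eliminate(factors, order=["c", "b"])
marginal = remaining[0]
for f in remaining[1:]:                            # combine whatever is left
    marginal = multiply(marginal, f)
print(marginal)  # a factor over ("a",) with values ~0.4 and ~0.6
```

Each intermediate factor is only as large as the union of the scopes it touches, so a good elimination order keeps the tables small — on the HMM chain this is exactly what produces the forward messages.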