@nanmeng 2016-06-22T04:35:48.000000Z · Word count: 2499 · Reads: 1173

Probabilistic Graphical Models (CMU) - 4

Probabilistic_Graphical_Models CMU notes


The class link: Probabilistic Graphical Models (Spring 2014) - Eric Xing

Lecture notes 4

Notice: we need both Markov Random Fields and Bayesian Networks, since each can represent subsets of distributions that the other cannot.

Partially Directed Acyclic Graphs

- Also called chain graphs
- Nodes can be disjointly partitioned into several chain components
- An edge within the same chain component must be undirected
- An edge between two nodes in different chain components must be directed
PGM4_1
PGM4_2
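The two edge rules above can be checked mechanically: chain components are exactly the connected components of the undirected subgraph, and every directed edge must cross two different components. A minimal sketch (the graph, function name, and node labels are invented for illustration):

```python
# Sketch: partition a chain graph's nodes into chain components and check
# the edge rules above. The toy graph A->B, B-C (undirected), C->D is made up.
from collections import defaultdict

def chain_components(nodes, undirected, directed):
    """Chain components = connected components of the undirected subgraph."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for a, b in undirected:                # union over undirected edges only
        parent[find(a)] = find(b)
    comps = defaultdict(set)
    for v in nodes:
        comps[find(v)].add(v)
    # every directed edge must connect two different chain components
    for a, b in directed:
        assert find(a) != find(b), f"directed edge {a}->{b} inside a component"
    return list(comps.values())

print(chain_components(["A", "B", "C", "D"],
                       undirected=[("B", "C")],
                       directed=[("A", "B"), ("C", "D")]))
```

Here the undirected edge B-C merges B and C into one chain component, while A and D remain singletons.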

Typical tasks:

Approaches to inference

Approximate inference techniques

The basic algorithm for exact inference is the elimination algorithm.
PGM4_3
PGM4_4
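To see why elimination helps, compare summing out the variables of a chain one at a time against brute-force enumeration of the joint. A toy sketch with invented random CPTs (not from the lecture):

```python
# Sketch: marginalize the last node of a chain p(x1) p(x2|x1) ... p(xn|x_{n-1})
# by pushing each sum inward (the elimination idea), versus enumerating all
# k^n joint configurations. The CPTs below are random, for illustration only.
import itertools
import random

random.seed(0)
k, n = 3, 5                                # k states per variable, n variables

prior = [random.random() for _ in range(k)]
s = sum(prior)
prior = [p / s for p in prior]
# cpts[t][i][j] = p(x_{t+2} = j | x_{t+1} = i), each row normalized
cpts = []
for _ in range(n - 1):
    rows = []
    for _ in range(k):
        r = [random.random() for _ in range(k)]
        z = sum(r)
        rows.append([v / z for v in r])
    cpts.append(rows)

def marginal_naive(j):
    """p(x_n = j) by brute force: O(k^n) terms."""
    total = 0.0
    for xs in itertools.product(range(k), repeat=n):
        if xs[-1] != j:
            continue
        p = prior[xs[0]]
        for t in range(n - 1):
            p *= cpts[t][xs[t]][xs[t + 1]]
        total += p
    return total

def marginal_eliminate():
    """Push each sum inward: one k-vector message, O(n k^2) work total."""
    msg = prior[:]
    for t in range(n - 1):
        msg = [sum(msg[i] * cpts[t][i][j] for i in range(k)) for j in range(k)]
    return msg

elim = marginal_eliminate()
assert all(abs(elim[j] - marginal_naive(j)) < 1e-12 for j in range(k))
print(elim)
```

The message passed along the chain is exactly the intermediate factor the elimination algorithm creates at each step.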

Complexity:

And we also want to know whether this idea can be extended to other cases ......

First, the Hidden Markov Model:
PGM4_5

where

Likewise, we further eliminate by

Remarkably, this is the core idea behind the famous forward-backward algorithm, which can thus be derived directly from the elimination algorithm.

This is the backward information...
PGM4_6
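The recursions referred to above were rendered as images in the original; as a reconstruction in standard HMM notation (initial distribution $\pi_j$, transition $a_{ij} = p(x_t = j \mid x_{t-1} = i)$, emission $b_j(y_t) = p(y_t \mid x_t = j)$), eliminating past states left to right and future states right to left gives:

```latex
% forward message: eliminate x_1, ..., x_{t-1}
\alpha_t(j) = p(y_{1:t},\, x_t = j)
            = \Big[ \sum_i \alpha_{t-1}(i)\, a_{ij} \Big]\, b_j(y_t),
\qquad \alpha_1(j) = \pi_j\, b_j(y_1)

% backward message: eliminate x_T, ..., x_{t+1}
\beta_t(i) = p(y_{t+1:T} \mid x_t = i)
           = \sum_j a_{ij}\, b_j(y_{t+1})\, \beta_{t+1}(j),
\qquad \beta_T(i) = 1
```

Combining the two messages yields the smoothed posterior $p(x_t = j \mid y_{1:T}) \propto \alpha_t(j)\,\beta_t(j)$.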

The elimination algorithm also applies to Conditional Random Fields:
PGM4_7
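As a hedged reconstruction (the original formula was an image), a linear-chain CRF has the form:

```latex
p(\mathbf{y} \mid \mathbf{x})
  = \frac{1}{Z(\mathbf{x})} \prod_{t=1}^{T} \psi_t(y_{t-1}, y_t, \mathbf{x}),
\qquad
Z(\mathbf{x}) = \sum_{\mathbf{y}} \prod_{t=1}^{T} \psi_t(y_{t-1}, y_t, \mathbf{x})
```

Eliminating $y_1, \dots, y_T$ in order computes $Z(\mathbf{x})$ with the same $O(T k^2)$ forward recursion as in the HMM, only over unnormalized potentials.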

Sum-product inference

In general, we can view the task at hand as that of computing
the value of an expression of the form:

$$\sum_{\mathbf{Z}} \prod_{\phi \in \Phi} \phi$$

where $\Phi$ is a set of factors and $\mathbf{Z}$ is the set of variables to be summed out.
We call this task the sum-product inference task.

General idea:
Write the query in the form

$$P(X_1, \mathbf{e}) = \sum_{x_n} \cdots \sum_{x_3} \sum_{x_2} \prod_i P(x_i \mid \mathbf{pa}_i)$$

this suggests an "elimination order" of latent variables to be
marginalized


Iteratively:
- Move all irrelevant terms outside of the innermost sum
- Perform the innermost sum, producing a new intermediate factor
- Insert the new factor into the product

The elimination algorithm

PGM4_8
PGM4_9
PGM4_10
PGM4_11

An example:

PGM4_12
PGM4_13
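The iterative steps above can be sketched for discrete factor tables; the network, factor encoding, and elimination order below are invented for illustration:

```python
# Sketch of the elimination algorithm on binary discrete factors. A factor is
# a pair (variables_tuple, dict mapping value-tuples to numbers). The toy
# Bayesian network A -> B -> C below is made up.
import itertools

def multiply(f, g):
    fv, ft = f
    gv, gt = g
    vs = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for xs in itertools.product((0, 1), repeat=len(vs)):  # binary domains
        a = dict(zip(vs, xs))
        table[xs] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return vs, table

def sum_out(f, var):
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for xs, val in ft.items():
        key = tuple(x for v, x in zip(fv, xs) if v != var)
        table[key] = table.get(key, 0.0) + val
    return keep, table

def eliminate(factors, order):
    """Sum out variables in `order`; return the product of what remains."""
    factors = list(factors)
    for var in order:
        touching = [f for f in factors if var in f[0]]   # factors mentioning var
        rest = [f for f in factors if var not in f[0]]
        prod = touching[0]
        for f in touching[1:]:
            prod = multiply(prod, f)                     # form elimination clique
        rest.append(sum_out(prod, var))                  # new intermediate factor
        factors = rest
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# toy network A -> B -> C with binary variables
pA = (("A",), {(0,): 0.6, (1,): 0.4})
pBA = (("B", "A"), {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8})
pCB = (("C", "B"), {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5})

vars_, table = eliminate([pA, pBA, pCB], order=["A", "B"])  # marginal p(C)
print(vars_, table)
```

Eliminating A then B on the chain A → B → C reproduces the marginal p(C); each `sum_out` call collapses one elimination clique into a new intermediate factor.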

Understanding Variable Elimination

PGM4_14

Elimination Cliques

PGM4_15

Graph elimination and marginalization

PGM4_16

A Clique tree

PGM4_17

Complexity

PGM4_18
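The complexity figures above can be summarized in one formula: with a chosen elimination order, the cost is dominated by the largest elimination clique created along the way.

```latex
\text{time and space} \;=\; O\!\left(n \cdot k^{\,w^{*} + 1}\right)
```

Here $n$ is the number of variables, $k$ the largest domain size, and $w^{*}$ the induced width of the order (the size of the largest elimination clique minus one); finding an order that minimizes $w^{*}$ is NP-hard, which is why heuristics are used in practice.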
