Readings
Draft chapters from the following books will be handed out in class:
- Introduction to Probabilistic Graphical Models, Michael Jordan
- Bayesian Networks and Beyond, Daphne Koller and Nir Friedman
Extra copies are in the CS reading room.
If you take the last one (unstapled), please photocopy and return it.
These books are currently being revised for publication
and contain errors.
Please send me all your comments at the end of the semester
in one large text file and I will forward them to the authors.
Lecture 1 (13 Sep 04)
The following are optional background readings.
- Matlab tutorials
- Brief introduction to graphical models, Kevin Murphy, web page
- Chapter 2 of Koller & Friedman (Foundations).
- Chapter 2 of Jordan (Conditional independence and factorization).
- Chapter 13 of Russell & Norvig (Uncertainty), "AI: A Modern Approach", 2nd edition.
Lecture 2 (15 Sep 04)
- Chapter 3 of Koller & Friedman (The Bayesian network representation).
Read pp. 67--101.
Lecture 3 (20 Sep 04)
- Chapter 5 of Koller & Friedman (Undirected graphical models).
Lecture 4 (22 Sep 04)
- Chapter 4 of Koller & Friedman (Local probabilistic models).
Lecture 5 (27 Sep 04)
- Chapter 6 of Koller & Friedman (Inference with graphical models)
- Chapter 7 of Koller & Friedman (Exact inference: variable elimination)
Lecture 6 (29 Sep 04)
Lecture 7 (4 Oct 04)
Lecture 8 (6 Oct 04)
No handouts.
Lecture 9 (Mon 18 Oct)
- Chapter 12 of Koller & Friedman (Learning: introduction)
- Chapter 13 of Koller & Friedman (Parameter estimation)
Lecture 10 (Wed 20 Oct)
Lecture 11 (Mon 25 Oct)
Lecture 12 (Wed 27 Oct)
- Chapter 10 of Jordan (mixtures and conditional mixtures)
- Chapter 11 of Jordan (EM)
- Chapter 12 of Jordan (HMMs)
Lecture 13 (Mon 1 Nov)
No handouts.
Lecture 14 (Wed 3 Nov)
- Chapter 19 of Jordan (Features, maxent)
- Chapter 20 of Jordan (iterative scaling)
- Chapter 14 of Koller & Friedman (structure learning)
Lecture 15 (Mon 8 Nov)
- Optional: Brief introduction to Bayesian machine learning, Z. Ghahramani
- Optional: For a description of the debate between Bayesians and frequentists, see Chapter 37 of David MacKay's excellent textbook, Information Theory, Inference, and Learning Algorithms.
- Optional: From P Values to Bayesian Statistics, It's All in the Numbers, The Scientist, Apr. 26, 2004
- P values: what they are and what they are not, Mark Schervish, The American Statistician, 1996
- Reading list on Bayesian methods by Tom Griffiths
- Pathologies of orthodox statistics, Tom Minka, tech report, 2001
- Nuances of probability theory, Tom Minka, tech report, 2001
- Bayesian jokes
Lecture 16 (Wed 10 Nov)
Lecture 17 (Wed 17 Nov)
- Chapter 13 of Jordan (Multivariate Gaussian)
- Chapter 14 of Jordan (Factor analysis)
- Chapter 15 of Jordan (Kalman filtering and smoothing)
- A technique for painless derivation of Kalman filtering recursions, A. Cemgil, Tech Report, U. Nijmegen, 2001
- From Hidden Markov Models to Linear Dynamical Systems, MIT Media Lab Tech Report 531, Tom Minka, 1999
- A Unifying Review of Linear Gaussian Models, Sam Roweis & Zoubin Ghahramani, Neural Computation 11(2), 1999, pp. 305-345
- An introduction to factor graphs, H.-A. Loeliger, IEEE Signal Proc. Mag., Jan. 2004, pp. 28-41.
- Least squares and Kalman filtering on Forney graphs, H.-A. Loeliger, in Codes, Graphs, and Systems (a festschrift in honour of David Forney on the occasion of his 60th birthday), R.E. Blahut and R. Koetter, eds., Kluwer, 2002, pp. 113-135.
Lecture 19 (Wed 24 Nov)
Lecture 20 (Mon 29 Nov)
Applications
Molecular biology
Computer vision
Human vision
- Kersten, D., Mamassian, P., & Yuille, A. (2004). Object perception as Bayesian inference. Annual Review of Psychology, 55, 271-304.
- Kersten, D., & Yuille, A. (2003). Bayesian models of object perception. Current Opinion in Neurobiology, 13(2).
Legal reasoning
Popular press
Kevin Murphy
Last modified: Thu Sep 9 16:53:30 PDT 2004