MLRG/fall08

From ResearchWiki

CS7941: Inference in Graphical Models

Time: Fri 2:00-3:20pm (see schedule for each week)

Location: MEB 3105

Abstract

Graphical models (both Bayesian networks and Markov random fields) are a convenient formalism for encoding and reasoning about probability distributions. This seminar will focus initially on representational issues and then move to computational methods for inference in graphical models. Our interest will be primarily in approximate inference, since exact inference is intractable in most models. The focus will be on non-Bayesian approaches, though much of what we discuss applies to Bayesian problems as well.

Students with a working knowledge of probability are well suited for this seminar. Each student should expect to present at least one week during the semester.

The rough plan is to begin with classical approaches to graphical models, including exact inference via the junction tree algorithm. We will quickly move on to approximate methods, focusing mostly on message passing and linear programming approaches.

The structure of the semester will be roughly as follows:

  • Introduction to Bayes' nets and graphical models: semantics, variable elimination, the sum/max-product algorithm, and factor graphs (see the short code sketch below)
  • Computational complexity: junction trees, triangulations and tree-width, belief propagation
  • Exponential families: maximum entropy principle, cumulant generating functions, conjugate duality
  • Variational inference: exact inference on trees, mean-field and structured mean-field, inner bounds on the marginal polytope
  • Statistical physics: Bethe free energy, log-determinant relaxation and outer bounds on the marginal polytope
  • Linear programming methods: tree-reweighted message passing, more outer bounds

This is a lot of material. Things will be dropped or skimmed depending on time and interest levels. Students should only come if they are prepared to do the required reading before the seminar.
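
As a small taste of the first bullet above, here is a minimal sum-product sketch on a toy three-variable chain. The chain, potentials, and variable names are made up for illustration and are not taken from the readings; the point is only that a single-node marginal can be computed by passing two local messages, and that on a tree this matches brute-force enumeration exactly.

# Minimal sum-product (message passing) on a toy chain x1 - x2 - x3.
# The pairwise potentials below are illustrative assumptions, not from WJ.
import numpy as np

psi12 = np.array([[4.0, 1.0],
                  [1.0, 4.0]])   # psi12[x1, x2]: x1 and x2 prefer to agree
psi23 = np.array([[1.0, 2.0],
                  [2.0, 1.0]])   # psi23[x2, x3]: x2 and x3 prefer to disagree

# Message from x1 to x2: m_{1->2}(x2) = sum_{x1} psi12(x1, x2)
m_1_to_2 = psi12.sum(axis=0)
# Message from x3 to x2: m_{3->2}(x2) = sum_{x3} psi23(x2, x3)
m_3_to_2 = psi23.sum(axis=1)

# Belief at x2 is the product of incoming messages, normalized.
belief_x2 = m_1_to_2 * m_3_to_2
belief_x2 = belief_x2 / belief_x2.sum()

# Brute-force check: enumerate all 8 joint states of (x1, x2, x3).
joint = np.einsum('ab,bc->abc', psi12, psi23)   # psi12(x1,x2) * psi23(x2,x3)
marginal_x2 = joint.sum(axis=(0, 2))
marginal_x2 = marginal_x2 / marginal_x2.sum()

print(belief_x2)     # [0.5 0.5] for these symmetric potentials
print(marginal_x2)   # identical: sum-product is exact on trees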

Expected Student Involvement

Presentations will be given by pairs of students. The logistics will depend somewhat on the make-up of the participants, but the goal is to pair CS students with non-CS students in most cases. Sign-ups will happen quickly, so please make sure you're signed up for at least one week, and probably two. The expectations, with timing, are:

  • At least 3 days before a presentation, the presenters should meet with Hal for one hour to discuss the material and decide what to present.
  • By class time, everyone should have submitted at least 2 questions to the Wiki for discussion.
  • Everyone should actively participate in the discussions.

Participants

Participants: please add your name, position, department and (if relevant) a link to your web page or email address.

  • Suresh Venkatasubramanian (http://www.cs.utah.edu/~suresh), Assistant Professor, School of Computing
  • Siddharth Patwardhan (http://www.cs.utah.edu/~sidd), PhD Student, School of Computing
  • Ruihong Huang (ruihong.huang@utah.edu), PhD Student, School of Computing
  • John Moeller (john.moeller@utah.edu), Non-Matriculated Student, School of Computing

Schedule and Readings

Most of our initial readings will be from the technical report Graphical models, exponential families, and variational inference by Martin Wainwright and Mike Jordan. I know this tome is a bit daunting, but it's not as bad as it looks at first glance! We refer to this in the readings as WJ.

Date | Papers | Presenters

Introduction to Bayes' nets and graphical models
F 29 Aug | Introduction, semantics and applications (WJ, 1-2.4) | Hal
F 5 Sep | Factor graphs, sum-product and max-product (WJ, 2.5-2.5.1; KFL01, 1, 2, 5) | Piyush
F 12 Sep | Junction tree algorithm and triangulations (WJ, 2.5.2; HD96, 4) | Nathan

Variational Inference
F 19 Sep | Math camp: exponential families and conjugate duality (WJ, 3; see the refresher after this table) |
F 26 Sep | Variational principle, exact inference on trees (WJ, 4) |
F 3 Oct | Mean-field and structured mean-field (WJ, 5) |

Statistical Physics
F 10 Oct | Bethe free energy and sum-product (WJ, 6) |
F 24 Oct | Kikuchi free energy and generalized sum-product (WJ, 7) |

Convex Relaxations
F 31 Oct | Convex relaxations (WJ, 8) | Piyush
F 7 Nov | Semi-definite relaxations (WJ, 9) | Nathan
F 14 Nov | Mode computations (WJ, 10) |
F 21 Nov | Exact max-product and outer bounds (GJ07) |
F 5 Dec | Outer bounds on the marginal polytope (SJ07 and SMGJW08) |
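
For the 19 Sep math-camp session, here is a short refresher on the exponential-family facts that the later variational sessions build on. The notation loosely follows WJ, Chapter 3; this is only a summary sketch, not part of the assigned reading.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% An exponential family with sufficient statistics \phi and canonical
% parameters \theta (discrete state space, so the sum below is finite):
\[
  p_\theta(x) = \exp\bigl\{\langle \theta, \phi(x)\rangle - A(\theta)\bigr\},
  \qquad
  A(\theta) = \log \sum_x \exp\bigl\{\langle \theta, \phi(x)\rangle\bigr\}.
\]

% A is the cumulant generating function of \phi(X); its gradient is the
% vector of mean parameters:
\[
  \nabla A(\theta) = \mathbb{E}_\theta\bigl[\phi(X)\bigr] = \mu(\theta).
\]

% Conjugate duality gives the variational principle behind the later sessions:
\[
  A(\theta) = \sup_{\mu \in \mathcal{M}} \bigl\{ \langle \theta, \mu \rangle - A^*(\mu) \bigr\},
\]
% where \mathcal{M} is the set of realizable mean parameters (the marginal
% polytope for discrete models) and $A^*(\mu)$ is a negative entropy.
% Mean-field methods approximate \mathcal{M} from the inside; Bethe- and
% LP-based methods approximate it from the outside.

\end{document}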

Paper Summaries

29 Aug:

Blah.

Past Semesters
