MLRG/fall08/programming
The programming projects will be based on Yee Whye Teh's "EasyBP" framework; for more information, see his README file.
The code will be hosted on one of the CS machines via Subversion. If you're not familiar with Subversion, don't worry: it's quite easy to use.
Getting Started
To get started, you'll want to "check out" our local svn repository. To do this, run:
svn co svn://zook.cs.utah.edu/uusoc/facility/bigml/easybp
Here, "co" means "checkout." You'll only have to do this once.
This will create an "easybp" directory underneath whatever directory you're currently in. It will also download the latest version of all the code. If everything went okay, you should be able to read easybp/ezbp_v0.readme, which should contain the same readme information linked above.
How to use SVN, Briefly
The only commands you should really care about are:
- svn up
- svn ci
- svn add
The first, "svn up", will update your local repository. Basically, it will download all changed files and, in the off chance there are conflicts with your local versions, will try to do something intelligent.
The second, "svn ci", will commit your recent changes to the repository. Try not to spend too much time between updating and committing, just to avoid conflicts. When you commit, it will prompt you for a commit message. Please write something somewhat informative. If you don't like it popping up an editor, you can say "svn ci -m 'my commit message'" and it will use the string you specify.
The final, "svn add", adds new files to the repository. For instance, if you create a file called "EP.m", you will need to say "svn add EP.m". Note that this will not upload the file -- it will just add it to your local repository. You'll need to follow the "svn add" command with an "svn ci" command to commit the changes.
Things to Implement
Not everything here has to be implemented; these are just some ideas. They tend to follow the readings. I've bolded things that I think we should not miss.
1. Sum-product and max-product on trees (week 2)
2. Junction tree algorithm (week 3)
3. Mean field and structured mean field for the Ising model (weeks 5 and 6)
4. Loopy sum-product (aka belief propagation) (week 7)
5. Generalized belief propagation (week 8)
6. Tree-reweighted sum-product (week 9)
7. Log-determinant relaxation (week 10)
8. Max-product linear programming (week 11)
9. Cutting-plane linear programming (week 12)
10. Cluster-based linear programming (week 12)
11. Gibbs sampling (not covered; would be for fun)
12. Expectation propagation (not covered; would be for fun)
13. Survey propagation (not covered; would be for fun)
Some of these are more work than others. (1) will be a bit difficult just because we'll be new both to the topic and to the library. (2) is a bit tricky to get the details right, and Matlab isn't the most natural language for implementing graph algorithms. (3) should be fairly short and easy. (4), given success at (1), should be straightforward. (5) and (6), given (4), should both be straightforward. I'm not sure about the remainder, but given Matlab's built-in LP solvers, I don't think they'll be too bad.
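As a concrete reference point for (1), here is a rough sketch of two-pass sum-product on a small tree. It's written in Python rather than EasyBP's Matlab, and the function names, tree representation, and potentials are invented for illustration -- they don't correspond to EasyBP's API. A brute-force enumeration at the bottom checks the marginals.

```python
import itertools
import numpy as np

def sum_product_tree(n, edges, psi_node, psi_edge):
    """Exact marginals on a tree-structured pairwise MRF over binary variables.

    edges: (parent, child) pairs forming a tree rooted at node 0.
    psi_node[i]: length-2 array of node potentials.
    psi_edge[(p, c)]: 2x2 array indexed [x_p, x_c].
    """
    children = {i: [] for i in range(n)}
    par = {}
    for p, c in edges:
        children[p].append(c)
        par[c] = p

    up = {}  # up[i]: message node i sends its parent (a function of x_parent)

    def upward(i):
        # Post-order: children send their messages before i does.
        for c in children[i]:
            upward(c)
        if i == 0:
            return
        b = psi_node[i].copy()
        for c in children[i]:
            b *= up[c]
        up[i] = psi_edge[(par[i], i)] @ b  # marginalize out x_i

    upward(0)

    down = {0: np.ones(2)}  # down[i]: message node i receives from its parent

    def downward(i):
        for c in children[i]:
            # Everything at i except the message coming up from c.
            others = psi_node[i] * down[i]
            for s in children[i]:
                if s != c:
                    others = others * up[s]
            down[c] = psi_edge[(i, c)].T @ others  # marginalize out x_i
            downward(c)

    downward(0)

    marginals = []
    for i in range(n):
        b = psi_node[i] * down[i]
        for c in children[i]:
            b = b * up[c]
        marginals.append(b / b.sum())
    return marginals

def brute_force(n, edges, psi_node, psi_edge):
    """Marginals by summing over all 2^n configurations (for checking)."""
    marg = np.zeros((n, 2))
    for x in itertools.product([0, 1], repeat=n):
        w = 1.0
        for i in range(n):
            w *= psi_node[i][x[i]]
        for p, c in edges:
            w *= psi_edge[(p, c)][x[p], x[c]]
        for i in range(n):
            marg[i, x[i]] += w
    return marg / marg.sum(axis=1, keepdims=True)

# A small example tree: 0 - 1, with 2 and 3 hanging off node 1.
rng = np.random.default_rng(0)
n = 4
edges = [(0, 1), (1, 2), (1, 3)]
psi_node = [rng.uniform(0.1, 1.0, size=2) for _ in range(n)]
psi_edge = {e: rng.uniform(0.1, 1.0, size=(2, 2)) for e in edges}
marg_bp = sum_product_tree(n, edges, psi_node, psi_edge)
marg_exact = brute_force(n, edges, psi_node, psi_edge)
```

On a tree the two-pass schedule is exact, so the message-passing marginals should agree with the brute-force ones; the max-product variant just replaces the sums in the messages with maxes.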
Participants
(If you're involved in the programming, please copy your participant info from the main page to here.)
- Hal Daumé III, Assistant Professor, School of Computing
- Arvind Agarwal, PhD Student, School of Computing
- Avishek Saha, PhD Student, School of Computing
- Amit Goyal, PhD Student, School of Computing
- Seth Juarez, PhD Student, School of Computing