Algorithms Seminar/Fall11

CS 7936: Heuristic algorithm design strategies

Wed 1:25-2:45pm

MEB 3147 (LCR)

Synopsis

You've been there before. After the long and futile attempts at finding an exact algorithm. After the proof of NP-hardness. After the hours spent trying to get a decent approximation, and the realization that the problem is impossible to even approximate in polynomial time. You could just do brute-force search, but you might as well give up your identity as a computer scientist. The siren call of a black-box MATLAB library seems so soothing to your ears, but deep down you feel that THERE MUST BE A BETTER WAY.

If this sounds like you, then this seminar is the cure.

What we will cover are metaheuristics. These are algorithm design strategies that apply to a wide variety of (usually optimization) problems. While they usually don't have strong guarantees on their behavior, they tend to work very effectively in practice. Gradient descent is a simple example, as are simulated annealing and iteratively reweighted least squares. Some heuristics actually do have formal guarantees in specific situations, but are often used successfully outside that realm. Gradient descent is again an example of this, as is alternating optimization.
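
To make the simplest of these concrete, here is a minimal sketch of fixed-step gradient descent (Python/NumPy). The step size, tolerance, and the quadratic test function are illustrative choices only, and are not taken from any of the assigned readings.

 import numpy as np

 def gradient_descent(grad, x0, step=0.1, tol=1e-6, max_iters=10000):
     # Fixed-step gradient descent: repeatedly move against the gradient.
     x = np.asarray(x0, dtype=float)
     for _ in range(max_iters):
         g = grad(x)
         if np.linalg.norm(g) < tol:  # stop once the gradient is (nearly) zero
             break
         x = x - step * g
     return x

 # Example: minimize f(x) = ||x - c||^2, whose gradient is 2(x - c).
 c = np.array([1.0, -2.0])
 print(gradient_descent(lambda x: 2 * (x - c), x0=np.zeros(2)))  # roughly [1, -2]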

The goals of the seminar are to

  • identify when such heuristics might be applicable to a given problem,
  • understand the formal limits on what these heuristics can and cannot do, and
  • (if time permits) study some of the associated complexity theory of heuristic search: which classes of problems admit good heuristic strategies, and how these problems are related to each other.

What is expected of you:

  • read the prescribed papers
  • make at least one full presentation on one of the topics
  • participate in class discussions

You can take the seminar for anywhere between 1 and 3 credits: more credits means more presentations.

Participants

Schedule

Date | Topic | Outline and Paper(s) | Presenter

Smooth methods
Aug 24 | Background material | Convexity, nonconvexity, convergence, rates of convergence | Suresh Venkatasubramanian
Aug 31 | Smooth local search I: Simple descent | The source material is Chapter 9 of Convex Optimization, and there are slides for that material as well. Topics to be covered include the basic setup and definitions, descent algorithms including gradient descent and steepest descent, and the associated convergence analysis. We won't talk about self-concordance. | Amirali Abdullah
Sep 7 | Smooth local search II: Newton's method | Chapter 9 of Convex Optimization, with additional reading from the lecture slides on Nonlinear Optimization (starting at page 45/201) | John Moeller

Discrete Local Search: Basics
Sep 14 | Discrete Local Search: approximations for NP-hard problems | Chapter 2 of Williamson/Shmoys (starting at 2.3); specifically, scheduling, TSP and min-degree spanning trees. Another nice example of local search that gives a guarantee is the 5-approximation for the k-median by Arya et al. Since that link appears to be down occasionally, here's an explanation of the algorithm. | Ruihong
Sep 21 | Discrete Local Search: heuristic search strategies | Theoretical aspects of local search. Chapters 1.1 and 7 (neighborhood functions, basics of the neighborhood graph and connectivity, local improvement methods, tabu search and simulated annealing) | Samira Daruki

Coordinate Descent methods
Sep 28 | Coordinate descent methods: Basics | The basic framework, and applications to alternating optimization, matching pursuit and projection pursuit. Original work on the convergence of cyclic coordinate descent, and a nice modern work analyzing the behavior of cyclic methods. | Ruihong
Oct 5 | A case study in coordinate descent: Frank-Wolfe optimization | Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm; there's more at Ken Clarkson's webpage. Good notes on Clarkson's paper by Daniel Hsu. | Suresh Venkatasubramanian
Oct 19 | Alternating Optimization: k-means, ICP and others | What is alternating optimization? What is k-means? k-means convergence 1 and 2. What is ICP? Fast ICP. | Parasaran Raman
Oct 26 | Matching, Basis and Projection Pursuit | A good comparison of matching pursuit and basis pursuit, and an overview of projection pursuit. Source material: the Friedman and Tukey projection pursuit paper, the Mallat and Zhang matching pursuit paper, and the Chen et al. basis pursuit paper. | Samira Daruki

Learning from experts
Nov 2 | Multiplicative weight updates: theory | Survey by Arora, Hazan and Kale | Shobhit Gupta
Nov 9 | Multiplicative weight updates: examples | Survey by Arora, Hazan and Kale | Parasaran Raman

Miscellaneous
Nov 16 | Branch and bound | The best reference is chapter II.4.2 of Nemhauser and Wolsey. See also Brian Borchers' thesis and a nice review of branching methods for exact solutions. | John Moeller
Nov 23 | Iteratively reweighted least squares | The basic method and applications to computing Fermat-Weber points, as well as $\ell_1$-minimization. | Shobhit Gupta

Complexity Theory
Nov 30 | Complexity theory: PLS | The original paper, and David Johnson's review article | Amirali Abdullah
Dec 7 | Complexity theory: PPAD | David Johnson's review article |