http://www.cs.utah.edu/~suresh
suresh at cs utah edu
Ph: 801 581 8233
Room 3404, School of Computing
50 S. Central Campus Drive,
Salt Lake City, UT 84112.

WEB L112
Tue/Thu 3:40-5:00pm (Spring 2013)

Course Administration Page

The Geometry of Data

Computational geometry is the study of algorithms for geometric objects: points, curves, lines, surfaces, and so on.

But what makes computational geometry such a powerful discipline is where these points, lines, and surfaces come from. It's not just proteins, 3D models, or maps (although those are very important). It's virtually any problem involving the analysis of data, whether text, video, images, speech, or even movie recommendations.

Computational geometry is the structural underpinning of data mining: it gives data its “shape” and tells us how to manipulate it efficiently.

For more on this, read my short note, "Data, Dimensions and Geometry".

Topics

The course will cover a variety of topics in computational geometry, with an emphasis on constructs that relate to data analysis. These include:

  • Foundations: Convex hulls, Voronoi diagrams, Delaunay triangulations, arrangements, duality, (orthogonal) range searching
  • Randomization: $\epsilon$-nets/samples, VC dimension, cuttings
  • Optimization: linear programming and beyond
  • Approximations: grids, near neighbor searching, core sets, projections, reweighting techniques (MWU)
  • The Geometry of Learning: manifolds, kernels, Bregman divergences

Prerequisites

This course will assume competence in the material covered in CS 6150 (Advanced Algorithms): facility with algorithm analysis techniques, randomness and probability, approximation algorithms, and basic complexity theory. No programming skills will be required.

Textbooks

The course will draw on the texts cited in the lecture list below.

I will provide additional material as needed.

Lectures

Core Material

  1. Convexity and convex hulls (a small hull-construction sketch appears after this list).
  2. Arrangements
  3. Voronoi Diagrams
  4. Delaunay Triangulations (some notes from David Mount)
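
To make the hull material concrete, here is a minimal sketch of Andrew's monotone chain algorithm, one standard $O(n \log n)$ hull construction. This is illustrative code, not material from the course itself:

```python
# A minimal sketch of Andrew's monotone chain algorithm for 2D convex hulls.
# Illustrative code, not from the course materials.

def cross(o, a, b):
    """Cross product of vectors OA and OB; > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                       # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # drop the duplicated endpoints

print(convex_hull([(0, 0), (1, 1), (2, 2), (2, 0), (0, 2), (1, 0.5)]))
# -> [(0, 0), (2, 0), (2, 2), (0, 2)]
```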

Randomness

  1. $\epsilon$-nets, $\epsilon$-samples, VC dimension (Sariel’s notes, Chapter 5)
  2. VC dimension continued: how do we compute the VC dimension of a space?
  3. More on VC dimension (shatter dimension, dual range spaces) and some PAC learning.
    1. Three lectures on PAC learning (the first defines PAC learning, the second is really a recap of VC dimension, and the third outlines an argument for the $\epsilon$-net construction).
  4. Reweighting: the multiplicative-weight-update (MWU) method (Sariel’s book, Sections 6.3 and 6.4, and my blog post); a minimal sketch follows this list.
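
Here is a minimal sketch of the MWU rule in the classic learning-from-experts setting. The function name, the step size $\eta$, and the toy losses are illustrative assumptions, not course-supplied code:

```python
import random

# A minimal sketch of the multiplicative-weight-update (MWU) rule in the
# learning-from-experts setting. Names and toy losses are illustrative.

def mwu_weights(losses, eta=0.1):
    """losses: one list per round, each with per-expert losses in [0, 1].
    Returns the final normalized weight vector over the experts."""
    n = len(losses[0])
    w = [1.0] * n
    for round_losses in losses:
        # Multiplicatively penalize each expert in proportion to its loss.
        w = [wi * (1.0 - eta) ** li for wi, li in zip(w, round_losses)]
    total = sum(w)
    return [wi / total for wi in w]

# Expert 0 is always right (loss 0), expert 1 is always wrong (loss 1),
# and expert 2 guesses at random.
rounds = [[0, 1, random.randint(0, 1)] for _ in range(100)]
print(mwu_weights(rounds))  # nearly all weight ends up on expert 0
```

The point of the reweighting view is that the same update drives very different applications: experts, boosting, and the geometric uses in Sariel's Chapter 6.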

Optimization

  1. Low-dimensional linear programming: Seidel’s randomized incremental algorithm (Section 9.2); see the sketch after this list.
  2. High-dimensional linear programming: the simplex algorithm and the ellipsoid method.
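
Here is a hedged sketch of Seidel's randomized incremental algorithm in 2D (maximize $c \cdot x$ subject to halfplanes $a \cdot x \le b$). For simplicity it adds a large bounding box so the LP is bounded, and it does not handle degenerate inputs; all names and tolerances are illustrative:

```python
import random

# A simplified sketch of Seidel's randomized incremental 2D LP.
# Assumes a bounded LP (enforced via a large bounding box), a nonzero
# objective, and no degenerate (parallel, infeasible) configurations.

def solve_1d(c, line, halfplanes):
    """Maximize c.x on the boundary line {x : a.x = b} of `line`,
    subject to the other halfplanes."""
    (a1, a2), b = line
    d = (-a2, a1)                                # direction along the line
    nn = a1 * a1 + a2 * a2
    p = (a1 * b / nn, a2 * b / nn)               # a point on the line
    lo, hi = -1e18, 1e18
    for (h1, h2), hb in halfplanes:
        denom = h1 * d[0] + h2 * d[1]            # h . d
        num = hb - (h1 * p[0] + h2 * p[1])       # slack at p
        if abs(denom) < 1e-12:
            continue                             # parallel: assumed feasible
        t = num / denom
        if denom > 0:
            hi = min(hi, t)
        else:
            lo = max(lo, t)
    # Move to the interval endpoint with the larger objective value.
    t = hi if c[0] * d[0] + c[1] * d[1] > 0 else lo
    return (p[0] + t * d[0], p[1] + t * d[1])

def seidel_lp(c, halfplanes, box=1e6):
    """Randomized incremental 2D LP. halfplanes: list of ((a1, a2), b)."""
    hs = [((1, 0), box), ((-1, 0), box), ((0, 1), box), ((0, -1), box)]
    v = (box if c[0] > 0 else -box, box if c[1] > 0 else -box)
    cons = list(halfplanes)
    random.shuffle(cons)                         # the source of the O(n) bound
    for (a1, a2), b in cons:
        if a1 * v[0] + a2 * v[1] <= b + 1e-9:
            hs.append(((a1, a2), b))
            continue                             # optimum unchanged
        # Otherwise the new optimum lies on this constraint's boundary.
        v = solve_1d(c, ((a1, a2), b), hs)
        hs.append(((a1, a2), b))
    return v

# maximize x + y s.t. x + 2y <= 4, 3x + y <= 6  ->  optimum (1.6, 1.2)
print(seidel_lp((1, 1), [((1, 2), 4), ((3, 1), 6)]))
```

The expected linear running time comes from the random insertion order: the optimum changes at step $i$ only if the new constraint is one of the at most two constraints defining it, which happens with probability $O(1/i)$.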

Approximations

  1. Grid-based approximations (Sariel’s notes, Chapter 1)
  2. Continuation of grid-based algorithms (minimum enclosing ball), and quad trees (Chapter 2).
  3. Compressed quad trees and well-separated pair decompositions (Chapters 2 and 3)
  4. WSPDs continued; range trees and kd-trees.
  5. Near-neighbor searching (Chapter 11 from Sariel’s online notes, Chapter 17 in his book)
  6. Near-neighbor searching in high dimensions (Chapter 20 from the book)
  7. (Guest lecture by Jeff Phillips) Projections: the JL lemma and friends (see the random-projection sketch after this list).
  8. Core sets and extents.
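
As a taste of the projections lecture, here is a minimal sketch of the Johnson-Lindenstrauss construction via a scaled Gaussian matrix. The constant 8 in the target dimension is illustrative (sharper constants exist), and the data is a random toy set:

```python
import numpy as np
from itertools import combinations

# A minimal sketch of the JL lemma: projecting n points from R^d down to
# k = O(log n / eps^2) dimensions with a scaled Gaussian matrix preserves
# all pairwise distances to within (1 +/- eps), with high probability.

rng = np.random.default_rng(0)
n, d, eps = 50, 1000, 0.25
k = int(np.ceil(8 * np.log(n) / eps**2))      # constant 8 is illustrative

X = rng.standard_normal((n, d))               # toy data: n points in R^d
G = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection matrix
Y = X @ G                                     # projected points in R^k

# Worst-case distortion over all pairs; should lie in [1 - eps, 1 + eps].
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
print(k, min(ratios), max(ratios))
```

Note that $G$ is oblivious to the data: the same random matrix works for any point set of this size, which is what makes the lemma so useful for streaming and nearest-neighbor applications.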

The Geometry of Learning

  1. Kernels (Hal Daumé’s excellent review); a small kernel sketch follows this list.
  2. SVMs
  3. Bregman divergences
  4. The geometry of graphs (spectral methods)
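
To illustrate the kernel trick, here is a small sketch that builds the Gram matrix of the Gaussian (RBF) kernel; any algorithm phrased purely in terms of inner products (SVMs, PCA, ...) can then operate in the implicit feature space. The bandwidth sigma and the toy data are illustrative:

```python
import numpy as np

# A minimal sketch of the kernel trick with the Gaussian (RBF) kernel
# k(x, y) = exp(-||x - y||^2 / (2 sigma^2)). Kernel values play the role
# of inner products in an implicit feature space, so the feature map
# phi is never built explicitly. Names here are illustrative.

def rbf_gram(X, sigma=1.0):
    """Gram matrix K with K[i, j] = k(X[i], X[j])."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)  # pairwise sq. dists
    return np.exp(-d2 / (2.0 * sigma**2))

X = np.random.default_rng(1).standard_normal((5, 3))  # toy data
K = rbf_gram(X)
print(K.shape, np.allclose(K, K.T))  # symmetric (and PSD), as required
# Feature-space squared distance, computed purely from kernel values:
# ||phi(x_i) - phi(x_j)||^2 = K[i, i] + K[j, j] - 2 * K[i, j]
```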

Note: While this is the order in which I will try to cover the topics, they may not map one-to-one onto lectures. Some topics might span multiple lectures, and others might be compressed to fit into one.


Grading

Grading will be based solely on homework (5-6 assignments). There will be no exams or projects.