# Topics in the study of Markov chains

**Time**: Mondays and Thursdays, 1600-1730.

**Place**: Online lectures. YouTube playlist. Student presentations in A-201/A-237.

## Student presentation topics

Basics of Hamiltonian Monte Carlo.

Localization schemes for mixing time proofs.

## Lecture notes/transcripts

- **2022-03-07**. *Some other sampling primitives: rejection and importance sampling. Sampling solutions to \(k\)-DNF formulas.* [Transcript]
- **2022-03-03**. *Monomer–dimer chain: lower bound on the Poincaré constant (sketch).* [Transcript, including material from the previous class]
- **2022-02-28**. *Monomer–dimer chain.* [Transcript, including material from the next class]
- **2022-02-24**. *Multi-commodity flows and Poincaré inequalities. Method for bounding congestion of flows. Examples.* [Transcript]
- **2022-02-21**. *Eigenvalue spectrum of symmetric and reversible chains (contd.). Poincaré inequalities.* [Transcript, including material from the previous class]
- **2022-02-17**. *Gibbs sampling.* [Transcript, including material from the previous lecture] *Eigenvalue spectrum of symmetric and reversible chains.* [Transcript]
- **2022-02-14**. *Sampling random proper colourings (contd.). Metropolis-Hastings rule.* [Transcript, including material from the previous lecture, and partly from the following lecture]
- **2022-02-10**. *Path coupling. Reversible chains. Random proper colourings.* [Transcript, updated with material from the following lecture]
- **2022-02-07**. *Couplings and mixing times. Using contraction in distance. Mixing time of the random walk on the hypercube.* [Transcript]
- **2022-02-03**. *Couplings.* [Transcript]
- **2022-01-31**. *Proof of the “Fundamental Theorem”.* [Transcript]
  - See also the webpage for the previous offering of the course for `julia` programs exploring convergence of some simple Markov chains.
- **2022-01-27**. *Introduction.* [Transcript]

## General information

This is a two-credit course. The initial half (about 20 hours) is a regular lecture course, comprising an introduction to Markov chains. The second half will be a “reading course”, comprising student presentations that explore various current topics of interest in the subject. The introductory lectures will be a selection from the following topics (since this is a reading course, the list may change based on how the course progresses):

The fundamental theorem of Markov chains. Canonical examples of Markov chains: Glauber dynamics and the Metropolis sampler, random walks on graphs (especially the Boolean hypercube), etc.
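As a concrete warm-up for these examples, here is a minimal Python sketch (the code and all names are illustrative, not part of the course materials) of the lazy random walk on the Boolean hypercube \(\{0,1\}^n\); by the fundamental theorem, the distribution of the walk's position converges to the uniform (stationary) distribution:

```python
import random
from collections import Counter

def lazy_hypercube_step(x, rng):
    """One step of the lazy random walk on {0,1}^n:
    with probability 1/2 stay put, else flip a uniformly random coordinate."""
    if rng.random() < 0.5:
        return x
    i = rng.randrange(len(x))
    return x[:i] + (1 - x[i],) + x[i + 1:]

def run_walk(n, steps, rng):
    """Run the walk from the all-zeros vertex and return the endpoint."""
    x = (0,) * n
    for _ in range(steps):
        x = lazy_hypercube_step(x, rng)
    return x

# Empirically, after enough steps the endpoint is close to uniform
# over the 2^n vertices: all 8 vertices of {0,1}^3 should appear
# with roughly equal frequency.
rng = random.Random(0)
samples = Counter(run_walk(3, 100, rng) for _ in range(4000))
```

Here the chain is tiny, so 100 steps is far beyond its mixing time; the point is only to make the convergence in the fundamental theorem visible.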

Coupling and associated methods for bounding mixing times: examples from hypercube random walks and Glauber dynamics.
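A minimal sketch of the classical coupling argument for the hypercube walk (illustrative code, not from the course): both copies update the same coordinate with the same fresh bit, so a coordinate agrees forever once it has been refreshed, and the chains coincide after a coupon-collector number of updates, giving the \(O(n \log n)\) mixing-time bound via the coupling lemma:

```python
import random

def coupled_step(x, y, rng):
    """Coupled update for two lazy hypercube walks: choose the same coordinate i
    and the same fresh bit b for both copies. Setting a uniform coordinate to a
    uniform bit is exactly the lazy walk (with probability 1/2 the bit is unchanged)."""
    i = rng.randrange(len(x))
    b = rng.randrange(2)
    x, y = list(x), list(y)
    x[i] = b
    y[i] = b
    return tuple(x), tuple(y)

def coupling_time(n, rng):
    """Steps until the coupled walks, started at opposite corners, coincide."""
    x, y = (0,) * n, (1,) * n
    t = 0
    while x != y:
        x, y = coupled_step(x, y, rng)
        t += 1
    return t

# The expected coupling time is the coupon-collector quantity
# n * H_n = Theta(n log n); for n = 16 this is about 54 steps.
rng = random.Random(1)
avg = sum(coupling_time(16, rng) for _ in range(200)) / 200
```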

Spectral methods and combinatorial methods for bounding the spectral gap. Examples: the ferromagnetic Ising model, matchings in general graphs.
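For small chains the spectral gap can simply be computed numerically; the sketch below (assuming NumPy is available; the code is ours, not from the course) builds the transition matrix of the lazy hypercube walk, whose gap is known to be exactly \(1/n\):

```python
import itertools
import numpy as np

def lazy_hypercube_matrix(n):
    """Transition matrix of the lazy random walk on {0,1}^n
    (hold with probability 1/2, else flip a uniformly random coordinate)."""
    verts = list(itertools.product([0, 1], repeat=n))
    idx = {v: j for j, v in enumerate(verts)}
    P = np.zeros((2 ** n, 2 ** n))
    for v in verts:
        P[idx[v], idx[v]] += 0.5            # holding probability
        for i in range(n):
            w = v[:i] + (1 - v[i],) + v[i + 1:]
            P[idx[v], idx[w]] += 0.5 / n    # flip coordinate i
    return P

# The matrix is symmetric, so its spectrum is real; the spectral gap
# 1 - lambda_2 controls the mixing time. For this chain the gap is 1/n.
P = lazy_hypercube_matrix(3)
eigs = np.sort(np.linalg.eigvalsh(P))[::-1]   # eigenvalues, descending
gap = 1.0 - eigs[1]                           # equals 1/3 for n = 3
```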

Recent progress using high-dimensional expanders, and its connections with the geometry of polynomials.

An introduction to Markov chains on continuous state spaces. Sampling from convex bodies.
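For intuition, here is a minimal sketch of the ball walk on a convex body accessed only through a membership oracle (illustrative Python; the unit square stands in for the body, and the step radius `delta` is an arbitrary choice):

```python
import math
import random

def uniform_in_ball(center, delta, rng):
    """A uniformly random point in the Euclidean ball of radius delta
    around center: Gaussian direction, radius delta * U^(1/d)."""
    d = len(center)
    g = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(v * v for v in g))
    r = delta * rng.random() ** (1.0 / d)
    return [c + r * v / norm for c, v in zip(center, g)]

def ball_walk(x, steps, delta, in_body, rng):
    """Ball walk: propose a uniform point in the delta-ball around x,
    move there if the membership oracle accepts it, else stay put."""
    for _ in range(steps):
        y = uniform_in_ball(x, delta, rng)
        if in_body(y):
            x = y
    return x

# Example body: the unit square [0,1]^2, given purely as an oracle.
rng = random.Random(2)
cube = lambda p: all(0.0 <= c <= 1.0 for c in p)
pt = ball_walk([0.5, 0.5], 500, 0.2, cube, rng)
```

Note that the walk only ever queries membership, which is the access model under which sampling from convex bodies is usually studied.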

Aside from this, I also plan to cover a selection (based on interest) from the following list of tentative topics.

Coupling and algorithms for “perfect sampling”: the Propp-Wilson algorithm (including results on perfect sampling of colourings by TIFR students!).
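A minimal sketch of the Propp-Wilson algorithm (coupling from the past) for a small monotone chain; the toy chain here, a reflecting walk on \(\{0,\dots,n\}\) with uniform stationary distribution, is our own illustration, not from the course:

```python
import random
from collections import Counter

def update(x, u, n):
    """Monotone random-function update for a reflecting walk on {0,...,n}:
    move down if u < 1/2, up otherwise, holding at the boundary."""
    return max(x - 1, 0) if u < 0.5 else min(x + 1, n)

def cftp(n, rng):
    """Coupling from the past: run the top (n) and bottom (0) chains from
    increasingly distant starting times, REUSING the same randomness for
    each past step, until they coalesce at time 0. By monotonicity all
    intermediate states are sandwiched, so the common value at time 0 is
    an exact sample from the stationary (here uniform) distribution."""
    T = 1
    us = []  # us[t-1] is the randomness used for the step at time -t
    while True:
        while len(us) < T:
            us.append(rng.random())
        lo, hi = 0, n
        for t in range(T, 0, -1):      # from time -T forward to time 0
            lo = update(lo, us[t - 1], n)
            hi = update(hi, us[t - 1], n)
        if lo == hi:
            return lo
        T *= 2                          # go further into the past and retry

# Exact samples: uniform on {0,...,4}, with no mixing-time estimate needed.
rng = random.Random(3)
samples = Counter(cftp(4, rng) for _ in range(2000))
```

The crucial detail is that the randomness for each fixed past time is drawn once and reused; redrawing it on every restart would bias the output.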

Sharper mixing time bounds from log-Sobolev inequalities and hypercontractivity. Connections with concentration of measure.
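One standard form of this improvement (stated here only as orientation, with constants depending on normalization conventions) defines the log-Sobolev constant of a reversible chain with stationary distribution \(\pi\) and Dirichlet form \(\mathcal{E}\), and uses it in place of the Poincaré constant:

```latex
% Entropy of f^2 under the stationary distribution \pi:
\operatorname{Ent}_\pi(f^2)
  = \mathbb{E}_\pi\!\left[ f^2 \log \frac{f^2}{\mathbb{E}_\pi[f^2]} \right].
% The log-Sobolev constant:
\alpha = \inf_{f \,:\, \operatorname{Ent}_\pi(f^2) \neq 0}
         \frac{\mathcal{E}(f, f)}{\operatorname{Ent}_\pi(f^2)}.
% It yields mixing-time bounds of the shape
\tau_{\mathrm{mix}}(\varepsilon)
  = O\!\left( \frac{1}{\alpha}
      \left( \log\log \frac{1}{\pi_{\min}} + \log \frac{1}{\varepsilon} \right) \right),
% improving on the Poincare bound, which has \log(1/\pi_{\min})
% in place of \log\log(1/\pi_{\min}).
```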

Local algorithms: the evolving set Markov chain and its use in finding local sparse cuts.

Concentration inequalities for Markov chains.

### Prerequisites

In addition to undergraduate algorithms and discrete mathematics, the only other prerequisites are some familiarity with linear algebra (the notions of eigenvalues, eigenvectors and eigenvalue decompositions) and analysis (notions of convergence, norms, Hölder’s inequality, etc.).

### References

For the initial part of the course, the book *Markov chains and mixing times* by Levin, Peres and Wilmer, and the monograph *Lectures on finite Markov chains* by Saloff-Coste will be useful. We will refer to original papers as needed.

### Evaluation

Evaluation will be based on 3 homework assignments (70% of the final grade) and a final exam (30% of the final grade). The total number of meeting hours is expected to be about 30. However, since this is partly a reading course, participating students may have to prepare and give a few lectures towards the end of the course, and in general will also have to prepare in advance for the material to be discussed in each meeting.

### Other courses

- Constantinos Daskalakis’s course
- Shayan Oveis Gharan’s course
- Dana Randall’s course - See this course especially for many interesting statistical physics connections that we will not get to in our course.
- Alistair Sinclair’s course
- Eric Vigoda’s course

The webpage for a previous offering of this course is available here. The initial few lectures will be virtually identical, but I expect to cover a different selection of the more “advanced” material this time.