# Numerical Algorithms

Schedule: Mondays and Wednesdays, 9:30–11:00, room A-201.

## Lectures

• 2018-04-16. Reducing linear programming to a sequence of polyhedral cone feasibility problems. Polynomial (arithmetic) complexity of linear programming via interior point methods.

• 2018-04-11. Interior point methods for solving linear programs: finding the optimal basis from an approximate solution.

• 2018-04-09. Ill-posed linear programs. Connections with degeneracy.

• 2018-04-08 (make-up class for 2018-04-04). Degeneracy and optimal bases. Heaviness and degeneracy.

• 2018-04-02. Existence of optimal bases.

• 2018-03-28. Preliminary results on polyhedra, faces and vertices. Basic solutions of linear programs.

• 2018-03-26. Polyhedral cone feasibility and the interior point method: analysis in terms of the condition number.

• 2018-03-21. Solving the polyhedral cone feasibility problem with the primal-dual interior point method.

• 2018-03-19. Analysis of the primal-dual interior point update.

• 2018-03-14. The primal-dual interior point method. Duality gap as a potential. Central neighborhood.

• 2018-03-12. Complementary slackness. Examples of LP duality: the min-max theorem.

• 2018-03-07. Duality in linear programming.
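
As a toy numerical check of weak duality (a Python sketch; the two-variable LP instance below is made up for illustration and not from the lectures):

```python
import numpy as np

# Tiny LP: min c^T x s.t. A x = b, x >= 0, with dual max b^T y s.t. A^T y <= c.
# Weak duality: for any primal-feasible x and dual-feasible y, b^T y <= c^T x;
# at optimality the duality gap c^T x - b^T y closes to zero.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])

x = np.array([1.0, 0.0])   # primal feasible (and optimal for this instance)
y = np.array([1.0])        # dual feasible: A^T y = [1, 1] <= [1, 2] = c
gap = c @ x - b @ y        # duality gap, zero at optimality
```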

• 2018-03-05. The Ellipsoid algorithm (continued).

• 2018-02-28. Separating hyperplane theorem. Dual cones and Farkas' lemma, leading to the conclusion of the condition number bound. The Ellipsoid algorithm.

• 2018-02-26. Condition number for systems of linear inequalities with integral coefficients.

• 2018-02-21. The rescaled perceptron (continued). Complexity analysis, and review of integration on surfaces.

• 2018-02-19. The rescaled perceptron algorithm, towards solving systems of linear inequalities in polynomial time.

• 2018-02-14. Numerical algorithms for decision problems: polyhedral cone feasibility and systems of linear inequalities. The perceptron algorithm.
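
A bare-bones sketch of the perceptron for strict cone feasibility (find $$y$$ with $$Ay > 0$$), in Python with NumPy rather than the course's Julia notebooks; the function name and interface are made up for illustration:

```python
import numpy as np

def perceptron(A, max_iter=10000):
    """Seek y with A @ y > 0 strictly.  Classic update: add a normalized
    violated row to the iterate.  When a solution with margin rho exists,
    the method terminates within O(1/rho^2) updates."""
    rows = A / np.linalg.norm(A, axis=1, keepdims=True)  # unit-norm rows
    y = np.zeros(A.shape[1])
    for _ in range(max_iter):
        violated = np.flatnonzero(rows @ y <= 0)  # rows not strictly satisfied
        if violated.size == 0:
            return y                              # strictly feasible point
        y += rows[violated[0]]                    # perceptron update
    return None                                   # gave up (no margin found)
```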

• 2018-02-12. Conjugate gradients: error bounds. Numerical pitfalls with finite-precision implementations of conjugate gradients.

• See julia/jupyter notebook (HTML Output) for a comparison of conjugate gradients with steepest descent, and for numerical problems that can arise when implementing conjugate gradients in fixed precision, even on well-conditioned inputs, along with a possible solution.

• 2018-02-07. Conjugate gradients: basic notions, optimization over Krylov subspaces.
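
A plain conjugate gradient sketch, in Python with NumPy rather than the course's Julia notebooks (illustrative only; names and tolerances are made up):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """CG for symmetric positive definite A: the k-th iterate minimizes the
    A-norm of the error over the k-th Krylov subspace, so in exact
    arithmetic it terminates within n = len(b) iterations."""
    x = x0.astype(float)
    r = b - A @ x          # residual
    p = r.copy()           # first search direction
    rr = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rr / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p      # A-conjugate next direction
        rr = rr_new
    return x
```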

• 2018-02-05. Analysis of steepest descent, Krylov subspaces.

• 2018-01-24. Solving $$Ax = b$$ for positive definite $$A$$: steepest descent.
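
A minimal Python sketch of steepest descent with exact line search for $$Ax = b$$ with positive definite $$A$$ (illustrative, not the course's Julia code; names and tolerances are made up):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A
    by stepping along the residual r = b - A x (the negative gradient),
    with the exact step length alpha = (r^T r) / (r^T A r)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        r = b - A @ x               # residual = negative gradient
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        alpha = rr / (r @ (A @ r))  # exact line search
        x = x + alpha * r
    return x
```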

• 2018-01-22. Condition number of matrix inversion, condition number of linear system solving.
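
A quick numerical illustration in Python (the diagonal matrix below is a made-up example): the condition number $$\kappa(A) = \|A\|\,\|A^{-1}\|$$ bounds the amplification of relative errors when solving $$Ax = b$$.

```python
import numpy as np

# For a diagonal matrix the spectral condition number is the ratio of the
# largest to the smallest singular value, here 1 / 1e-6 = 1e6: a relative
# perturbation of size eps in b can change x by roughly kappa * eps.
A = np.array([[1.0, 0.0],
              [0.0, 1e-6]])
kappa = np.linalg.cond(A)   # spectral (2-norm) condition number
```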

• 2018-01-17. Forward and backward stability. Backward stability of pairwise summation.
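
A sketch of pairwise summation in Python (the course's notebooks are in Julia): splitting the array recursively keeps the worst-case rounding-error growth at $$O(\log n)$$ rather than the $$O(n)$$ of left-to-right summation, which is the heart of its backward stability bound.

```python
def pairwise_sum(x):
    """Sum a list by recursively splitting it in half and adding the two
    partial sums; the depth of the recursion tree, log2(n), bounds how many
    roundings any single summand passes through."""
    n = len(x)
    if n <= 2:
        return sum(x)
    m = n // 2
    return pairwise_sum(x[:m]) + pairwise_sum(x[m:])
```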

• 2018-01-15. Simple examples of gradient descent and Newton’s method. Floating point arithmetic. Condition number.
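
A minimal Python illustration of both iterations on one-dimensional examples (the test functions, step size, and names below are made up for illustration, not taken from the lecture):

```python
def newton_sqrt(a, x0=1.0, tol=1e-12, max_iter=50):
    """Newton's method on f(x) = x^2 - a: the update x <- x - f(x)/f'(x)
    simplifies to (x + a/x)/2, and converges quadratically near sqrt(a)."""
    x = x0
    for _ in range(max_iter):
        x_new = 0.5 * (x + a / x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def gradient_descent(grad, x0, step, n_iter):
    """Fixed-step gradient descent x <- x - step * grad(x) on a 1-D function;
    converges only linearly, in contrast with Newton's method above."""
    x = x0
    for _ in range(n_iter):
        x -= step * grad(x)
    return x
```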

• Code, HTML Output. To run the code, you need to install julia and jupyter. Once you have both up and running, run the command

    jupyter notebook

in the directory in which you decompressed the code files, and jupyter will open a tab in your browser where you can run the code interactively.

## Assignments

• Homework 3. Due May 2. Last updated version hash: 6bfc9da, April 30.

• Homework 2. Due April 1. Last updated version hash: 2432387, March 12.

• Homework 1. Due February 10, 2018. Last updated version hash: 70698ce, January 28.

## References

1. Condition: The geometry of numerical algorithms. Peter Bürgisser and Felipe Cucker.

2. Lectures on Modern Convex Optimization. Aharon Ben-Tal and Arkadi Nemirovski.

3. $$Lx = b$$. Nisheeth Vishnoi.