Statistics and optimization in high dimensions
The class will be taught in English.
This page contains supporting materials, exercises and practical sessions. Practical sessions are given in the form of Python notebooks which can be run locally with Jupyter or online using Google Colab.
Feedback form
Students who took the course are invited to fill in the following feedback form.
Class sessions
Session 1: Introduction, sub-gaussian random variables
Session 2: Linear regression
Session 3: Penalized linear regression, compressed sensing
Session 4: Computation, complexity, conic hierarchies
Session 5: First-order methods
Session 6: Stochastic algorithms
Stochastic approximation, algorithms for large sums, convergence rates.
Block decomposition method, convergence rates for random blocks.
Supporting material
Slides
Consider the Lasso problem from the previous practical session and try stochastic proximal gradient and block proximal gradient. Rescale the iteration counter so that each method performs roughly the same number of basic vector operations per iteration. Compare with batch composite optimization methods.
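As a starting point for this exercise, the sketch below implements a stochastic proximal gradient method and a randomized block proximal gradient method for a Lasso objective of the form (1/(2n))||Ax - b||^2 + lam ||x||_1. It is an illustrative sketch, not the notebook's reference solution: the function names, the 1/(2n) scaling, the decreasing step size and the number of blocks are assumptions made for the example. One "epoch" is rescaled so that both methods perform roughly the same number of basic vector operations.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_objective(A, b, x, lam):
    n = A.shape[0]
    return 0.5 / n * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

def stochastic_prox_grad(A, b, lam, n_epochs=50, step0=1.0, seed=0):
    # One sampled row per inner iteration; n inner iterations = one epoch.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    history = [lasso_objective(A, b, x, lam)]
    k = 0
    for _ in range(n_epochs):
        for _ in range(n):
            i = rng.integers(n)
            k += 1
            step = step0 / np.sqrt(k)            # decreasing step size
            grad_i = (A[i] @ x - b[i]) * A[i]    # unbiased gradient of the smooth part
            x = soft_threshold(x - step * grad_i, step * lam)
        history.append(lasso_objective(A, b, x, lam))
    return x, history

def block_prox_grad(A, b, lam, n_blocks=10, n_epochs=50, seed=0):
    # One random coordinate block per inner iteration; n_blocks inner
    # iterations = one epoch (roughly one full pass over the coordinates).
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    blocks = np.array_split(np.arange(d), n_blocks)
    residual = A @ x - b
    history = [lasso_objective(A, b, x, lam)]
    for _ in range(n_epochs):
        for _ in range(n_blocks):
            j = rng.integers(n_blocks)
            idx = blocks[j]
            A_j = A[:, idx]
            L_j = np.linalg.norm(A_j, 2) ** 2 / n       # block Lipschitz constant
            grad_j = A_j.T @ residual / n               # block gradient of the smooth part
            x_new_j = soft_threshold(x[idx] - grad_j / L_j, lam / L_j)
            residual += A_j @ (x_new_j - x[idx])        # keep residual A x - b up to date
            x[idx] = x_new_j
        history.append(lasso_objective(A, b, x, lam))
    return x, history

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 500
    A = rng.standard_normal((n, d))
    x_true = np.zeros(d)
    x_true[:10] = 1.0
    b = A @ x_true + 0.1 * rng.standard_normal(n)
    lam = 0.1
    _, hist_sgd = stochastic_prox_grad(A, b, lam)
    _, hist_block = block_prox_grad(A, b, lam)
    print("stochastic prox-grad final objective:", hist_sgd[-1])
    print("block prox-grad final objective:", hist_block[-1])

With these rescaled epochs, plotting both objective histories against the epoch index gives a rough comparison at equal computational budget; the same histories can be compared with a batch proximal gradient run on the full objective.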