Parameter-free optimization, universal prediction, and Kolmogorov complexity


Project Description

Online parameter-free optimization algorithms are optimization algorithms that require no tuning of parameters, yet provably achieve optimal performance. The basic idea behind these algorithms is the link between universal strategies for prediction with log loss and online convex optimization, first unveiled by Orabona & Pal (2016). By now, this line of work has produced a number of parameter-free optimization algorithms. However, universal prediction with log loss is also connected to the concept of Kolmogorov complexity for binary strings, as shown by Cover (1974). In this project, we aim at studying all these links, going from parameter-free optimization to Kolmogorov complexity, passing through universal prediction with log loss. We want to construct explicit reductions that transform each of these problems into the others. The final aim is to directly link convex optimization to an appropriate notion of computational complexity for functions. Moreover, we want to extend some of these concepts from binary strings to strings of bounded real numbers. Given the theoretical nature of this project, the ideal candidate must have an excellent mathematical background.
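To give a flavor of the reduction mentioned above, the following is a minimal sketch of a one-dimensional parameter-free learner based on Krichevsky-Trofimov coin betting, in the spirit of Orabona & Pal (2016). The function name, the wealth initialization, and the toy absolute-loss objective are illustrative choices, not taken from the paper.

```python
def kt_coin_betting(subgrad, rounds, initial_wealth=1.0):
    """Parameter-free 1-D online learner via KT coin betting.

    `subgrad(w)` must return a subgradient in [-1, 1]; the learner
    treats c_t = -g_t as a coin outcome and bets a signed fraction
    (the KT estimator) of its current wealth. No learning rate is
    ever tuned. Returns the list of iterates w_1, ..., w_T.
    """
    wealth = initial_wealth
    coin_sum = 0.0          # sum of past coin outcomes c_s
    iterates = []
    for t in range(1, rounds + 1):
        beta = coin_sum / t          # KT betting fraction, always in (-1, 1)
        w = beta * wealth            # bet: the iterate played this round
        iterates.append(w)
        g = subgrad(w)
        c = -g                       # coin outcome for this round
        wealth += c * w              # wealth update; stays positive since |beta| < 1
        coin_sum += c
    return iterates


# Illustrative run on f(w) = |w - 1|: the iterates concentrate around
# the minimizer w* = 1 without any step-size tuning.
iters = kt_coin_betting(
    lambda w: -1.0 if w < 1.0 else (1.0 if w > 1.0 else 0.0),
    rounds=100,
)
```

Note that the guarantee relies on the subgradients being bounded in [-1, 1]; extending such schemes beyond the binary/bounded setting is precisely one direction of this project.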
Program - Computer Science
Division - Computer, Electrical and Mathematical Sciences and Engineering
Field of Study - Computer Science, Mathematics or a related discipline

About the Researcher

Francesco Orabona

Desired Project Deliverables

Original research - contribution to a research paper

RECOMMENDED STUDENT ACADEMIC & RESEARCH BACKGROUND

Knowledge of convex optimization
Knowledge of Kolmogorov complexity