
ADFOCS

22nd Max Planck Advanced Course on the Foundations of Computer Science


July 26 - August 13, 2021

Virtual event (hosted from Saarbrücken, Germany)

Convex Optimization and Graph Algorithms


The goal of this year's ADFOCS is for people with a traditional TCS background to learn the continuous optimization techniques that have become indispensable tools for modern graph algorithms, LP solvers, and numerous other applications.

Unlike previous editions of ADFOCS, this year's event will take place over the span of three weeks. There will be a talk every day from 16:00 to 18:00 CEST between July 26 and August 13. We will start with a primer week covering the very basics of continuous optimization (July 26 - July 30), followed by two weeks of talks by the invited speakers on more advanced continuous optimization and its use in graph algorithms (August 2 - 13).

The event will take place entirely online, and registration is free.

The summer school's scope is international, and its goal is to bring together leading researchers with participants from around the world at the graduate level and above.

Registration is now open (see the Registration tab), and a preliminary schedule is available (see the Program tab).


Alina Ene


Boston University

Adaptive gradient descent

Rasmus Kyng


ETH Zurich

Graphs, sampling, and iterative methods

Aaron Sidford


Stanford University

Optimization methods for maximum flow

ADFOCS is organized by Karl Bringmann, Alejandro Cassis, Cosmina Croitoru, Themis Gouleakis, Andreas Karrenbauer, Kurt Mehlhorn and Vasileios Nakos as part of the activities of the Algorithms and Complexity Group and the International Max Planck Research School of the Max Planck Institute for Informatics.

Contact

Please do not hesitate to contact us with any questions via email at adfocs@mpi-inf.mpg.de.

Program

This year the event takes place over the span of three weeks. We start with a primer week covering the basics of convex optimization, which provides the necessary background for the invited speakers' talks beginning in the second week.

Schedule


All times are CEST (UTC+2).

Week 1 (July 26 - July 30)

  • 16:00-18:00: Primer I (Mon), Primer II (Tue), Primer III (Wed), Primer IV (Thu), Primer V (Fri)

Week 2 (August 2 - August 6)

  • 15:00-16:00 exercise sessions: Alina Exercises I, Alina Exercises II, Rasmus Exercises I
  • 16:00-18:00: Alina I (Mon), Alina II (Tue), Alina III (Wed), Rasmus I (Thu), Rasmus II (Fri)

Week 3 (August 9 - August 13)

  • 15:00-16:00 exercise session: Rasmus Exercises II
  • 16:00-18:00: Aaron I (Mon), Aaron II (Tue), MPI talks (Wed), Rasmus III (Thu), Aaron III (Fri)
  • 18:00-19:00 exercise sessions: Aaron Exercises I, Aaron Exercises II

Lecture Topics

The topics of the lectures are the following:

Primer

  • Lecture I: Math background. Karl Bringmann.
  • Lecture II: Introduction to Convex Optimization. Vasileios Nakos.
  • Lecture III: Gradient Descent. Alejandro Cassis.
  • Lecture IV: Mirror Descent. Kurt Mehlhorn.
  • Lecture V: Spectral Graph Theory. Andreas Karrenbauer.
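To give a flavor of the primer material, here is a minimal gradient descent sketch. It is an illustration only: the objective, step size, and iteration count are our own choices, not taken from the lecture material.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x_{t+1} = x_t - step * grad(x_t)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Toy example: minimize f(x) = ||x - c||^2 / 2, whose gradient is x - c.
c = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: x - c, x0=np.zeros(2), step=0.5, iters=50)
# x_star converges to the minimizer c.
```

For smooth convex objectives like this one, a suitably small fixed step size already guarantees convergence; the lectures cover how adaptive and accelerated variants improve on this basic scheme.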

Alina Ene

  • Lectures I+II: Adaptive gradient descent for convex minimization
  • Lecture III: Adaptive mirror prox algorithms for variational inequalities

Rasmus Kyng

  • Lectures I+II: Laplacian solvers
  • Lecture III: p-norm regression with acceleration
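As a heavily simplified illustration of the kind of problem these lectures address, the sketch below solves a small graph Laplacian system L x = b with plain conjugate gradient. This is not the lectures' algorithm: fast Laplacian solvers achieve near-linear running time via preconditioning and sampling, which plain CG lacks. The graph and right-hand side are our own toy choices.

```python
import numpy as np

def laplacian(edges, n):
    """Build the Laplacian of an unweighted graph on n vertices."""
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1; L[v, v] += 1
        L[u, v] -= 1; L[v, u] -= 1
    return L

def conjugate_gradient(A, b, iters=200, tol=1e-12):
    """Plain CG for a symmetric PSD matrix A; b must lie in the range of A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    for _ in range(iters):
        rr = r @ r
        if rr < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        p = r + (r @ r / rr) * p
    return x

# Path graph on 4 vertices; b sums to zero, so L x = b is solvable.
edges = [(0, 1), (1, 2), (2, 3)]
L = laplacian(edges, 4)
b = np.array([1.0, 0.0, 0.0, -1.0])
x = conjugate_gradient(L, b)
# L @ x recovers b (up to the tolerance).
```

Because the Laplacian is singular (the all-ones vector is in its kernel), solvability requires the right-hand side to be orthogonal to the all-ones vector, i.e. to sum to zero, as in the example above.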

Aaron Sidford

  • Lectures I+II: First-order methods for undirected graphs
  • Lecture III: Interior point methods for directed graphs

Registration for the lectures is free and open to everyone. If you want to participate in the exercise sessions, you will need to provide either a motivation letter or a recommendation letter from your advisor. If you choose to do so, please mark the checkbox in the registration form; we will contact you about submitting the letter in due time.

If you are a student (including PhD students), please register here: student registration

Otherwise please register here: regular registration

Reading Material

Primer Week

  • Lecture 1: Math background. [K] 2 [V] 1,2,3
  • Lecture 2: Introduction to Convex Optimization. [B] 1 [V] 4,5
  • Lecture 3: Gradient Descent. [B] 3 [K] 3 [V] 6
  • Lecture 4: Mirror Descent. [B] 4 [V] 7
  • Lecture 5: Spectral Graph Theory. [K] 1,4,7

Code of Conduct

ADFOCS supports the recommendations of the SafeTOC report to combat harassment and discrimination in the Theory of Computing community.

By participating in the ADFOCS program, you agree to refrain from any demeaning, discriminatory, or harassing behavior and speech. More precisely, we will adhere to the guidelines recommended by the ACM/STOC code of conduct.

If at any time you feel unsafe, intimidated, or harassed during ADFOCS, please contact the organizers immediately and we will start an (anonymous) investigation.

The organizers reserve the right to remove any participant from the ADFOCS summer school.



