Concentration inequalities provide quantitative insight into how functions of (independent) random variables deviate from their expectation. In this course, we will discuss classical concentration inequalities, such as Hoeffding's and McDiarmid's, as well as recent extensions. In addition to the mathematical treatment, we will discuss applications in machine learning and potentially other fields. If there is sufficient interest, a second part of this course will be taught in Fall 2020.
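As an illustration of the kind of statement covered, Hoeffding's inequality is the prototypical example: if $X_1, \dots, X_n$ are independent random variables with $X_i \in [a_i, b_i]$ almost surely and $S_n = \sum_{i=1}^n X_i$, then for every $t > 0$,

$$\Pr\bigl(|S_n - \mathbb{E}[S_n]| \ge t\bigr) \;\le\; 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$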

Target group: PhD students from all years, postdocs, and anyone else who is interested.

Prerequisites: Strong background in probability and analysis.

Evaluation: pass/fail based on homework.

Teaching format: Classroom lectures, homework

ECTS: 3

Year: 2019

Track segment(s):
CS-AI Computer Science - Artificial Intelligence
DSSC-PROB Data Science and Scientific Computing - Probabilistic Models
MAT-PROB Mathematics - Probability

Teacher(s):
Christoph Lampert, Jan Maas

Teaching assistant(s):
