The goal of the course is to present fundamental concepts in Information Theory and describe their relevance to emerging problems in Data Science and Machine Learning. Specific topics include basic measures of information, compression and quantization, exponential families, maximum entropy distributions, and elements of statistical learning.
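As a taste of the "basic measures of information" topic, here is a minimal sketch (an illustration, not course material) computing Shannon entropy, the most fundamental of these measures:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 := 0).
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty ...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ... while a uniform distribution over 4 outcomes carries 2 bits.
print(shannon_entropy([0.25] * 4))   # 2.0
```

Entropy of this kind underlies both the compression and the maximum-entropy topics listed above.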
Target group: interns, PhD students of any year, postdocs, and anyone who is interested
Prerequisites: strong background in probability and linear algebra
Evaluation: Homework assignments (no final exam)
Teaching format: Two lectures per week with regular homework assignments
ECTS: 3
Year: 2021
Track segment(s):
CS-AI Computer Science - Artificial Intelligence
DSSC-PROB Data Science and Scientific Computing - Probabilistic Models
Teacher(s):
Marco Mondelli
Teaching assistant(s):
Alex Shevchenko