The goal of the course is to present fundamental concepts in Information Theory and describe their relevance to emerging problems in Data Science and Machine Learning. Specific topics include basic measures of information, compression and quantization, exponential families, maximum entropy distributions, elements of statistical signal processing, and optimal estimation.
Target group: interns, PhD students of any year, postdocs, anyone who is interested
Prerequisites: strong background in probability and linear algebra
Evaluation: Homework (no final exam)
Teaching format: Two lectures per week with regular homework
ECTS: 3
Year: 2020
Track segment(s):
CS-AI Computer Science - Artificial Intelligence
DSSC-PROB Data Science and Scientific Computing - Probabilistic Models
Teacher(s):
Marco Mondelli
Teaching assistant(s):