This course starts from the fundamentals of Bayesian inference and gradually progresses to more recent advances. The focus is on methods and algorithms close to the application level, while also providing a solid theoretical foundation.
The course will cover the following topics:
1. Probability and stochastic processes.
2. Prior distributions: improper priors, conjugate priors, and their posterior distributions.
3. Bayesian and frequentist hypothesis testing, inference for simple problems, model comparison, Bayesian decision theory.
4. Simulation methods for Bayesian computation: basic Monte Carlo, rejection sampling, importance sampling, MCMC.
5. Different forms of MCMC: Metropolis-Hastings, Gibbs sampling, slice sampling, Hamiltonian Monte Carlo.
6. Inference and the EM algorithm: Bayesian regression, latent variables and the EM algorithm, Gaussian mixture models.
7. Variational inference: mean-field approximation, the variational Bayesian approach to linear regression, sparse Bayesian learning.
8. A series of case studies, with a focus on biological problems.
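As a taste of the simulation material in topics 4 and 5, here is a minimal random-walk Metropolis-Hastings sketch in Python. This is not course code: the target density (a standard normal, specified by its log density) and the proposal step size are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0):
    """Random-walk Metropolis-Hastings in one dimension.

    Proposes x' = x + N(0, step^2) and accepts with probability
    min(1, target(x') / target(x)), computed in log space.
    """
    x = x0
    log_px = log_target(x)
    samples = []
    for _ in range(n_samples):
        x_new = x + random.gauss(0.0, step)   # symmetric Gaussian proposal
        log_px_new = log_target(x_new)
        # Accept/reject step; log(u) < log ratio is the usual numerically
        # stable form of u < target(x') / target(x).
        if math.log(random.random()) < log_px_new - log_px:
            x, log_px = x_new, log_px_new     # accept the proposal
        samples.append(x)                     # on rejection, repeat current x
    return samples

# Illustrative target: standard normal, log density up to an additive constant.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
sample_mean = sum(draws) / len(draws)         # should be close to 0
```

Because the proposal is symmetric, the Hastings correction term cancels and only the ratio of target densities appears in the acceptance probability.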

- Teacher: Matthew ROBINSON