Course notes: Computational Statistics and Machine Learning (Fall 2020 by Nicholas Zabaras via YouTube)


This post records my notes from the Computational Statistics and Machine Learning course taught by Nicholas Zabaras.

I came across this course by chance while searching for “particle filters” on YouTube. I found the course description fascinating, and it seems it could help me build infectious disease models (e.g. Simulation-based Inference for Epidemiological Dynamics).

The course covers selective topics on Bayesian scientific computing relevant to high-dimensional data-driven engineering and scientific applications. An overview of Bayesian computational statistics methods will be provided including Monte Carlo methods, exploration of posterior distributions, model selection and validation, MCMC and Sequential MC methods and inference in probabilistic graphical models. Bayesian techniques for building surrogate models of expensive computer codes will be introduced including regression methods for uncertainty quantification, Gaussian process modeling and others. The course will demonstrate these techniques with a variety of scientific and engineering applications including among others inverse problems, dynamical system identification, tracking and control, uncertainty quantification of complex multiscale systems, physical modeling in random media, and optimization/design in the presence of uncertainties.
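As a warm-up for the Monte Carlo methods the overview mentions, here is a minimal sketch (my own illustration, not from the course) of estimating an expectation by averaging over random samples:

```python
import numpy as np

# Monte Carlo estimate of E[f(X)] with f(x) = x^2 and X ~ Uniform(0, 1).
# The exact value is 1/3; the error shrinks at rate O(1/sqrt(n)).
rng = np.random.default_rng(42)
n = 100_000
x = rng.uniform(0.0, 1.0, size=n)
estimate = np.mean(x**2)  # ≈ 1/3
```

The same recipe — draw samples, average a function of them — underlies the posterior-exploration and Sequential MC methods listed above, just with cleverer sampling schemes.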

Lecture 1 - Introduction to Probability and Statistics

  • Discrete random variables
  • Joint probability
  • Bayes’ theorem
  • Conditional independence
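To make the Bayes’ theorem bullet concrete, a small worked example (my own, with hypothetical numbers) for a diagnostic-test setting:

```python
# Bayes' theorem on a discrete example: disease testing.
# All probabilities below are hypothetical, chosen for illustration.
p_disease = 0.01             # prior P(D)
p_pos_given_disease = 0.95   # likelihood P(+ | D)
p_pos_given_healthy = 0.05   # false-positive rate P(+ | not D)

# Marginal P(+) via the law of total probability
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(D | +) via Bayes' theorem
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos  # ≈ 0.161
```

Despite the accurate test, the posterior is only about 16% because the prior is so small — the standard illustration of why the prior matters.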

Lecture 2 - Introduction to Probability and Statistics (continued)

  • Random variables, CDF, mean and variance
  • Uniform, Gaussian, binomial, and Bernoulli distributions
  • The multinomial and multinoulli distributions, Poisson distribution
  • Empirical distribution
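The empirical distribution places mass 1/n on each observed sample, so sample averages estimate the corresponding population quantities. A quick sketch (my own example, not from the lectures):

```python
import numpy as np

# Draw samples from a known Gaussian, then compute empirical summaries.
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=1.0, size=10_000)

emp_mean = samples.mean()              # estimates the true mean 2.0
emp_var = samples.var()                # estimates the true variance 1.0
emp_cdf_at_2 = np.mean(samples <= 2.0)  # empirical CDF F_n(2), ≈ 0.5
```

Each empirical quantity converges to its true counterpart as the sample size grows (law of large numbers), which is the bridge from these distribution basics to the Monte Carlo methods later in the course.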
