For more information: http://www.math.ens-cachan.fr/version-francaise/formations/master-mva/contenus-/computational-statistics-223046.kjsp?RH=1242430202531
and https://mvamcmc.wordpress.com/
Lecturers: Gersende Fort (Télécom ParisTech), Eric Moulines, Jean-Baptiste Shiratti (Polytechnique)
See also http://datascience-x-master-paris-saclay.fr/le-master/structure-des-enseignements/
A recent survey places the Metropolis algorithm among the ten algorithms that have had the greatest influence on the development and practice of science and engineering in the 20th century (Beichl & Sullivan, 2000). This algorithm is an instance of a large class of sampling algorithms, known as Markov chain Monte Carlo (MCMC). These algorithms have played a significant role in statistics, signal and image processing, physics and computer science over the last two decades. There are several high-dimensional problems for which MCMC simulation is the only known general approach for providing a solution within a reasonable time.
In this course, we present an overview of MCMC techniques. We first introduce the basic algorithms (Metropolis-Hastings, Gibbs sampling) with examples from image processing and machine learning. We then cover some basic elements of the stability and convergence of Markov chains, with applications to the convergence of MCMC. At the end of the course, several "hot" topics in MCMC simulation are covered.
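To give a concrete feel for the basic algorithm named above, here is a minimal sketch of a random-walk Metropolis sampler (the symmetric-proposal special case of Metropolis-Hastings). All function and parameter names are illustrative, not taken from the course materials; the target here is a standard normal density known only up to a normalizing constant.

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1D unnormalized log-density.

    log_target: function returning log of the (unnormalized) target density.
    step: standard deviation of the Gaussian random-walk proposal.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal, so the Hastings ratio reduces
        # to the ratio of target densities.
        proposal = x + rng.gauss(0.0, step)
        log_alpha = log_target(proposal) - log_target(x)
        # Accept the move with probability min(1, exp(log_alpha)).
        if math.log(rng.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Example: sample from N(0, 1), whose log-density is -x^2/2 up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The chain's empirical mean and variance should be close to 0 and 1 respectively; in practice one would discard an initial burn-in and tune the step size to balance acceptance rate against mixing.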
20 hours of lectures and 10 hours of supervised mini-projects
- Course coordinator: Florence Tupin