[The course will be given in English.]
Optimization underpins many engineering problems, in which we are tasked with modelling a situation and finding the optimal decision in a variety of contexts. Take as examples finding the rectangle with the largest area among those sharing the same perimeter, or training a neural network optimally.
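As a small illustration of how such a question becomes an optimization problem (our own sketch, not part of the official course materials), the rectangle example can be written as a constrained maximization over the side lengths:
\[
\max_{x,\,y \ge 0} \; xy \quad \text{subject to} \quad 2(x + y) = P,
\]
where $P > 0$ is the prescribed perimeter; substituting $y = P/2 - x$ and setting the derivative of $x(P/2 - x)$ to zero yields the square $x = y = P/4$.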
Optimization is also very much part of everyday life, for instance when one has to "optimize" one's schedule to fit in all one's activities and still find time to study for the exams.
In this course, we are going to present key concepts and results in differentiable and convex continuous optimization in finite dimension.
(We will leave out discrete optimization (also known as operations research) and infinite-dimensional optimization (such as shape optimization).)
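For orientation (a generic template, not a formula taken from the course materials), the finite-dimensional problems studied here can typically be written in the form
\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad g_i(x) \le 0, \; i = 1, \dots, m,
\]
where the objective $f$ and the constraint functions $g_i$ are assumed differentiable and/or convex depending on the setting.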
We will focus on theoretical aspects of differentiable and convex optimization, while algorithms for solving optimization problems are discussed in the follow-up course OPT202 (taught by Andrea Simonetto). Other directions and applications of optimization are also presented in the SOD program (third year).
The goals of the course are to
• understand what optimization is: which types of problems are usually considered and how they are treated, what the basic assumptions are (such as convexity), and how to interpret the output of an optimization problem
• learn to model applications as optimization problems
• master the basic optimization models and methods
• acquire the necessary prerequisites for formulating, analyzing and using optimization algorithms.
All the course materials will be available on this page.
Note: this course counts for 3 ECTS credits toward the M1 in Applied Mathematics (M1-Mathématiques Appliquées).
- Teacher: Zacharie ALES
- Teacher: Sophie ALFEROFF
- Teacher: Frédérika AUGÉ-ROCHEREAU
- Teacher: Sorin-mihai GRAD
- Teacher: Houssem HADDAR
- Teacher: Mélanie LIMACHE GOMEZ
- Teacher: Alejandro REYMOND
- Teacher: Andrea SIMONETTO
- Teacher in charge of the course unit: Laurent BOURGEOIS