Natural language processing has given rise to innumerable industrial applications. While many new tasks have emerged in NLP and speech processing over the past few decades, the methods used to solve them have increasingly converged towards a unified modeling paradigm. In this course, we will use sequence-to-sequence modeling to explore state-of-the-art statistical machine learning methods (convolutional neural networks, recurrent neural networks, attention mechanisms, transformers) and apply them to major NLP and speech processing tasks (language modeling, machine translation, speech recognition, information extraction). Students should expect to gain an in-depth understanding of these methods through theoretical analysis and hands-on lab sessions. Grading will be based on a project carried out over the course of the class.

Topics to be covered
1. Recurrent Neural Networks
2. Hidden Markov Models
3. Attention Mechanisms
4. Transformers
5. Convolutional Neural Networks
6. Language Modeling
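
As an illustrative preview of the language modeling topic, the following sketch builds a count-based bigram language model over a toy corpus and samples sentences from it. The corpus and helper functions (next_word, generate) are hypothetical examples, not course materials; the neural models covered in the course replace these raw counts with learned parameters.

```python
import random
from collections import defaultdict

# Toy corpus; purely illustrative, not part of the course materials.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat saw the dog",
]

# Count bigram occurrences, with <s> and </s> marking sentence boundaries.
counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, cur in zip(tokens, tokens[1:]):
        counts[prev][cur] += 1

def next_word(prev):
    """Sample the next word from the empirical distribution P(w | prev)."""
    followers = counts[prev]
    words = list(followers)
    weights = [followers[w] for w in words]
    return random.choices(words, weights=weights)[0]

def generate():
    """Sample bigrams starting from <s> until the end marker </s> is reached."""
    word, output = "<s>", []
    while True:
        word = next_word(word)
        if word == "</s>":
            return " ".join(output)
        output.append(word)

if __name__ == "__main__":
    random.seed(0)
    for _ in range(3):
        print(generate())
```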
