Module Number: EI70360
Duration: 1 semester
Occurrence: Winter semester
Number of ECTS: 5
Professor in charge: Reinhard Heckel
Contact hours: 60
Students have to take a written exam of two hours duration. In the exam, the students answer questions on the machine learning concepts and algorithms covered in the course. The exam tests whether students understand and can adapt advanced machine learning techniques such as deep neural networks, and can analyze their performance, for example by giving simple bounds on their sample complexity or computational complexity. Lecture notes are permitted in the exam, but no computer is needed or allowed.
Analysis 1-3, introductory classes in Statistics or Probability Theory
The course introduces the theory and practice of advanced machine learning concepts and methods (such as deep neural networks). In particular we will discuss (statistical) learning theory, (deep) neural networks, first order optimization methods such as stochastic gradient descent and their analysis, the interplay of learning and optimization, empirical risk minimization and regularization, and modern views of machine learning in the overparameterized regime with deep neural networks. We also discuss automatic hyperparameter optimization, active learning, and aspects beyond performance such as fairness. We will start the lecture with a very brief review of the foundations of machine learning such as simple regression and classification methods, so that all students are on the same page.
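Two of the core concepts named above, empirical risk minimization and stochastic gradient descent, can be illustrated with a minimal sketch. The example below is illustrative only and not part of the course materials; all names, data, and hyperparameters are assumptions chosen for the demonstration.

```python
import numpy as np

# Illustrative sketch: empirical risk minimization via stochastic gradient
# descent (SGD) for least-squares linear regression. Hyperparameters and
# data are synthetic and chosen only for demonstration.

rng = np.random.default_rng(0)

# Synthetic data: y = X @ w_true + small noise
n, d = 200, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.01 * rng.normal(size=n)

def sgd_least_squares(X, y, lr=0.01, epochs=50):
    """Minimize the empirical risk (1/n) * sum_i (x_i^T w - y_i)^2 with SGD."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):  # one pass over the shuffled data
            # Gradient of the single-example squared loss at index i
            grad = 2.0 * (X[i] @ w - y[i]) * X[i]
            w -= lr * grad
    return w

w_hat = sgd_least_squares(X, y)
print(np.linalg.norm(w_hat - w_true))  # error should be small
```

Each inner step uses the gradient of a single example's loss rather than the full empirical risk, which is exactly the trade-off between computation and gradient noise that the course analyzes for first-order optimization methods.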
Upon successful completion of the module, students know the theoretical foundations of (advanced) machine learning algorithms and common optimization methods for machine learning, and how to develop and analyze such algorithms. Students are able to (i) apply advanced machine learning methods and build new ones by modifying existing methods (for example deep neural networks), (ii) develop and tune optimization algorithms for training such models, and (iii) rigorously analyze their performance, both with computational experiments and by proving generalization bounds and analyzing the convergence and computational complexity of training algorithms. In addition, upon successful completion, students are familiar with concepts beyond the traditional supervised learning setup, in particular active learning and aspects such as fairness.
Machine learning algorithms and methods are introduced and discussed during lectures, with a focus on the theory behind the methods, including recently developed results. Exercises with both theory and coding problems are handed out every second week, and whenever a new exercise is handed out, solutions for the previous one are distributed. We sometimes give deliberately open questions and problems, so that students practice adapting methods, building on existing ones, and develop an understanding of how to approach practical and research questions in the real world. The discussion session has an interactive format: it is a forum for asking specific questions about the exercises and the methods introduced in the lectures, and for discussing certain problems or parts of the lecture in more detail on the board, at the students' request during the session.
The material is presented on the board; code and algorithms are sometimes shown with a projector. Lecture notes and exercises are distributed.
We do not follow a textbook; lecture notes will be distributed. Helpful references include: ``The Elements of Statistical Learning'' by Hastie, Tibshirani, and Friedman; ``Machine Learning'' by Tom Mitchell; ``Foundations of Machine Learning'' by Mohri, Rostamizadeh, and Talwalkar; and ``Understanding Machine Learning: From Theory to Algorithms'' by Shalev-Shwartz and Ben-David.