Machine Learning

Prof. Fred Hamprecht

Machine learning seeks to make valid predictions, or to extract meaningful patterns, from data. Its potential motivates enterprises to collect vast amounts of data on our communication, consumption, daily routines, and more. An abundance of data, coupled with ample compute power, confers the power to do good or harm.

This course focuses on fundamental machine learning techniques for unsupervised learning (dimension reduction, visualization, cluster analysis) and supervised learning (regression, classification). This is not a course on deep learning; if you are looking for LSTMs, GANs and the latest buzzwords, you should look elsewhere. Exercises in Python will ask you to implement and/or apply the methods we survey.
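
To give a flavor of the kind of exercise, here is a minimal sketch of principal component analysis (the first topic on the schedule) using NumPy. The toy data, variable names, and the choice of projecting onto two components are made up purely for illustration and are not taken from an actual exercise sheet.

    import numpy as np

    # Toy data: 200 samples in 5 dimensions (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

    # Center the data, since PCA operates on the covariance structure.
    X_centered = X - X.mean(axis=0)

    # Eigendecomposition of the sample covariance matrix.
    cov = np.cov(X_centered, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(cov)

    # Sort components by decreasing explained variance.
    order = np.argsort(eigenvalues)[::-1]
    eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

    # Project onto the first two principal components.
    X_projected = X_centered @ eigenvectors[:, :2]
    print(X_projected.shape)  # (200, 2)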

Requirements: You should be comfortable with applied linear algebra and multivariate calculus, and be, or become, able to implement algorithms in Python. You do not need to be a genius to attend this course, but do expect a significant workload from the exercises and background reading. The course is taught mostly at the blackboard; there are no lecture notes, and you are expected to take your own. For physics students, this course is eligible as part of the MSc specialization "Computational Physics".

Exercises: We offer weekly exercise sheets that you can use to deepen your understanding of the topics discussed in the lectures. To get these corrected, please register for the lecture via MUESLI. Note that these exercise sheets are optional. In particular, you can attend the lecture and participate in the final exam without having handed in your exercise solutions. However, to maximize your benefit from the course we highly encourage you to solve them and to participate in the tutorial sessions.

When and Where

Lectures: Monday and Wednesday 11:15-13:00 in INF 227/HS2. First lecture: Monday, October 14th, 2019

Tutorials: Thursday and Friday 16:00-18:00 (c.t.) in INF 227/HS1 (Thu) and HS2 (Fri) (You should attend one of the two time slots.)

Written Exam: Wednesday, January 29th, 2020.

Resit exam (Wiederholungsprüfung): UPDATE: Friday, July 10th, 2020.

Your teacher and assistants

Fred Hamprecht has twenty years of experience in developing and applying machine learning methods, especially in computer vision.

Manuel Haussmann and Constantin Pape are a mathematician and a physicist, respectively, turned computer "rocket" scientists. Till Bungert was in your shoes two years ago and already tutored a similar course last year.

Tentative Schedule

We will go over unsupervised and supervised learning in several passes, in the following order (preliminary).

  • Principal Component Analysis (PCA)
  • Density estimation
  • Basic clustering techniques
  • Classification: k-NN, decision trees
  • Multivariate distributions
  • Classification: Quadratic Discriminant Analysis, statistical learning theory
  • Linear regression
  • Ridge regression, lasso
  • Gaussian Processes
  • Logistic regression, generalized linear models
  • Multi-layer perceptrons
  • Deep neural networks
  • Directed probabilistic graphical models
  • Hidden Markov Models
  • Kalman filter
  • Gaussian Mixture Model
  • Cluster analysis
  • Graph partitioning
  • Network analysis
  • Graph-based semi-supervised learning
  • Graph-based dimension reduction
  • Bilinear decompositions
  • Ethics of ML