Dr. Bogdan Savchynskyy, WiSe 2022/23
Summary
This lecture is part of the Master in Physics (specialization Computational Physics, code "MVSpec"), the Master of Applied Informatics (code "IOML"), and the Master of Mathematics programs, but it is also open to students of Scientific Computing and anyone else interested.
The course presents optimization techniques for important machine learning tasks such as inference and learning for graphical models in combination with neural networks. In particular, it addresses topics such as combinatorial algorithms, integer linear programming, scalable convex and non-convex optimization, convex duality theory, and the learning of combinatorial algorithms with neural networks. Graphical models serve as the working example throughout the course. The goal of the course is to provide a strong background for the analysis of existing, and the development of new, scalable optimization techniques for machine learning problems.
Schedule and Information
The lectures and exercises will be given in English.
Venue:
The first two lectures, on October 17 and 19, will be given in the Institut für Theoretische Physik, Kleiner Hörsaal (Philos.-weg 12 / kHS)!
All other lectures and exercises will take place in Mathematikon B (Berliner Str. 43), SR B128. Enter through the door on the Berliner Strasse side and ring the doorbell labelled "HCI am IWR" to be let in. The seminar room is on the 3rd floor.
Lecture schedule:
- Lecture: Mo, 11:00 – 13:00, the first lecture will be given on October 17
- Lecture: Wed, 14:00 – 16:00
- Exercises: Wed, 16:00 – 18:00, the first exercise will take place on October 26
Contact: Dr. Bogdan Savchynskyy, Sebastian Stricker (email available in Müsli)
If you contact me via email, the subject line should contain the tag [OfML]. Emails without this tag are very likely to be lost and therefore ignored!
The seminar Optimization in Machine Learning and Computer Vision complements this lecture by taking a closer look at recent results and developments. We highly recommend it to all students interested in the topic.
Registration
Please register for the course in Müsli.
Course Material and Exercises
Course material and exercises are regularly uploaded to HeiBox. Videos of the lectures given as a block course in winter term 2021/22 can be found on the respective web page.
Table of Contents
I Inference in Graphical Models
- Acyclic Graphical Models. Dynamic Programming
- Background: Basics of Linear Programs and Their Geometry
- Inference in Graphical Models as Integer Linear Program
- Background: Basics of Convex Analysis and Convex Duality
- Duality of the LP Relaxation of Inference Problem
- Background: Basics of Convex Optimization
- Sub-Gradient and Block-Coordinate Ascent for Inference in Graphical Models
- Lagrangian (Dual) Decomposition
- Min-Cut/Max-Flow Based Inference
- LP Relaxation of Inference Problem as st-Min-Cut Problem
- Summary: Inference Algorithm Selection
II Joint Learning of Graphical Models and Neural Networks
- Learning of Combinatorial Algorithms: General Concepts and Definitions
- Learning of Combinatorial Algorithms with Neural Networks: Structured Risk Minimization
- Learning of Combinatorial Algorithms with Neural Networks: Black-Box Differentiation
Literature
- Textbook: B. Savchynskyy. Discrete Graphical Models - An Optimization Perspective
- Learning of combinatorial algorithms: Black-Box Differentiation Method