Discrete Computation Graphs

Course Description

The enormous success of deep learning is partially due to the simplicity of the backpropagation algorithm, which allows one to efficiently compute the gradient of any loss function defined as a composition of differentiable functions. However, in a variety of problems originating in supervised, unsupervised, and reinforcement learning, the computation graph includes a collection of discrete components, such as discrete random variables, graphs, or even programs. By the end of this course, you will have learnt contemporary methods that allow efficient training of such models.
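To make the difficulty concrete: backpropagation cannot pass a gradient through a discrete sampling step, but the gradient of an expectation over a discrete random variable can still be estimated. A standard workaround (one of the families of methods a course like this covers) is the score-function (REINFORCE) estimator, which uses the identity ∇θ E[f(x)] = E[f(x) ∇θ log p(x; θ)]. The sketch below is illustrative only and not taken from the course materials; the function f and the parameter values are arbitrary choices for the example.

```python
import random

def score_function_grad(f, theta, n_samples=100_000, seed=0):
    """Monte Carlo estimate of d/dtheta E_{x ~ Bernoulli(theta)}[f(x)]
    using the score-function (REINFORCE) estimator:
        E[f(x) * d/dtheta log p(x; theta)]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # Sample the discrete random variable x in {0, 1}.
        x = 1 if rng.random() < theta else 0
        # Score: d/dtheta log Bernoulli(x; theta)
        #      = x/theta - (1 - x)/(1 - theta)
        score = x / theta - (1 - x) / (1 - theta)
        total += f(x) * score
    return total / n_samples

# For a Bernoulli, E[f(x)] = theta*f(1) + (1-theta)*f(0),
# so the exact gradient w.r.t. theta is f(1) - f(0).
f = lambda x: (x - 0.3) ** 2
est = score_function_grad(f, theta=0.6)
exact = f(1) - f(0)
```

Note that no derivative of f itself is needed, which is exactly why this estimator applies to discrete (non-differentiable) components; its well-known drawback is high variance, which motivates the more refined estimators studied in the course.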

Course topics

Bayesian deep learning, latent variable models, latent structure.

Course tools

Python, numpy, PyTorch.

Prerequisites

Hard requirements: You will need basic knowledge of linear algebra, calculus, probability theory, statistics (mainly concepts related to estimators), and Python. Optional, but desirable: neural networks, graphical models, variational inference, reinforcement learning, natural-language processing.


Lecturer

Serhii Havrylov

PhD student at the University of Edinburgh, Institute for Language, Cognition and Computation

Serhii is a PhD student at the ILCC, the University of Edinburgh. Previously, he worked on his thesis at the ILLC, University of Amsterdam. Before starting his PhD, Serhii worked as a research engineer for a little over three years. He has Bachelor's and Master's degrees in applied mathematics from the National Technical University of Ukraine "KPI".

Fields of interest: natural-language processing, Bayesian methods, deep learning.

Contacts

[email protected]
facebook.com/havrylov
linkedin.com/in/serhii-havrylov-666796a7