Dr Susan Wei,
The University of Melbourne

Neural Networks and Related Models

(Please note: This course is co-taught by Dr Robert Salomone)

This course is an introduction to deep learning as well as some probabilistic models involving neural networks (flow-based models and deep generative models).

Part I: Deep Learning Basics and Models

  • An Introduction to Neural Networks: key components of the deep learning pipeline, the multilayer perceptron, forward/backward propagation, computational graphs
  • Stochastic Optimization and Extensions
  • The Art of Model Training and Regularization: Model selection, weight decay, dropout, initialization
  • Convolutional Neural Networks and Recurrent Neural Networks

Part II: Variational Inference, Normalizing Flows, and Deep Generative Models

  • An Introduction to Variational Inference
  • Normalizing Flows (for both Variational Inference and Density Estimation)
  • Deep Generative Models: Variational Autoencoders & Generative Adversarial Networks.

Computational demonstrations will use the Python package PyTorch. The latter part of the course will involve demonstrations using the probabilistic programming language Pyro, which is built on top of PyTorch and adds primitives for probabilistic modelling and inference.
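As a flavour of the kind of computation the demonstrations cover, here is a minimal sketch of forward and backward propagation for a one-hidden-layer perceptron. It is written in plain NumPy rather than PyTorch so it is fully self-contained; the network shape, data, and gradient check are illustrative choices, not course materials. The manual chain-rule steps below are exactly what PyTorch's autograd automates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer perceptron: x -> tanh(x @ W1) -> (h @ W2) -> MSE loss
x = rng.normal(size=(4, 3))        # batch of 4 inputs, 3 features
y = rng.normal(size=(4, 1))        # regression targets
W1 = rng.normal(size=(3, 5))       # input -> hidden weights
W2 = rng.normal(size=(5, 1))       # hidden -> output weights

def forward(W1, W2):
    h = np.tanh(x @ W1)            # hidden activations
    yhat = h @ W2                  # predictions
    loss = np.mean((yhat - y) ** 2)
    return loss, h, yhat

# Backward pass: apply the chain rule node by node through the
# computational graph, from the loss back to the weights.
loss, h, yhat = forward(W1, W2)
dyhat = 2 * (yhat - y) / y.size    # dL/dyhat
dW2 = h.T @ dyhat                  # dL/dW2
dh = dyhat @ W2.T                  # dL/dh
dW1 = x.T @ (dh * (1 - h ** 2))    # dL/dW1, using tanh' = 1 - tanh^2

# Sanity check: compare one analytic gradient entry against a
# finite-difference approximation.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
numeric = (forward(W1p, W2)[0] - loss) / eps
print(abs(numeric - dW1[0, 0]) < 1e-4)
```

In PyTorch the backward section collapses to a single `loss.backward()` call, which is one reason the course uses it for the demonstrations.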

Prerequisites

Prior Mathematical Knowledge: Calculus, Linear Algebra, Probability, and Statistics at an advanced undergraduate level.
Prior Computational Experience: The computational aspects of this course use the programming language Python. Some familiarity with Python would be advantageous. Basic familiarity with programming is required.

Pre-reading

Dr Susan Wei


Susan is a Lecturer in the School of Mathematics and Statistics at the University of Melbourne. She obtained her PhD in Statistics in 2014 at UNC Chapel Hill, USA, under the direction of Professors J.S. Marron and Michael Kosorok. Prior to joining the University of Melbourne, she was a postdoctoral fellow at EPFL in Lausanne, Switzerland (2014-2016) and a tenure-track Assistant Professor at the University of Minnesota (2016-2018). She is currently an ARC DECRA fellow working on algorithmic fairness in deep learning. More broadly, she is interested in the theoretical underpinnings of deep learning through tools such as singular learning theory.