
Lecture 22, March 31st, 2016: Approximate Inference

In this lecture we will continue our discussion of probabilistic modelling and turn our attention to approximate inference.

Please study the following material in preparation for the class:

  • Chapter 19 of the Deep Learning Textbook on approximate inference.

In preparation for the following lecture, please also study this paper, which was already mentioned in class:

 


Ordering test set Dogs vs. Cats

The test set of the Dogs vs. Cats dataset in Fuel was ordered lexicographically, as strings, rather than numerically (i.e. 1, 10, 11, …, 2, … instead of 1, 2, …). Kaggle expects the latter order for submissions, so the test scores did not make sense. This was fixed in Fuel pull request #336. If you are using the cluster, the dataset has already been updated (if you need the old HDF5 file, it can be found under dogs_vs_cats.old.hdf5). If you run your own installation, please update Fuel and rerun fuel-convert dogs_vs_cats.
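As an illustration (a minimal Python sketch with made-up filenames, not the Fuel converter code itself), sorting filenames as strings scrambles the numeric order that Kaggle expects:

    # Hypothetical test-set filenames; the real test set is much larger.
    names = ["1.jpg", "2.jpg", "10.jpg", "11.jpg"]

    # Lexicographic (string) sort: "10.jpg" and "11.jpg" come before "2.jpg".
    print(sorted(names))
    # ['1.jpg', '10.jpg', '11.jpg', '2.jpg']

    # Sorting on the integer part of the name gives the order Kaggle expects.
    print(sorted(names, key=lambda n: int(n.split(".")[0])))
    # ['1.jpg', '2.jpg', '10.jpg', '11.jpg']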


Lecture 21, March 24th, 2016: RBMs and the Partition Function

In this lecture we will continue our discussion of probabilistic undirected graphical models such as the Restricted Boltzmann Machine.

Please study the following material in preparation for the class:

 


Project leaderboard and deadline

Some students have asked about the deadline for the class project. The deadline will be four weeks from now, on April 18 (the Monday after classes end).

As was mentioned in class, we have put up a leaderboard where we ask you to submit the results you achieved on either project (classification scores for Dogs vs. Cats, and samples or perplexity scores for the vocal synthesis task). Don’t wait until the end to do so; please post intermediate results as well!


Final Exam

The final exam is in-class and scheduled for the last day of class, Thursday April 14, 2016, at the usual 9:30-11:30 class time.

You are not allowed to use your phone or laptop (or any other device that can connect to the internet) during the exam, but you are allowed to bring your own 8.5×11 cheat sheet (two pages).

As a guide (though things might be different this year), here are some exams from previous years.

 

 


Lecture 20, March 21st, 2016: Graphical Models

In this lecture we will begin our discussion of probabilistic undirected graphical models.

Please study the following material in preparation for the class:

  • Lecture 5 (5.1 to 5.3) of Hugo Larochelle’s course on Neural Networks.
  • Chapter 16 of the Deep Learning Textbook (important background on probabilistic models).
  • Chapter 17 of the Deep Learning Textbook (Monte Carlo methods).

Other relevant material:

  • Lecture 11 of Geoff Hinton’s Coursera course on Neural Networks (from Hopfield nets to Boltzmann machines).

Lecture 19, March 17th, 2016: Representation learning

In this lecture we will step back and consider the general notion of representation learning, and see how it connects with transfer learning, multi-task learning, the importance of depth, and the need for broad priors.

Please study the following material in preparation for the class:

Other relevant material:

 


Lecture 18, March 14th, 2016: Autoencoders

In this lecture we will continue our discussion of unsupervised learning methods. We will study autoencoders in more detail.

Please study the following material in preparation for the class:

  • Lecture 6 (6.5 to 6.7) of Hugo Larochelle’s course on Neural Networks.
  • Chapter 14 of the Deep Learning Textbook.

Other relevant material:

 


Lecture 17, March 10th, 2016: Autoencoders

In this lecture we will begin our discussion of unsupervised learning methods. In particular, we will study a kind of neural network known as the autoencoder.

Please study the following material in preparation for the class:

Other relevant material:

  • Lecture 15a of Geoff Hinton’s Coursera course on Neural Networks.

 
