ELEN0062 - Introduction to machine learning (iML)

Random ML quote

With too little data, you won’t be able to make any conclusions that you trust. With loads of data you will find relationships that aren’t real… Big data isn’t about bits, it’s about talent.

Douglas Merrill



18 Sep. 2019
Assignment 25 Sep. 2019

First assignment

Python resources:

On how to present results

Q&A 02 Oct. 2019
Q&A 09 Oct. 2019
Q&A 16 Oct. 2019
Deadline 20 Oct. 2019

Don't forget to submit your first assignment

23 Oct. 2019

Second assignment (Antonio Sutera is the reference TA for this assignment)

Feedback 13 Nov. 2019

Feedback on the first assignment

17 Nov. 2019

Don't forget to submit your second assignment.

Assignment 20 Nov. 2019

See below for information regarding the third assignment

27 Nov. 2019 [Setup] Find a group, register for the third assignment, register on Kaggle, download the data, make the toy submission.
13 Dec. 2019 End of challenge
15 Dec. 2019 Don't forget to submit your report regarding the challenge.
Deadline TBA Presentations

Third assignment: the challenge

The third project is organized in the form of a challenge, in which you will compete against each other. All the relevant information can be found on the Kaggle platform that hosts the challenge.

The project is divided into four parts. All the deadlines can be found in the schedule section above.

  1. Setup for the project
    • Create an account on the Kaggle platform. Use your real name so that we can identify you.
    • Use the link given in the mail to enter the challenge. If you did not receive the email, contact us.
    • Form groups of two and register them on the submission platform.
    • Test the toy example.
  2. Propose the best model you can before the competition deadline.
  3. Submit an archive in tar.gz format on the submission platform, containing your source code and a report that describes the different steps of your approach and your main results. Use the same IDs as on the Kaggle platform. The report must contain the following information:
    • A detailed description of all the approaches that you have used to win the challenge, including the feature engineering you performed.
    • A detailed description of your hyper-parameter optimization approach and your model validation technique.
    • A detailed description of how you proceeded to estimate the AUC of your final models and a comparison with the actual value.
    • A table summarizing the performance of your different approaches, listing for each approach at least its name, the validation score, and the scores on the public and private leaderboards.
    • Any complementary information or figures that you want to mention.
  4. Succinctly present your approach to the rest of the class. (More information coming soon.)
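As an illustration of the AUC estimation mentioned in the report requirements, here is a minimal sketch using scikit-learn's cross-validation. The dataset and model below are made-up placeholders, not the challenge data or a recommended method:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary classification data as a stand-in for the challenge data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)

# Estimate the AUC with 5-fold cross-validation; this is the kind of
# validation score you would report next to the leaderboard scores.
auc_scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"estimated AUC: {auc_scores.mean():.3f} +/- {auc_scores.std():.3f}")
```

Comparing this cross-validated estimate with the public and private leaderboard scores is exactly the comparison asked for in the report.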

Have fun!

How to present data

Presenting data well is key to efficient communication. Here are a few pointers:
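For instance, a minimal matplotlib sketch with labeled axes and a legend (the curves below are made-up placeholders, only there to show the labeling):

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt
import numpy as np

# A plot is only as good as its labels: name the axes, give units where
# relevant, and add a legend whenever you compare several curves.
x = np.linspace(0, 5, 100)
fig, ax = plt.subplots()
ax.plot(x, np.exp(-x), label="train error")          # placeholder curve
ax.plot(x, np.exp(-x) + 0.1 * x, label="test error")  # placeholder curve
ax.set_xlabel("model complexity")
ax.set_ylabel("error")
ax.legend()
fig.savefig("learning_curves.png", dpi=150)
```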

Installing Anaconda

There are many ways to install Python on a computer and get all the libraries needed. One quick way is to install Anaconda, which comes with all the libraries we will need.

  1. Get the Anaconda installer for your operating system. Make sure you install a Python 3.5+ version.
  2. Open a Python console:
    • From a Unix command line: python
    • Or open the Spyder IDE, which comes with Anaconda
    Note that you can use the IPython interpreter, which is much easier to work with.
  3. Run the following commands:
    import numpy as np
    import pandas as pd
    import sklearn
    import scipy
    If no error occurs, the installation went fine.
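To double-check which versions Anaconda installed, you can also print them (a small sketch; the exact version numbers will vary with your installation):

```python
import numpy
import pandas
import scipy
import sklearn

# Print the installed version of each library;
# any recent Anaconda release is fine for the course.
for lib in (numpy, pandas, scipy, sklearn):
    print(lib.__name__, lib.__version__)
```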

Cheat sheet for ML in Python

Check out DataCamp for more.

Supplementary material

Here is a short list of supplementary material related to the field of machine learning. I tend to update this section when I come across interesting material, but if you feel you need more on some topic, do not hesitate to ask!

Machine learning in general

There is a ton of accessible online material in the domain of machine learning:

Linear regression

The geometry of Least Squares (1 variable)

Note that ANOVA is a special case of linear models in which the input variables are dummy one-hot class variables. Consequently, the basis vectors of the column space are orthogonal and the problem reduces to several one-variable least-squares problems.
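To see this numerically, here is a small sketch on made-up group data: with a one-hot design matrix, the full least-squares solution coincides with the per-group means, i.e. with solving each one-variable problem separately.

```python
import numpy as np

# Made-up response values for three groups.
y = np.array([1.0, 2.0, 3.0, 10.0, 12.0, 5.0, 5.0, 8.0])
groups = np.array([0, 0, 0, 1, 1, 2, 2, 2])

# One-hot design matrix: the columns are orthogonal because each
# sample belongs to exactly one group.
X = np.eye(3)[groups]

# Full least-squares solution on the one-hot design...
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# ...equals three independent one-variable problems: the group means.
means = np.array([y[groups == g].mean() for g in range(3)])
print(np.allclose(beta, means))  # True
```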

Artificial neural networks

There have been three waves of hype around ANNs. The first was about perceptrons in the 60s, until it was discovered that they could not solve the XOR problem. The second started with the discovery of backpropagation, but it soon became clear that large and/or deep neural nets were very hard to train. We are in the midst of the third one right now with "deep learning": neural nets with several (many) hidden layers. As a consequence, the Internet is bursting with resources on the topic, from the simplest models (the multi-layer perceptron) to the most advanced architectures (such as GANs), through more classical ones (such as ConvNets and LSTMs).
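To make the perceptron limitation concrete, here is a tiny sketch: no single-layer perceptron can compute XOR (the classes are not linearly separable), but a hand-wired two-layer network with step activations can. The weights below are chosen by hand for illustration, not learned:

```python
import numpy as np

def step(z):
    # Threshold activation: 1 if z >= 0, else 0.
    return (z >= 0).astype(int)

def xor_net(x):
    # Hidden layer: h1 fires if at least one input is on, h2 if both are.
    h1 = step(x[:, 0] + x[:, 1] - 0.5)
    h2 = step(x[:, 0] + x[:, 1] - 1.5)
    # Output: "at least one, but not both" = XOR.
    return h1 - h2

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(xor_net(X))  # [0 1 1 0]
```

The single hidden layer is what makes the difference: it maps the four points into a representation where the classes become linearly separable.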

Learning theory (Bias/Variance...)

Support Vector Machines

Unsupervised learning


There are many YouTube channels about ML. Here are a few:


Machine learning requires a solid background in maths, especially in linear algebra, (advanced) probability theory and (multivariable) calculus. There are even more resources on those than on deep learning. Here is a short selection, which emphasizes intuition.

Linear algebra

  • 3Blue1Brown's series on linear algebra
  • If you prefer paper (or PDF): Practical Linear Algebra: A Geometry Toolbox, 2nd Edition, by Gerald Farin and Dianne Hansford, A K Peters/CRC Press (2004)


Last modified on November 22 2019 09:13