The main goal of Machine Learning (ML) is to develop systems that autonomously adapt their behavior based on experience. ML offers some of the most effective techniques for knowledge discovery in large data sets and has played a fundamental role in areas such as bioinformatics, information retrieval, business intelligence and autonomous vehicle development.
This course studies the computational, mathematical and statistical foundations of ML, which are essential for the theoretical analysis of existing learning algorithms, the development of new algorithms and the well-founded application of ML to real-world problems.
1 Introduction
2 Generalization
2.1 Bayesian decision theory
2.2 Estimation
2.3 Linear models
2.4 Performance evaluation
3 Perception and representation
3.1 Feature extraction and selection
3.2 Kernel methods
3.3 Representation learning
4 Learning
4.1 Support vector learning
4.2 Random forest learning
4.3 Neural network learning
5 Discovering
5.1 Mixture densities
5.2 Latent topic models
5.3 Matrix factorization
6 Implementing
6.1 Experimental design
6.2 Large scale machine learning
Week | Topic | Material | Assignments |
---|---|---|---|
Aug 12-19 | 1 Introduction | Brief Introduction to ML (slides); Jeremy Howard: The wonderful and terrifying implications of computers that can learn (video) | Assignment 1 |
Aug 26 | 2.1 Bayesian decision theory | [Alp10] Chap 3 (slides) | |
Sep 2 | 2.2 Estimation | [Alp10] Chap 4 (slides); Bias and variance (IPython notebook) | Assignment 2 |
Sep 9 | 2.3 Linear models | [Alp10] Chap 10 (slides) | |
Sep 16 | 3.2 Kernel methods | Introduction to kernel methods (slides); [Alp10] Chap 13 (slides) | |
Sep 23 | 4.1 Support vector learning | [Alp10] Chap 13 (slides); An introduction to ML, Smola; Support Vector Machine Tutorial, Weston | Assignment 3 |
Sep 30 | 3.1 Feature extraction and selection | Feature Engineering, Léon Bottou (slides); [Alp10] Chap 6 (slides) | |
Oct 7 | 4.3 Neural network learning | [Alp10] Chap 11 (slides); Quick and dirty introduction to neural networks (IPython notebook) | |
Oct 14-21 | 3.3 Representation learning | Deep Learning, Andrew Ng (slides); Representation learning for histopathology image analysis, Arévalo and González (slides); Deep Learning Tutorial, Yann LeCun (slides); How we're teaching computers to understand pictures, Li Fei-Fei (slides) | Assignment 4 |
Oct 28 | 4.2 Random forest learning | [HTF09] Chap 15 (book); Random Forest and Boosting, Trevor Hastie (slides); Trees and Random Forest, Markus Kalisch (slides1, slides2) | |
Nov 4 | 5.1 Mixture densities | [Alp10] Chap 7 (slides) | |
Nov 11 | 5.2 Latent topic models; 5.3 Matrix factorization | Latent Semantic Analysis, CS158 Pomona College (slides); Latent Semantic Variable Models, Thomas Hofmann (videolecture); Non-negative Matrix Factorization for Multimodal Image Retrieval, Fabio González (slides) | |
Nov 18 | 5.3 Matrix factorization; 6.2 Large scale machine learning | Two-way Multimodal Online Matrix Factorization, Jorge Vanegas (slides); Online Kernel Matrix Factorization, Esteban Paez (slides) | |
Nov 25 | 6.1 Experimental design | [Alp10] Chap 19 (slides) | |