Courses given by Magda Gregorova
Module 6c: Data science I - Introduction to machine learning (spring 2019)
Part 1: Mathematical prerequisites
- Intro math quiz
- 1 Functions
- 2 Basic classes of functions
- HW1 - Functions
- 3 Sequences and series
- HW2 - Functions 2
- 4 Limits
- 5 Derivatives
- HW3 - Derivatives
- 6 Integrals
- 7 Multivariate functions
- G1 - Graph links
Part 2: Probability review
Part 3: Introduction to machine learning
HEG: Business Information Systems
Enterprise information system analysis 2 (spring 2019)
HEG: Business Administration
Leveraging the information system - Data mining (autumn 2018)
Module 7d: Data science II - Selected topics in ML (autumn 2019)
The lecture notes were written by the students following the course. Thanks to all of them for preparing these and for allowing them to be made public.
L1: Basics of probability theory (sets, probability axioms and properties, random variable, density and mass functions, expectation, variance)
scribe: Alex, helper: Lucas
P1: Expected number of coin flips - code
scribe: Flavio, helper: Loic
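To give a flavour of the exercise, here is a minimal simulation sketch in Python; it assumes the classic setup of flipping a fair coin until the first heads (whose expectation is 2), which may differ from the variant used in class.

```python
import random

def flips_until_heads(p=0.5):
    """Count flips of a coin with P(heads) = p until the first heads."""
    flips = 1
    while random.random() >= p:  # flip again as long as we see tails
        flips += 1
    return flips

# Monte Carlo estimate of the expectation; for a fair coin it converges to 2.
n_trials = 100_000
estimate = sum(flips_until_heads() for _ in range(n_trials)) / n_trials
print(f"estimated E[flips] = {estimate:.3f} (theory: 2)")
```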
L2: Multiple random variables (joint, marginal, conditional, Bayes rule, independence, covariance, correlation)
scribe: Loic, helper: Lucas
W2: Expectation, variance, joint, marginal, conditional probability
P2: Simpson's paradox
scribe: Alex, helper: Flavio
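For a quick illustration of the paradox, a sketch using the classic kidney-stone treatment numbers (Charig et al., 1986), not necessarily the example used in class: treatment A has the higher success rate within each stone-size group, yet treatment B has the higher rate in the pooled data.

```python
# (successes, total) for each treatment within each stone-size group.
data = {
    "small stones": {"A": (81, 87), "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

# Within each group, treatment A has the higher success rate.
for group, treatments in data.items():
    rates = {t: s / n for t, (s, n) in treatments.items()}
    print(group, {t: f"{r:.0%}" for t, r in rates.items()})

# Pooled over groups the ordering flips: B looks better overall.
for t in ("A", "B"):
    s = sum(data[g][t][0] for g in data)
    n = sum(data[g][t][1] for g in data)
    print(f"overall {t}: {s / n:.0%}")
```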
L3: Parametric distribution families, discrete - Bernoulli, binomial, uniform, categorical
scribe: Flavio, helper: Loic
P3: Conditional independence
scribe: Lucas, helper: Alex
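As a sketch of the definition at the heart of this exercise, the toy check below builds a joint distribution from made-up conditionals so that X and Y are conditionally independent given Z by construction, then verifies P(x, y | z) = P(x | z) P(y | z) numerically.

```python
from itertools import product

# Hypothetical conditionals, chosen so that X is independent of Y given Z.
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
p_y_given_z = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.4, 1: 0.6}}

# Build the joint P(X, Y, Z) from the factorization P(z) P(x|z) P(y|z).
joint = {(x, y, z): p_z[z] * p_x_given_z[z][x] * p_y_given_z[z][y]
         for x, y, z in product((0, 1), repeat=3)}

# Verify P(x, y | z) == P(x | z) * P(y | z) for every value combination.
for z in (0, 1):
    pz = sum(v for (x, y, zz), v in joint.items() if zz == z)
    for x, y in product((0, 1), repeat=2):
        lhs = joint[(x, y, z)] / pz
        rhs = p_x_given_z[z][x] * p_y_given_z[z][y]
        assert abs(lhs - rhs) < 1e-12
print("X and Y are conditionally independent given Z")
```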
L4: Gaussian and continuous uniform distributions, likelihood, maximum likelihood estimation
scribe: Lucas, helper: Flavio
W4: Likelihood jupyter notebooks
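In the spirit of these notebooks, a minimal sketch of maximum likelihood estimation for a univariate Gaussian on synthetic data; it uses the closed-form MLEs (the sample mean, and the sample variance normalized by n) and a hand-written log-likelihood, so details may differ from the course notebooks.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=1000)  # synthetic sample

# Closed-form Gaussian MLEs: sample mean, variance normalized by n.
mu_hat = data.mean()
sigma2_hat = ((data - mu_hat) ** 2).mean()

def log_likelihood(x, mu, sigma2):
    """Gaussian log-likelihood of the sample x under parameters (mu, sigma2)."""
    return -0.5 * np.sum(np.log(2 * np.pi * sigma2) + (x - mu) ** 2 / sigma2)

print(f"mu_hat = {mu_hat:.3f}, sigma2_hat = {sigma2_hat:.3f}")
# The MLE attains at least as high a log-likelihood as the true parameters.
print(log_likelihood(data, mu_hat, sigma2_hat) >= log_likelihood(data, 3.0, 4.0))
```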
P4: Density estimation of salaries of basketball players
scribe: Loic, helper: Alex
L5: Clustering - k-means
scribe: Alex, helper: Loic
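A compact sketch of Lloyd's algorithm for k-means on toy 2-D data; the initialization and stopping criteria are simple choices made here and may differ from the version presented in the lecture.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate assignment and centroid update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random init
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster.
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break  # converged
        centroids = new_centroids
    return labels, centroids

# Two well-separated blobs as a toy example.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids.round(2))
```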
P5: Distance / proximity measures
scribe: Flavio, helper: Lucas
L6: Clustering - agglomerative hierarchical, DBSCAN, Gaussian mixture (EM)
scribe: Flavio, helper: Alex
P6: Expectation maximization for Gaussian mixture - jupyter notebook
scribe: Lucas, helper: Loic
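Finally, since P6 implements EM for a Gaussian mixture, a bare-bones 1-D sketch of the two alternating steps (E-step: posterior responsibilities; M-step: responsibility-weighted parameter updates); the course notebook is likely more general and more careful about initialization.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_gmm_1d(x, k=2, n_iters=50, seed=0):
    """EM for a 1-D Gaussian mixture with k components."""
    rng = np.random.default_rng(seed)
    pi = np.full(k, 1 / k)                      # mixing weights
    mu = rng.choice(x, size=k, replace=False)   # init means from the data
    var = np.full(k, x.var())                   # init variances
    for _ in range(n_iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i).
        r = pi * gaussian_pdf(x[:, None], mu, var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted updates of pi, mu, var.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Toy data drawn from two Gaussian components.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_gmm_1d(x))
```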