Credits
6
Types
- BDMA: Compulsory
- MDS: Compulsory
Requirements
This subject has no requirements, but it does assume certain previous capacities.
Department
CS
The course is divided into conceptual parts corresponding to several kinds of fundamental tasks: supervised learning (classification and regression) and unsupervised learning (clustering, density estimation). Specific modelling techniques studied include artificial neural networks and support vector machines. An additional goal is to become acquainted with Python and its powerful machine learning libraries.
Teachers
Person in charge
- Marta Arias Vicente ( marias@cs.upc.edu )
Others
- Manel Gil Sorribes ( manel.gil.sorribes@upc.edu )
Weekly hours
Theory
1.9
Problems
0
Laboratory
1.9
Guided learning
0
Autonomous learning
6.86
Competences
Information literacy
Third language
Entrepreneurship and innovation
Basic
Generic
Specifics
Objectives
- Formulate the problem of (machine) learning from data, and know the different machine learning tasks, goals and tools.
Related competences: CB6, CB7, CB8, CB10
- Ability to decide on, defend and criticize a solution to a machine learning problem, arguing the strengths and weaknesses of the approach; additionally, ability to compare, judge and interpret a set of results after making a hypothesis about a machine learning problem.
Related competences: CT4, CT5, CT1, CG2, CE6, CE7, CE10, CE12, CE13, CB6, CB7, CB8, CB9, CB10
- Ability to solve concrete machine learning problems with available open-source software.
Related competences: CT4, CT5, CG2, CE6, CE7, CE10, CE12, CE13, CB6, CB9
Contents
- Introduction to Machine Learning
General information and basic concepts. Overview of the problems tackled by machine learning techniques. Supervised learning (classification and regression), unsupervised learning (clustering and density estimation) and semi-supervised learning (reinforcement and transductive). Examples.
- Supervised machine learning theory
The supervised machine learning problem setup. Classification and regression problems. Bias-variance tradeoff. Regularization. Overfitting and underfitting. Model selection and resampling methods.
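Overfitting, underfitting and model selection by resampling can be illustrated with a small sketch. This is not course material, just a toy example on synthetic data: polynomial models of increasing degree are fit to noisy samples of a smooth function, and a held-out validation set is used to pick the degree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth target function
x = rng.uniform(-1, 1, 60)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, x.size)

# Hold out a validation set for model selection
x_tr, y_tr = x[:40], y[:40]
x_va, y_va = x[40:], y[40:]

def val_error(degree):
    """Fit a polynomial of the given degree by least squares
    and return its mean squared error on the validation set."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    pred = np.polyval(coeffs, x_va)
    return float(np.mean((pred - y_va) ** 2))

# Low degrees underfit, high degrees overfit; the validation
# error selects a model of intermediate complexity
errors = {d: val_error(d) for d in (1, 3, 9, 15)}
best = min(errors, key=errors.get)
```

Cross-validation generalizes this idea by averaging the validation error over several train/validation splits instead of a single one.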
- Linear methods for regression
Error functions for regression. Least squares: analytical and iterative methods. Regularized least squares. The Delta rule. Examples.
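The analytical and iterative routes to least squares can be contrasted in a few lines. A minimal sketch on made-up synthetic data, not the course's own code: the closed-form (regularized) normal equations versus batch gradient descent on the squared error (the Delta rule in batch form).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: y = 1 + 2*x1 - 3*x2 + noise
X = rng.normal(size=(200, 2))
y = 1 + 2 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 0.1, 200)

# Design matrix with a bias column of ones
Xb = np.hstack([np.ones((200, 1)), X])

# Analytical: regularized normal equations
# w = (X^T X + lambda * I)^{-1} X^T y
lam = 1e-3
w_exact = np.linalg.solve(Xb.T @ Xb + lam * np.eye(3), Xb.T @ y)

# Iterative: batch gradient descent on the mean squared error
w = np.zeros(3)
eta = 0.01
for _ in range(2000):
    grad = Xb.T @ (Xb @ w - y) / len(y)  # gradient of the MSE
    w -= eta * grad
```

With this small regularizer both solutions agree closely, and the gradient-descent weights approach the generating coefficients (1, 2, -3).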
- Linear methods for classification
Error functions for classification. The perceptron algorithm. Novikoff's theorem. Separations with maximum margin. Generative learning algorithms and Gaussian discriminant analysis. Naive Bayes. Logistic regression. Multinomial regression.
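The perceptron algorithm itself fits in a few lines. A toy sketch on synthetic, linearly separable data (not course-provided code): on every mistake the weight vector is nudged by the misclassified example, and Novikoff's theorem guarantees this loop terminates on separable data, with at most (R / gamma)^2 mistakes for margin gamma and data radius R.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two well-separated Gaussian clouds, labels in {-1, +1}
X_pos = rng.normal(loc=[2, 2], scale=0.5, size=(50, 2))
X_neg = rng.normal(loc=[-2, -2], scale=0.5, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 50 + [-1] * 50)

# Perceptron: on each mistake, add y_i * x_i to the weights
w, b = np.zeros(2), 0.0
mistakes = 0
converged = False
while not converged:
    converged = True
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:  # misclassified (or on the boundary)
            w += yi * xi
            b += yi
            mistakes += 1
            converged = False

# After convergence, every training point is classified correctly
preds = np.sign(X @ w + b)
```

The separating hyperplane found this way is not the maximum-margin one; that refinement is what support vector machines add later in the course.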
- Artificial neural networks
Artificial neural networks: the multilayer perceptron and a peek into deep learning. Application to classification and regression problems.
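A multilayer perceptron trained by backpropagation can be sketched from scratch. This is an illustrative toy, with made-up hyperparameters, on the classic XOR problem, which no linear classifier can solve: one tanh hidden layer, a sigmoid output, and manual gradients of the squared error.

```python
import numpy as np

rng = np.random.default_rng(3)

# XOR: not linearly separable, so a hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 8 tanh units, sigmoid output
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

losses = []
eta = 0.5
for _ in range(3000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))
    # Backward pass: gradients of the mean squared error
    dz2 = 2 * (p - y) * p * (1 - p) / len(X)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient-descent update
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1
```

Deep learning frameworks automate exactly this backward pass; writing it once by hand makes clear what they compute.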
- Kernel functions and support vector machines
Definition and properties of kernel functions. Support vector machines for classification and regression problems.
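The defining property of a kernel function, that its Gram matrix is symmetric positive semidefinite, can be checked numerically. A small sketch with made-up data and the Gaussian (RBF) kernel:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(30, 3))

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel: k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

K = rbf_kernel(X, X)

# A valid kernel's Gram matrix is symmetric positive semidefinite
symmetric = bool(np.allclose(K, K.T))
eigmin = float(np.linalg.eigvalsh(K).min())  # >= 0 up to round-off
```

This matrix of pairwise similarities is all a kernelized algorithm such as the SVM ever needs; the feature space itself is never constructed explicitly.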
- Unsupervised machine learning
Unsupervised machine learning techniques. Clustering algorithms: the EM algorithm and the k-means algorithm.
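The two alternating steps of k-means can be sketched directly. A toy implementation on synthetic blobs, not the course's reference code: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points.

```python
import numpy as np

rng = np.random.default_rng(5)

# Three well-separated Gaussian blobs in the plane
centers = np.array([[0, 0], [6, 0], [3, 6]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.5, (40, 2)) for c in centers])

def kmeans(X, k, n_iter=50, seed=0):
    local_rng = np.random.default_rng(seed)
    # Initialize centroids as k distinct data points
    mu = X[local_rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest centroid for each point
        d = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Update step: each centroid moves to the mean of its points
        # (an empty cluster keeps its previous centroid)
        mu = np.array([X[labels == j].mean(0) if np.any(labels == j)
                       else mu[j] for j in range(k)])
    return labels, mu

labels, mu = kmeans(X, 3)
```

The EM algorithm for Gaussian mixtures follows the same alternating pattern, but with soft (probabilistic) assignments in place of the hard argmin.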
- Ensemble methods
Bagging and boosting methods, with an emphasis on Random Forests.
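Bagging (bootstrap aggregating) can be sketched with the simplest possible base learner. A toy example on made-up 1-D data: each decision stump is trained on a bootstrap resample, and predictions are combined by majority vote, which is the same recipe Random Forests apply to full decision trees with extra feature randomness.

```python
import numpy as np

rng = np.random.default_rng(6)

# 1-D data: the positive class lies to the right of threshold 0,
# with 15% of the labels flipped to simulate noise
X = rng.normal(0, 2, 300)
y = (X > 0).astype(int)
flip = rng.random(300) < 0.15
y[flip] = 1 - y[flip]

def fit_stump(X, y):
    """Return the threshold minimizing the training error of the
    rule 'predict 1 iff x > threshold'."""
    best_t, best_err = None, np.inf
    for t in np.unique(X):
        err = np.mean((X > t).astype(int) != y)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Bagging: train each stump on a bootstrap resample
thresholds = []
for _ in range(25):
    idx = rng.integers(0, len(X), len(X))  # sample with replacement
    thresholds.append(fit_stump(X[idx], y[idx]))

# Aggregate by majority vote over the 25 stumps
votes = np.mean([(X > t).astype(int) for t in thresholds], axis=0)
bagged_pred = (votes > 0.5).astype(int)
accuracy = float(np.mean(bagged_pred == y))
```

Averaging many high-variance learners trained on perturbed datasets is what stabilizes the ensemble; boosting instead trains learners sequentially, reweighting the examples the previous ones got wrong.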
Activities
Teaching methodology
The course introduces the most important concepts in machine learning and its most relevant techniques, with a solid foundation in mathematics. All the theory and concepts are illustrated and accompanied by real-world examples and code using open-source libraries. The theory is introduced in lectures where the teacher presents the concepts; during the lab sessions, students see many examples of how to apply the methods and theory learned, and code their own solutions to exercises proposed by the teacher.
Students have to work on a course project using a real-world dataset.
Evaluation methodology
The course is graded as follows:
P = mid-term exam grade
F = final exam grade
L = practical work grade
final grade = 25% P + 50% F + 25% L
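The weighting above is straightforward to apply; the scores in the example below are made up for illustration, assuming the usual 0-10 scale.

```python
def final_grade(P, F, L):
    """Combine mid-term (P), final exam (F) and practical work (L)
    scores with the course's 25% / 50% / 25% weighting."""
    return 0.25 * P + 0.50 * F + 0.25 * L

# Hypothetical scores on a 0-10 scale
print(final_grade(7.0, 8.0, 9.0))  # → 8.0
```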
Bibliography
Basic
- Bishop, C.M. Pattern recognition and machine learning. Springer, 2006. ISBN: 0387310738. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003157379706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Cherkassky, V.S.; Mulier, F. Learning from data: concepts, theory, and methods. John Wiley, 2007. ISBN: 9780471681823. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003624509706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Alpaydin, E. Introduction to machine learning. The MIT Press, 2020. ISBN: 9780262043793. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004193529706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Murphy, K.P. Machine learning: a probabilistic perspective. MIT Press, 2012. ISBN: 9780262018029. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003972109706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Complementary
- Haykin, S.S. Neural networks and learning machines. Prentice Hall, 2009. ISBN: 9780131471399. https://discovery.upc.edu/discovery/fulldisplay?docid=alma91003533949706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Hastie, T.; Tibshirani, R.; Friedman, J. The elements of statistical learning: data mining, inference, and prediction. Springer, 2009. ISBN: 9780387952840. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003549679706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern classification. John Wiley & Sons, 2001. ISBN: 9780471056690. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002131619706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Web links
- Official website of scikit-learn, a popular machine learning library for Python that will be used extensively in the course: https://scikit-learn.org/stable/
Previous capacities
Elementary notions of probability and statistics. Elementary linear algebra and real analysis.
Good programming skills in a high-level language