Credits
6
Types
Elective
Requirements
This subject has no prerequisites, but it does assume previous capacities (recommended prior background).
Department
CS
statistical uncertainty. Machine learning is a meeting point of different disciplines: multivariate statistics, algorithms and mathematical optimization, among others.
The subject reviews some foundations and then delves into various modern non-linear learning techniques ranging from modern neural networks to advanced kernel-based learning methods and the latest developments in ensemble methods. It also aims to provide a rather unified view of the area and possible future prospects.
Teachers
Person in charge
- Luis Antonio Belanche Muñoz ( belanche@cs.upc.edu )
Others
- Jamie Arjona Martínez ( jamie.arjona@upc.edu )
Weekly hours
Theory
3.2
Problems
0
Laboratory
1
Guided learning
0
Autonomous learning
7.38
Competences
Information literacy
Third language
Basic
Generic
Specific
Objectives
- Advanced machine learning methods
  Related competences: CT4, CT5, CE8, CE9, CE10, CB6, CB10
- Bayesian statistics
  Related competences: CT4, CT5, CE5, CE8, CE10, CB7
- Optimization of neural networks and support vector machines
  Related competences: CT4, CT5, CG2, CE3, CE5, CE11, CE8, CE9, CE10, CB6, CB7, CB10
- Linear models and generalized nonparametric linear models for regression
  Related competences: CT5, CE5, CE10, CB10
- Data cleaning
  Related competences: CT4, CG2, CE3, CE11, CE8, CE9, CB6
Contents
- Theoretical refresher of machine learning. Introduction to Bayesian machine learning.
  Introduction to Bayesian thinking for machine learning. Learning by solving a regularized problem. Illustrative example.
- Learning in functional spaces
  Reproducing kernel Hilbert spaces. The representer theorem. Example 1: kernel ridge regression. Example 2: the Perceptron and the kernel Perceptron.
- Fundamental kernel functions in R^d
  Description and demonstration of fundamental kernel functions in R^d. Polynomial and Gaussian kernels. General properties of kernel functions.
- The support vector machine for classification, regression and novelty detection
  The support vector machine (SVM) is the flagship kernel method. Its versions for classification, regression and novelty detection are fully explained and demonstrated.
- Kernel functions for different data types
  Some kernel functions for different data types are presented and demonstrated, such as kernels for text, trees, graphs, categorical variables and many others.
- Other kernel-based learning algorithms
  Additional kernel-based learning methods, such as kernel PCA and kernel FDA, are explained and illustrated with several application examples.
- Introduction to deep neural networks. Autoencoders and Variational Autoencoders.
  Reminder of fundamental neural network theory and optimization, qualitative description, loss functions, activation functions, regularization and best practices. Autoencoders and Variational Autoencoders.
- Special networks: (New) Hopfield neural networks and Kolmogorov-Arnold networks (KANs)
- Ensemble methods: baggers, boosters and stackers
  This activity covers the basic and modern developments in ensemble methods, including baggers, boosters and stackers.
- Advanced and hybrid techniques in deep networks and kernel methods
  Other methods are briefly introduced, such as the RVM and Gaussian processes, Nyström acceleration and random Fourier features, deep kernel learning, and possibly others.
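The contents above mention kernel ridge regression as the first worked example of learning in a reproducing kernel Hilbert space. A minimal sketch of that idea, using only NumPy (all function names, the Gaussian kernel width `sigma`, and the regularization strength `lam` are illustrative choices, not the course's official code):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq_dist = (np.sum(X**2, axis=1)[:, None]
               + np.sum(Y**2, axis=1)[None, :]
               - 2.0 * X @ Y.T)
    return np.exp(-sq_dist / (2.0 * sigma**2))

def kernel_ridge_fit(X, y, lam, sigma):
    """By the representer theorem, the minimizer of the regularized risk has the
    form f(x) = sum_i alpha_i k(x_i, x); the coefficients solve (K + lam*I) alpha = y."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, sigma):
    """Evaluate the fitted function at new points via the kernel expansion."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy regression: fit a noisy sine wave and evaluate on the training inputs.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(40)
alpha = kernel_ridge_fit(X, y, lam=0.01, sigma=0.5)
pred = kernel_ridge_predict(X, alpha, X, sigma=0.5)
```

With a small `lam` the fit nearly interpolates the training data; increasing it trades training error for smoothness, which is the regularized-problem view introduced in the first content block.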
Activities
Activity Evaluation act
Theoretical lectures
Objectives: 1 2 4 3
Contents:
- 1. Theoretical refresher of machine learning. Introduction to Bayesian machine learning.
- 2. Learning in functional spaces
- 3. Fundamental kernel functions in R^d
- 4. The support vector machine for classification, regression and novelty detection
- 5. Kernel functions for different data types
- 6. Other kernel-based learning algorithms
- 7. Introduction to deep neural networks. Autoencoders and Variational Autoencoders.
- 8. Special networks: (New) Hopfield neural networks and KANs
- 10. Advanced and hybrid techniques in deep networks and kernel methods
Theory
40h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
25h
Practice lectures
Objectives: 1 3 5
Contents:
- 1. Theoretical refresher of machine learning. Introduction to Bayesian machine learning.
- 3. Fundamental kernel functions in R^d
- 4. The support vector machine for classification, regression and novelty detection
- 5. Kernel functions for different data types
- 6. Other kernel-based learning algorithms
- 7. Introduction to deep neural networks. Autoencoders and Variational Autoencoders.
- 8. Special networks: (New) Hopfield neural networks and KANs
Theory
0h
Problems
0h
Laboratory
16h
Guided learning
0h
Autonomous learning
16h
Term project
Objectives: 1 2 4 3 5
Contents:
- 1. Theoretical refresher of machine learning. Introduction to Bayesian machine learning.
- 2. Learning in functional spaces
- 3. Fundamental kernel functions in R^d
- 4. The support vector machine for classification, regression and novelty detection
- 5. Kernel functions for different data types
- 6. Other kernel-based learning algorithms
- 7. Introduction to deep neural networks. Autoencoders and Variational Autoencoders.
- 8. Special networks: (New) Hopfield neural networks and KANs
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
32h
Teaching methodology
The course delves into the most important machine learning paradigms with a solid foundation in probability, statistics and mathematics. The theory is introduced in lectures, where the teacher presents the concepts. These concepts are put into practice in the laboratory classes, in which students learn to develop machine learning solutions to real problems of a certain complexity. Students must also complete and deliver a term project.
Evaluation methodology
The course is graded as follows:
F = Grade of the final exam
P1, P2, P3 = Grade of the practical works (1, 2 and 3)
Final grade = 25% F + 25% P1 + 25% P2 + 25% P3
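The weighting can be checked with a short script (the grade values below are made-up illustrations on the usual 0-10 scale, not real data):

```python
def final_grade(F, P1, P2, P3):
    """Final grade = 25% F + 25% P1 + 25% P2 + 25% P3 (equal weights)."""
    return 0.25 * (F + P1 + P2 + P3)

# Example with hypothetical grades on the 0-10 scale.
print(final_grade(8.0, 7.0, 9.0, 6.0))  # → 7.5
```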
Bibliography
Basic
- Pattern recognition and machine learning
  Bishop, Christopher M. Springer, 2006. ISBN: 0387310738
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003157379706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Kernel methods for pattern analysis
  Shawe-Taylor, John; Cristianini, Nello. Cambridge University Press, 2004. ISBN: 0521813972
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002747459706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Neural networks and deep learning: a textbook
  Aggarwal, Charu C. Springer, 2023. ISBN: 9783031296420
  https://ebookcentral-proquest-com.recursos.biblioteca.upc.edu/lib/upcatalunya-ebooks/detail.action?pq-origsite=primo&docID=30620507
- Deep learning: foundations and concepts
  Bishop, Christopher M.; Bishop, Hugh. Springer, 2024. ISBN: 9783031454677
  https://ebookcentral-proquest-com.recursos.biblioteca.upc.edu/lib/upcatalunya-ebooks/detail.action?pq-origsite=primo&docID=30853138
- Ensemble Methods: Foundations and Algorithms
  Zhou, Zhi-Hua. CRC Press, 2025. ISBN: 9781003587774
  https://www-taylorfrancis-com.recursos.biblioteca.upc.edu/books/mono/10.1201/9781003587774/ensemble-methods-zhi-hua-zhou