Credits
6
Types
Elective
Requirements
This subject has no prerequisites, but previous capacities are assumed.
Department
EIO;CS
Weekly hours
Theory
3
Problems
0
Laboratory
0
Guided learning
0.2
Autonomous learning
6
Objectives
- Understand the foundations of Kernel-Based Learning Methods
  Related competences: CG3, CEC1, CEC3, CTR6
- Get acquainted with specific kernel-based methods, such as the Support Vector Machine
  Related competences: CG3, CTR4
- Know methods for kernelizing existing statistical or machine learning algorithms
  Related competences: CTR6
- Know the theoretical foundations of kernel functions and kernel methods
  Related competences: CG3
- Know the structure of the main unsupervised learning problems
  Related competences: CG3, CEC1, CTR4, CTR6
- Learn different methods for dimensionality reduction when the standard assumptions of classical Multivariate Analysis are not fulfilled
  Related competences: CG3, CEC1, CEC3, CTR4, CTR6
- Learn how to combine dimensionality reduction techniques with prediction algorithms
  Related competences: CG3, CEC1, CEC3, CTR4, CTR6
Contents
- Introduction to Kernel-Based Learning
  This topic introduces the student to the foundations of Kernel-Based Learning, focusing on Kernel Linear Regression.
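As a rough preview of this topic, the sketch below implements kernel ridge regression in Python/NumPy (the course itself uses R; the data, kernel choice, and parameter values here are invented for illustration and are not prescribed by the course):

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and X2."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=0.5):
    """Solve (K + lam*I) alpha = y; predictions are then K(x_new, X) @ alpha."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=0.5):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy 1-D regression problem: learn y = sin(x) from 40 samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel()
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, alpha, X)
print(np.mean(np.abs(pred - y)))  # small training error on this smooth target
```

The point of the sketch is the kernel trick: the regression is linear in the implicit feature space, yet the code only ever evaluates the kernel matrix.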
- The Support Vector Machine (SVM)
  This topic develops the Support Vector Machine (SVM) for classification, regression, and novelty detection.
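The lectures cover the full SVM optimization; as a simplified illustration of the soft-margin objective only, here is a linear SVM trained by subgradient descent on the hinge loss in Python/NumPy (the toy data and every hyperparameter below are invented; real solvers such as SMO are more sophisticated):

```python
import numpy as np

def svm_sgd(X, y, lam=0.01, epochs=200, lr=0.1):
    """Linear soft-margin SVM: minimize lam/2 * ||w||^2 + mean hinge loss.
    Labels y must be in {-1, +1}. Returns weights w and bias b."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                                   # margin violators
        grad_w = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Two well-separated Gaussian clusters in 2-D.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = svm_sgd(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(acc)  # expect (near-)perfect accuracy on this separable toy set
```

Only the margin violators contribute to the gradient, which is the subgradient analogue of the fact that the trained SVM depends only on its support vectors.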
- Kernels: properties & design
  This topic defines kernel functions, their properties, and their construction. It introduces specific kernels for different data types, such as real vectors, categorical information, feature subsets, strings, probability distributions, and graphs.
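A defining property treated in this topic is that a valid (Mercer) kernel produces symmetric positive semi-definite Gram matrices, and that kernels are closed under sums and products. A small numerical check in Python/NumPy (the dataset is invented for illustration):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(15, 3))
K = rbf_kernel(X)

# A valid kernel yields a symmetric positive semi-definite Gram matrix:
eigvals = np.linalg.eigvalsh(K)
print(np.allclose(K, K.T), eigvals.min() > -1e-10)  # True True

# Closure: sums and (elementwise) products of kernels are again kernels,
# so K + K*K must also be positive semi-definite (Schur product theorem).
K2 = K + K * K
print(np.linalg.eigvalsh(K2).min() > -1e-10)  # True
```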
- Kernelizing ML algorithms
  This topic reviews different techniques for kernelizing existing algorithms.
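The common pattern is to rewrite an algorithm so the data enter only through inner products, then substitute kernel evaluations. Distance-based methods kernelize via ||phi(x) - phi(z)||^2 = k(x,x) - 2 k(x,z) + k(z,z). A sketch of a kernelized nearest-centroid classifier in Python/NumPy (an illustrative example chosen by the editor, not an algorithm prescribed by the course; all data are invented):

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def kernel_nearest_centroid(X_train, y_train, X_new, gamma=0.5):
    """Classify by distance to the class means in feature space, using only
    kernel values: ||phi(x) - m_c||^2 = k(x,x) - 2*mean_i k(x,xi) + mean_ij k(xi,xj)."""
    classes = np.unique(y_train)
    dists = []
    for c in classes:
        Xc = X_train[y_train == c]
        cross = rbf(X_new, Xc, gamma).mean(axis=1)   # mean_i k(x, xi)
        within = rbf(Xc, Xc, gamma).mean()           # mean_ij k(xi, xj)
        k_self = np.ones(len(X_new))                 # k(x, x) = 1 for the RBF kernel
        dists.append(k_self - 2 * cross + within)
    return classes[np.argmin(np.stack(dists), axis=0)]

# Two well-separated toy clusters.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1.5, 0.4, (25, 2)), rng.normal(1.5, 0.4, (25, 2))])
y = np.array([0] * 25 + [1] * 25)
pred = kernel_nearest_centroid(X, y, X)
print(np.mean(pred == y))  # expect near-perfect accuracy on this toy problem
```

The class mean in feature space is never computed explicitly; everything is expressed through averages of kernel values.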
- Theoretical underpinnings
  This topic reviews the basic theoretical underpinnings of kernel-based methods, focusing on statistical learning theory.
- Introduction to unsupervised learning
  Unsupervised versus supervised learning. Main problems in unsupervised learning: density estimation, dimensionality reduction, latent variables, and clustering.
- Nonlinear dimensionality reduction
  a. Principal curves.
  b. Local Multidimensional Scaling.
  c. ISOMAP.
  d. t-Stochastic Neighbor Embedding.
  e. Applications: (i) visualization of high- or infinite-dimensional data; (ii) exploratory analysis of functional data in Demography.
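ISOMAP (item c above) ends with classical Multidimensional Scaling applied to graph geodesic distances. That core MDS step, double-centering a squared-distance matrix and taking the top eigenvectors, can be sketched in Python/NumPy as follows (the toy collinear data are invented for illustration):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: recover a k-dimensional embedding from a pairwise
    distance matrix D. This is also the final step of ISOMAP, where D
    holds geodesic (graph shortest-path) distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J          # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]     # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Points on a straight line in 3-D: a 1-D embedding should preserve distances.
t = np.linspace(0, 1, 10)
X = np.stack([t, 2 * t, 3 * t], axis=1)
D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
Y = classical_mds(D, k=1)
D_embed = np.abs(Y - Y.T)                # pairwise distances in the embedding
print(np.allclose(D, D_embed))           # True: distances reproduced exactly
```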
- Dimensionality reduction with sparsity
  a. Matrix decompositions, approximations, and completion.
  b. Sparse Principal Components and Canonical Correlation.
  c. Applications: (i) recommender systems; (ii) estimating causal effects.
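One standard route to a sparse first principal component, in the spirit of the Hastie-Tibshirani-Wainwright reference below, is to alternate a power-method step with soft-thresholding of the loading vector. A minimal Python/NumPy sketch (the data, threshold, and iteration count are invented for illustration):

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_pc1(X, t=0.2, iters=200):
    """First sparse principal loading via thresholded power iteration
    on the sample covariance of X (X is assumed column-centered)."""
    S = X.T @ X / len(X)
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(iters):
        v = soft_threshold(S @ v, t)     # power step + shrinkage
        nrm = np.linalg.norm(v)
        if nrm == 0:
            break
        v /= nrm
    return v

# Variance concentrated on the first two coordinates; the rest is noise.
rng = np.random.default_rng(3)
z = rng.normal(size=(200, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(200, 1)),
               z + 0.1 * rng.normal(size=(200, 1)),
               0.1 * rng.normal(size=(200, 4))])
X = X - X.mean(axis=0)
v = sparse_pc1(X)
print(np.all(np.abs(v[2:]) < 1e-6))  # True: noise loadings are exactly zero
```

Unlike ordinary PCA, whose loadings are generically all nonzero, the shrinkage step drives the noise coordinates to exactly zero, which is what makes the component interpretable.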
- Prediction after dimensionality reduction
  a. Reduced-rank regression and canonical correlation.
  b. Principal Component regression.
  c. Distance-based regression.
Activities
Activity (Evaluation act) | Theory | Problems | Laboratory | Guided learning | Autonomous learning
| 4h | 0h | 0h | 0h | 6h
| 3h | 0h | 0h | 0h | 6h
| 4h | 0h | 0h | 0h | 6h
| 4h | 0h | 0h | 0h | 6h
Introduction to unsupervised learning | 3h | 0h | 0h | 0h | 6h
Nonlinear dimensionality reduction 1 | 4h | 0h | 0h | 0h | 6h
Nonlinear dimensionality reduction 2 | 3h | 0h | 0h | 0h | 6h
Dimensionality reduction with sparsity 1 | 4h | 0h | 0h | 0h | 6h
Dimensionality reduction with sparsity 2 | 3h | 0h | 0h | 0h | 6h
Prediction after dimensionality reduction 1 | 4h | 0h | 0h | 0h | 6h
Prediction after dimensionality reduction 2 | 2h | 0h | 0h | 0.2h | 6h
Teaching methodology
Learning is achieved through a combination of theoretical explanations and their application to practical exercises and real cases. The lectures will develop the necessary scientific knowledge, including its application to problem solving. These problems constitute the students' practical work on the subject, carried out as autonomous learning. The software used will be primarily R.

Evaluation methodology
The course evaluation will be based on the marks obtained in the practical works delivered during the semester, plus the mark obtained in the written test for global evaluation. Each practical work will lead to a written report, which will be evaluated by the teachers, resulting in a mark denoted P.
The exam will take place at the end of the semester and will evaluate the assimilation of the basic concepts of the whole subject, resulting in a mark denoted T.
The final mark will be obtained as:
60% x P + 40% x T
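A quick worked example of the weighting, with hypothetical marks (the values P = 8.0 and T = 6.5 are invented, not course data):

```python
# Final mark = 60% practical work (P) + 40% written test (T).
P = 8.0   # hypothetical practical-work mark (out of 10)
T = 6.5   # hypothetical exam mark (out of 10)
final = round(0.60 * P + 0.40 * T, 2)
print(final)  # 7.4
```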
Bibliography
Basic
- Shawe-Taylor, J.; Cristianini, N. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004. ISBN: 0521813972.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002747459706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Schölkopf, B.; Smola, A.J. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. The MIT Press, 2002. ISBN: 0262194759.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002368479706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Deng, L.; Yu, D. Deep Learning: Methods and Applications (pp. 197-387). Now Publishers Inc., 2014. ISBN: 9781601988140.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004151719706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning (Vol. 1). The MIT Press, 2016. ISBN: 9780262035613.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004107709706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, 2009. ISBN: 9780387952840.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003549679706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Hastie, T.; Tibshirani, R.; Wainwright, M. Statistical Learning with Sparsity: The Lasso and Generalizations. CRC Press, 2015. ISBN: 9781498712170.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004212899706711&context=L&vid=34CSUC_UPC:VU1&lang=ca

Complementary
- Cherkassky, V.S.; Mulier, F. Learning from Data: Concepts, Theory, and Methods. John Wiley, 2007. ISBN: 0471681822.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003624509706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Web links
- Official website of the R programming language http://cran.r-project.org/
- Repository of educational videos (in English) on the topics of the course. http://videolectures.net/Top/Computer_Science/Machine_Learning/