Credits
6
Types
Specialization compulsory (Computer Networks and Distributed Systems)
Requirements
This subject has no prerequisites, but it builds on previously acquired skills.
Department
AC
Teachers
Person in charge
- Jorge García Vidal ( jorge@ac.upc.edu )
Others
- Jose Maria Barceló Ordinas ( jose.maria.barcelo@upc.edu )
Weekly hours
Theory
3.4
Problems
0.6
Laboratory
0
Guided learning
0
Autonomous learning
7.1
Competences
Computer networks and distributed systems
Generic
Appropriate attitude towards work
Basic
Objectives
Contents
- Review of probability models
Probability axioms, basic combinatorics, random variables, independence and conditional probability, Bayes formula, total probability. Expected values, variance-covariance matrix, inequalities (Markov, Chebyshev, Chernoff, Jensen), (weak) law of large numbers, central limit theorem. Properties of multivariate Gaussian distributions. Entropy, mutual information, KL-divergence, and cross entropy.
- Estimation. Basic Machine Learning techniques for classification and regression
Maximum likelihood and Bayesian estimation. Decision functions, loss, risk, empirical risk minimization. Approximation and estimation errors. Bias-variance tradeoff. Classification. Linear regression, logistic regression. k-means clustering. Random forests. Introduction to neural networks: FFNN, backpropagation, SGD, CNN, RNN, autoencoders.
- Linear models and dimensionality reduction
Review of basic linear algebra results, spectral theorem for symmetric matrices, positive definite matrices, matrix calculus, trace operator, SVD, matrix norms, Eckart-Young theorem, condition number, dimensionality reduction, PCA, eigenfaces, sparse representation, Fourier matrices.
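The PCA-via-SVD connection listed in the contents can be sketched in a few lines of NumPy. This is an illustrative example only (the synthetic dataset and variable names are invented, not course material): center the data matrix, take its SVD, and read the principal directions off the right singular vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, stretched strongly along the first axis.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# PCA: center the data, then take the SVD of the centered matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Principal components are the rows of Vt; the explained-variance
# ratio comes from the squared singular values.
explained = S**2 / np.sum(S**2)
Z = Xc @ Vt[:1].T   # projection onto the first principal component

print(explained)    # first component carries most of the variance
```

Since the data were generated with one dominant direction, the first component should explain nearly all the variance.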
Activities
Review of probability models
Probability axioms, basic combinatorics, random variables, independence and conditional probability, Bayes formula, total probability. Expected values, variance-covariance matrix, inequalities (Markov, Chebyshev, Chernoff, Jensen), (weak) law of large numbers, entropy and mutual information. Cross entropy as a cost function. Properties of Gaussian distributions, central limit theorem.
Objectives: 1
Contents:
Theory
8h
Problems
2h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Linear models and dimensionality reduction
Review of basic linear algebra results, spectral theorem for symmetric matrices, positive definite matrices, matrix calculus, trace operator, SVD, matrix norms, Eckart-Young theorem, condition number, dimensionality reduction, PCA, eigenfaces, sparse representation, Fourier matrices, graph signal processing.
Objectives: 1
Contents:
Theory
8h
Problems
2h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
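The Eckart-Young theorem mentioned in this activity states that the truncated SVD gives the best rank-k approximation of a matrix, with spectral-norm error equal to the (k+1)-th singular value. A minimal numerical check, on an arbitrary random matrix chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(8, 6))

U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
# Best rank-k approximation: keep the k largest singular triples.
A_k = (U[:, :k] * S[:k]) @ Vt[:k]

# Eckart-Young: the spectral-norm error equals sigma_{k+1}.
err = np.linalg.norm(A - A_k, 2)
print(err, S[k])   # these two values agree
```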
Estimation. Basic Machine Learning techniques for regression and classification
Maximum likelihood and Bayesian estimation. Decision functions, loss, risk, empirical risk minimization. Approximation and estimation errors. Bias-variance tradeoff. Classification. Linear regression. Basic neural network architectures.
Theory
26.9h
Problems
4.1h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
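As a sketch of the empirical-risk-minimization view used in this activity, here is logistic regression fitted by plain gradient descent on the average cross-entropy loss. The synthetic data, weights, and hyperparameters are invented for the example, not taken from the course:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic binary-classification data with a bias column.
n = 400
X = np.c_[np.ones(n), rng.normal(size=(n, 2))]
w_true = np.array([-0.5, 2.0, -1.0])
y = (X @ w_true + rng.normal(scale=0.5, size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the empirical risk (average cross-entropy).
w = np.zeros(3)
for _ in range(2000):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n   # gradient of the average loss
    w -= 0.5 * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1))
print(w, acc)   # decision boundary roughly recovered
```

Because the labels are generated from a noisy linear rule, the fitted model should classify most of the training points correctly.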
Homework 1
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
19.9h
Homework 2
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
20h
Homework 3
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
20h
Teaching methodology
Some materials will be posted online. The main results will be explained on the blackboard. Classes combine lectures with problem solving and application examples.
Evaluation methodology
The assessment is based on three projects (P1, P2, P3) and two controls (C1, C2), one in the middle and one at the end of the course. Each control and each project is marked from 0 to 10. The final mark for the course (FM) is calculated as:
FM = 0.2*(C1+C2+P1+P2+P3)
In this course the proposed projects will be:
* P1: Basic Probability, Information Theory and Linear Algebra.
* P2: Basic estimation methods and Machine Learning.
* P3: Neural Networks.
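A worked example of the grading formula above, with hypothetical marks chosen only for illustration:

```python
# Hypothetical marks, each between 0 and 10.
C1, C2 = 7.0, 6.0           # mid-course and end-of-course controls
P1, P2, P3 = 8.0, 9.0, 7.5  # the three projects

# Each of the five components weighs 20% of the final mark.
FM = 0.2 * (C1 + C2 + P1 + P2 + P3)
print(FM)   # → 7.5
```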
Bibliography
Basic
- MacKay, D.J.C. Information theory, inference, and learning algorithms. Cambridge University Press, 2003. ISBN: 0521642981.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002876809706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Feller, W. An introduction to probability theory and its applications. John Wiley and Sons, 1968. ISBN: 0471257117.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991000036749706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Bishop, C.M. Pattern recognition and machine learning. Springer, 2006. ISBN: 0387310738.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003157379706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Strang, G. Linear Algebra and learning from data. Wellesley-Cambridge, 2020. ISBN: 9780692196380.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004193269706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- James, G. [et al.]. An Introduction to Statistical Learning. Springer, 2021. ISBN: 9781071614174.
  https://ebookcentral-proquest-com.recursos.biblioteca.upc.edu/lib/upcatalunya-ebooks/detail.action?pq-origsite=primo&docID=6686746