Credits
6
Types
Compulsory
Requirements
This subject has no formal prerequisites, but it does assume certain prior skills.
Department
CS
This course introduces the fundamental concepts of neural networks and deep learning, including basic architectures, learning algorithms, and practical applications, providing students with the foundations needed to understand and apply these methods.
Teachers
Person in charge
- Luis Antonio Belanche Muñoz ( belanche@cs.upc.edu )
Others
- Joan Llop Palao ( joan.llop@upc.edu )
Weekly hours
Theory
2
Problems
0
Laboratory
2
Guided learning
0
Autonomous learning
6
Competences
Transversals
Basic
Specific
Generic
Objectives
- To know how to identify a data analysis problem and solve it from start to finish (end to end)
  Related competences: CG4, CG8, CG9, CT5, CE13, CE15
- To know the theoretical foundations of neural networks as models of machine learning
  Related competences: CE26, CG4, CE01, CE12, CE13, CE18, CE20
- To know and understand the fields of application of neural networks and to develop solutions to specific problems
  Related competences: CG9, CE12, CE15, CE18
- To know how to design solutions for problems related to language, image or sound
  Related competences: CE26, CG4, CG8, CG9, CT5, CB3, CE13, CE15, CE18
Contents
- General concepts of machine learning
  Review of the general theoretical concepts of machine learning. Learning as an optimization problem. Bayesian interpretation of the learning problem. Generalized linear models.
- Foundations of artificial neural networks
  Basic biological concepts. The McCulloch-Pitts model. Cognitive and computational implications. Lippmann networks. Loss functions and activation functions.
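The McCulloch-Pitts model mentioned above is the simplest neural unit: it fires (outputs 1) when the weighted sum of its binary inputs reaches a threshold. A minimal sketch, with illustrative names not taken from the course materials:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Binary threshold unit: 1 if sum(w_i * x_i) >= threshold, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# With unit weights and threshold 2, the unit computes the logical AND of two inputs.
AND = lambda x1, x2: mcculloch_pitts([x1, x2], [1, 1], 2)
print([AND(0, 0), AND(0, 1), AND(1, 0), AND(1, 1)])  # [0, 0, 0, 1]
```

Lowering the threshold to 1 turns the same unit into a logical OR, which illustrates why such units can implement Boolean circuits but not, on their own, learn their parameters.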
- Feed-forward neural networks
  Linear networks (I): the Perceptron.
  Linear networks (II): the Delta rule.
  Multilayer Perceptrons and backpropagation.
  Gradient descent and its variants.
  Other optimizers: pseudo-Newton, conjugate gradient (CG), Rprop.
  Radial basis function networks.
  Autoencoders and VAEs.
  Support vector machines.
  Convolutional networks.
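The Perceptron listed above is the classic starting point for linear networks. A minimal sketch of its learning rule on linearly separable data (function and variable names are illustrative, not from the course materials):

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=20):
    """Learn weights w and bias b so that sign(X @ w + b) matches labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified point: apply the Perceptron update
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Logical OR, with labels encoded in {-1, +1}, is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # matches y
```

The update is only applied on misclassified points, which is what distinguishes the Perceptron rule from the Delta rule, where every point contributes a gradient step on a continuous error.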
- Advanced neural networks
  Hopfield networks.
  Graph neural networks.
Activities
Laboratory classes
Examples of the application of the concepts seen in theory classes. Explanations related to the chosen programming languages. Additional explanations relevant to the subject: practical skills, experimental methodology, etc.
Objectives: 1 4
Theory
0h
Problems
0h
Laboratory
28h
Guided learning
0h
Autonomous learning
25h
Partial Exam
Partial exam (in the middle of the semester) that covers all the material seen up to that point, or slightly earlier, at the teacher's discretion. The exam will take place in a laboratory classroom and may consist of theory, methodological or practical questions.
Objectives: 1 2 3
Week: 9 (Outside class hours)
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Final Exam
Final exam (during the final-exam period) that covers all the material seen in the subject. The exam will be held in a theory classroom and may consist of theory or methodological questions.
Objectives: 1 2 3 4
Week: 15 (Outside class hours)
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Teaching methodology
The course delves into one of the most important machine learning paradigms today: artificial neural networks, with a strong foundation in probability, statistics and mathematics. The theory is introduced in lectures where the teacher explains the concepts. These concepts are put into practice in laboratory classes, where the student learns to develop machine learning solutions to real problems of some complexity. Students must work on and hand in a project at the end of the course.
Evaluation methodology
The course is graded as follows:
P = mark of the partial exam (control)
F = mark of the final exam
T = mark of the practical work
Exam mark = 0.6·F + 0.4·P if F ≤ P; otherwise F
Final mark = 40% T + 60% Exam mark
Reassessment: only those students who took the final exam (an NP does not qualify) and obtained an exam mark lower than 4 may sit the reassessment. The maximum exam mark obtainable in the reassessment is 7.
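The grading formula above can be written out directly; this is a sketch of that arithmetic (function names are illustrative):

```python
def exam_mark(F, P):
    """0.6*F + 0.4*P when F <= P, otherwise F: the partial exam can only help."""
    return 0.6 * F + 0.4 * P if F <= P else F

def final_mark(T, F, P):
    """Final mark = 40% practical work + 60% exam mark."""
    return 0.4 * T + 0.6 * exam_mark(F, P)

# Example: T = 8.0, F = 6.0, P = 7.0
# exam mark = 0.6*6.0 + 0.4*7.0 = 6.4, so final mark = 0.4*8.0 + 0.6*6.4 ≈ 7.04
print(final_mark(T=8.0, F=6.0, P=7.0))
```

Note that a final-exam mark higher than the partial simply replaces the weighted average, so the partial exam can never lower the exam mark.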
Bibliography
Basic
- Bishop, Christopher M. Pattern Recognition and Machine Learning. Springer, 2006. ISBN: 0387310738.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003157379706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Goodfellow, Ian; Bengio, Yoshua; Courville, Aaron. Deep Learning. MIT Press, 2016. ISBN: 9780262035613.
  https://www.deeplearningbook.org/
- Aggarwal, Charu C. Neural Networks and Deep Learning: A Textbook. Springer, 2023. ISBN: 9783031296420.
  https://ebookcentral-proquest-com.recursos.biblioteca.upc.edu/lib/upcatalunya-ebooks/detail.action?pq-origsite=primo&docID=30620507
- Haykin, Simon S. Neural Networks and Learning Machines. Prentice Hall, 2009. ISBN: 9780131471399.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003533949706711&context=L&vid=34CSUC_UPC:VU1&lang=ca