This subject aims to familiarize the student with the practical aspects of Deep Learning (DL) techniques. The theory classes will refresh the basic concepts of DL (CNNs, LSTMs, etc.), assuming some prior knowledge, introduce the most popular architectures, and present network configurations that have proved useful for specific problems. In the practical part, the student will carry out experiments using DL libraries, exploring the various components that have been proposed in the field. The lab sessions will be run on computational resources of the Barcelona Supercomputing Center (BSC), which include cutting-edge technologies.
Teachers
Others
Javier Béjar Alonso
Weekly hours
Theory
1
Problems
0
Laboratory
2
Guided learning
0.38
Autonomous learning
6
Competences
Technical Competences of each Specialization
Professional
CEP3 - Capacity for applying Artificial Intelligence techniques in technological and industrial environments to improve quality and productivity.
CEP4 - Capability to design, write and report about computer science projects in the specific area of Artificial Intelligence.
Transversal Competences
Reasoning
CT6 - Capability to evaluate and analyze situations, projects, proposals, reports and scientific-technical surveys in a reasoned and critical way. Capability to argue the reasons that explain or justify such situations, proposals, etc.
Analysis and synthesis
CT7 - Capability to analyze and solve complex technical problems.
Objectives
Understand the various techniques that can be integrated into a deep learning system, and know how to experiment with them coherently in a realistic production environment through the use of third-party libraries.
Related competences: CEP3, CT7
Be able to understand scientific articles from the area of deep learning, to extract the most relevant conclusions, and to derive possible applications or limitations.
Related competences: CEP4, CT6
Contents
Convolutional Neural Networks
We will review the main aspects of CNNs: how they work, why they work, and how they can be improved.
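As a minimal illustrative sketch of the kind of model covered here (not part of the official material), assuming the Keras library mentioned in the teaching methodology and a hypothetical 28x28 grayscale, 10-class input:

from tensorflow import keras  # assumes TensorFlow/Keras is installed
from tensorflow.keras import layers

# Minimal convolutional classifier; input shape and layer sizes are hypothetical.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # convolutions extract local features
    layers.MaxPooling2D(pool_size=2),                     # pooling reduces spatial resolution
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),               # per-class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])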
Recurrent Neural Networks
We will review the main aspects of RNNs: how they work, why they work, and how they can be improved.
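A comparable sketch for the recurrent case, again assuming Keras; the vocabulary size, sequence length and binary target are hypothetical placeholders:

from tensorflow import keras  # assumes TensorFlow/Keras is installed
from tensorflow.keras import layers

# Minimal recurrent classifier for integer-encoded sequences; all sizes are hypothetical.
vocab_size, seq_len = 10000, 100
model = keras.Sequential([
    layers.Input(shape=(seq_len,)),
    layers.Embedding(vocab_size, 64),       # map token ids to dense vectors
    layers.LSTM(64),                        # recurrent layer summarizes the whole sequence
    layers.Dense(1, activation="sigmoid"),  # hypothetical binary prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])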
Transfer Learning
We will review several ways in which neural network embeddings can be reused, along with their pros and cons.
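As one hedged example of reusing learned representations, assuming Keras and its publicly available MobileNetV2 ImageNet weights (the input size and 5-class head are hypothetical):

from tensorflow import keras  # assumes TensorFlow/Keras is installed
from tensorflow.keras import layers

# Reuse a pretrained convolutional base as a frozen feature extractor (illustrative only;
# the ImageNet weights are downloaded on first use).
base = keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                      input_shape=(160, 160, 3), pooling="avg")
base.trainable = False  # freeze the reused representation
model = keras.Sequential([
    base,
    layers.Dense(5, activation="softmax"),  # new head for a hypothetical 5-class task
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])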
HPC&DL
We will review basic concepts of High Performance Computing in the context of Deep Learning.
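A minimal sketch of one HPC-related idea (single-node data parallelism), assuming TensorFlow's tf.distribute API; the model and layer sizes are placeholders:

import tensorflow as tf  # assumes TensorFlow/Keras is installed
from tensorflow import keras
from tensorflow.keras import layers

# Single-node data parallelism: replicate the model on each visible GPU.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():  # variables created here are mirrored across replicas
    model = keras.Sequential([
        layers.Input(shape=(784,)),              # hypothetical flattened input
        layers.Dense(256, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
# model.fit(...) then splits each batch across the replicas and averages the gradients.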
Transformer Networks
Introduction to Transformer Networks
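A minimal sketch of the self-attention block at the core of a Transformer encoder, assuming the Keras MultiHeadAttention layer; positional encodings and masking are omitted, and all dimensions are hypothetical:

from tensorflow import keras  # assumes TensorFlow/Keras is installed
from tensorflow.keras import layers

# Core of one Transformer encoder block: self-attention + position-wise feed-forward.
seq_len, d_model = 32, 64
inputs = keras.Input(shape=(seq_len, d_model))
attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(inputs, inputs)  # self-attention
x = layers.LayerNormalization()(inputs + attn)   # residual connection + normalization
ff = layers.Dense(128, activation="relu")(x)
ff = layers.Dense(d_model)(ff)
outputs = layers.LayerNormalization()(x + ff)
model = keras.Model(inputs, outputs)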
Activities
Practical experimentation
Experimentation using deep learning libraries, and reporting of the relevant conclusions.
Objectives: 1
Week: 13
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
2.6h
Autonomous learning
15h
Theoretical comprehension
Read a relevant article in the field of deep learning, then describe and present its main contributions, as well as possible future lines of work or limitations.
Objectives: 2
Week: 13
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
2h
Autonomous learning
9h
Review of Multilayer Perceptron and Convolutional Neural Networks
Theory
3h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
3h
Lab on Multilayer Perceptron and Convolutional Neural Networks
Theory
0h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
9h
Review of Recurrent Neural Networks
Theory
3h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
3h
Lab on Recurrent Neural Networks
Theory
0h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
9h
Review of Neural Embedding Spaces
Theory
3h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
3h
Lab on Neural Embedding Spaces
Theory
0h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
9h
Review of HPC for Deep Learning
Theory
3h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
3h
Lab on HPC for Deep Learning
Theory
0h
Problems
0h
Laboratory
3h
Guided learning
0h
Autonomous learning
9h
Teaching methodology
This subject has a theoretical component and a practical one.
The theoretical component consists of face-to-face classes in which the teacher reviews Deep Learning concepts and presents applications and other recent trends in the field. At the end of the course, students will have to read and analyse articles from the Deep Learning literature to demonstrate the knowledge acquired.
The practical component consists of individual lab assignments in which students experiment with the various Deep Learning techniques. Through simple experiments, and using popular Deep Learning libraries (e.g., Keras, TensorFlow, Theano, Caffe), students will test the effects of the available techniques.
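As an illustration of what such an experiment might look like (a sketch only, not the actual assignment), assuming Keras and the MNIST dataset, and taking dropout as the component under test:

from tensorflow import keras  # assumes TensorFlow/Keras is installed
from tensorflow.keras import layers

# Hypothetical experiment: effect of dropout on validation accuracy.
(x_train, y_train), (x_val, y_val) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0
x_val = x_val.reshape(-1, 784) / 255.0

def build(dropout_rate):
    model = keras.Sequential([
        layers.Input(shape=(784,)),
        layers.Dense(256, activation="relu"),
        layers.Dropout(dropout_rate),            # the component under test
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

for rate in (0.0, 0.5):                          # compare two settings of the technique
    history = build(rate).fit(x_train, y_train, epochs=5,
                              validation_data=(x_val, y_val), verbose=0)
    print("dropout", rate, "val_accuracy", history.history["val_accuracy"][-1])

The report would then compare and interpret the validation accuracies obtained under each setting.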
Evaluation methodology
This subject will be evaluated taking into account the theoretical (25%) and practical (75%) aspects.
For the theoretical part, students must read a Deep Learning article (proposed or validated by the teacher) and give a presentation to the class detailing its main contributions. They will also have to provide a critical analysis of the article, detailing aspects that could be done differently, future work that could be derived from the paper, or limitations of its methodology.
For the practical part, students will have to write a summary of the experiments carried out in the lab assignments, detailing and interpreting the results obtained in each experiment.
Prior skills
Basic concepts of neural networks (SGD, back-propagation, loss functions) and of machine learning (classification, regression, evaluation methodologies) are required.
Students must be able to program autonomously in Python, work on a remote server through a terminal (ssh, bash), and interact with third-party libraries.