Credits
5
Types
- MAI: Elective
- MIRI: Elective
Requirements
This course has no formal prerequisites, but some prior skills are expected.
Department
CS; TSC
Web
https://www.cs.upc.edu/~padro/ahlt/ahlt.html
Teaching staff
Coordinator
- Lluis Padro Cirera ( padro@cs.upc.edu )
Weekly hours
Theory
1.5
Problem sessions
0.5
Laboratory
1
Guided learning
0
Autonomous learning
5.3333
Competencies
Generic
Academic
Professional
Teamwork
Reasoning
Analysis and synthesis
Basic
Objectives
- Learn to apply statistical methods for NLP in a practical application
  Related competencies: CB6, CB8, CT3, CEA10, CEA3, CEA5, CEA7
- Understand statistical and machine learning techniques applied to NLP
  Related competencies: CT6, CT7, CEA3, CB6, CEP3, CG3
- Develop the ability to solve technical problems related to statistical and algorithmic issues in NLP
  Related competencies: CB6, CB8, CB9, CT7, CEA10, CEA3, CEA5, CEA7, CG3
- Understand fundamental methods of Natural Language Processing from a computational perspective
  Related competencies: CB6, CT7, CEA5, CEP4
Contents
- Statistical Models for NLP
  Introduction to statistical modelling for language. Maximum Likelihood models and smoothing. Maximum entropy estimation. Log-linear models.
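As a flavour of the estimation techniques listed here, the following is a minimal sketch of maximum-likelihood bigram estimation with add-alpha (Laplace) smoothing; the corpus and function names are illustrative, not course material:

```python
from collections import Counter

def bigram_model(tokens, alpha=1.0):
    """Bigram probabilities P(w2 | w1) with add-alpha (Laplace) smoothing."""
    vocab_size = len(set(tokens))
    unigrams = Counter(tokens[:-1])             # counts of left contexts
    bigrams = Counter(zip(tokens, tokens[1:]))  # counts of adjacent pairs
    def prob(w1, w2):
        # Smoothing shifts numerator and denominator together, so mass
        # moves to unseen pairs while the distribution still sums to one.
        return (bigrams[(w1, w2)] + alpha) / (unigrams[w1] + alpha * vocab_size)
    return prob

p = bigram_model("the cat sat on the mat".split())
```

With alpha=0 this reduces to the plain maximum-likelihood estimate, which assigns zero probability to any unseen bigram.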
- Distances and Similarities
  Distances (and similarities) between linguistic units. Textual, semantic, and distributional distances. Semantic spaces (WordNet, Wikipedia, Freebase, DBpedia).
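One of the simplest textual distances in this family is string edit distance; a minimal sketch (function name illustrative):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance: minimum number of insertions, deletions and
    substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))  # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]
```

Semantic and distributional distances covered later in the course replace character operations with comparisons in a semantic space or vector space.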
- Sequence Prediction
  Prediction over word sequences: PoS tagging, NERC. Local classifiers, HMMs, global predictors, log-linear models.
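As a sketch of decoding in one of the sequence models listed above, here is a log-space Viterbi decoder for a toy HMM tagger; the tag set and probability tables are made up for illustration:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence of an HMM, computed in log space."""
    # best[s] = (log-probability, best path ending in state s)
    best = {s: (math.log(start_p[s] * emit_p[s][obs[0]]), [s]) for s in states}
    for o in obs[1:]:
        best = {s: max((best[ps][0] + math.log(trans_p[ps][s] * emit_p[s][o]),
                        best[ps][1] + [s]) for ps in states)
                for s in states}
    return max(best.values())[1]

# Toy tagger: Noun vs Verb, with invented probabilities.
tags = ["N", "V"]
start = {"N": 0.6, "V": 0.4}
trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit = {"N": {"dogs": 0.7, "bark": 0.3}, "V": {"dogs": 0.1, "bark": 0.9}}
```

Local classifiers would pick each tag independently; Viterbi is the simplest global predictor, scoring whole tag sequences at once.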
- Syntactic Parsing
  Parsing constituent trees: PCFGs, CKY vs. inside-outside.
  Parsing dependency trees: CRFs for parsing. Earley algorithm.
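The CKY algorithm mentioned above can be sketched as a recognizer for a grammar in Chomsky Normal Form; the tiny grammar and sentence below are illustrative:

```python
def cky_recognize(words, lexicon, rules):
    """CKY recognition: is `words` derivable from S under a CNF grammar?
    lexicon maps word -> set of preterminals; rules maps (B, C) -> set of A,
    one entry per binary rule A -> B C."""
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexicon.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):            # split point
                for B in chart[i][k]:
                    for C in chart[k][j]:
                        chart[i][j] |= rules.get((B, C), set())
    return "S" in chart[0][n]

lexicon = {"she": {"NP"}, "eats": {"V"}, "fish": {"NP"}}
rules = {("V", "NP"): {"VP"}, ("NP", "VP"): {"S"}}
```

The probabilistic version (PCFG parsing) keeps the best score per nonterminal in each cell instead of a bare set.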
- Word Embeddings
  Static word embeddings: Word2Vec, GloVe.
  Limitations and the need for contextual embeddings.
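Static embeddings are typically compared with cosine similarity; a minimal sketch with made-up 3-dimensional vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Tiny invented vectors: "cat" and "dog" point in similar directions.
cat, dog, car = [0.9, 0.1, 0.0], [0.8, 0.2, 0.1], [0.0, 0.1, 0.9]
```

A key limitation of static vectors, motivating the contextual embeddings seen later, is that one vector must serve every sense of a word.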
- Recurrent Neural Networks
  RNNs for language modeling and sequence labeling.
  Bottleneck problem. LSTMs.
  Vanishing gradient problem.
  LSTM-based word embeddings: ELMo.
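The recurrence at the heart of these models can be sketched as a single Elman RNN step in pure Python (weight shapes and names are illustrative; LSTMs add gating on top of this same pattern):

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One Elman RNN step: h_new = tanh(W_xh @ x + W_hh @ h + b).
    Vectors are lists; matrices are lists of rows."""
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]
    return [math.tanh(a + c + d)
            for a, c, d in zip(matvec(W_xh, x), matvec(W_hh, h), b)]

def rnn_run(xs, h0, W_xh, W_hh, b):
    """Unrolling over a sequence just applies the same step repeatedly."""
    h = h0
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b)
    return h
```

Because gradients flow through one tanh and one W_hh multiplication per time step, they shrink (or blow up) exponentially with sequence length, which is the vanishing gradient problem named above.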
- Convolutional Neural Networks
  CNNs for NLP. 1D kernels vs. 2D kernels.
  Stride, padding.
  Pooling.
  NLP tasks suitable for CNNs vs. RNNs.
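Stride, padding and pooling reduce to a few lines in the 1D case; a sketch over plain lists (names and inputs illustrative):

```python
def conv1d(xs, kernel, stride=1, padding=0):
    """1D convolution (cross-correlation, as in most DL frameworks)
    with zero padding and a configurable stride."""
    xs = [0.0] * padding + list(xs) + [0.0] * padding
    k = len(kernel)
    return [sum(w * xs[i + j] for j, w in enumerate(kernel))
            for i in range(0, len(xs) - k + 1, stride)]

def max_pool(xs, size):
    """Non-overlapping max pooling over windows of `size`."""
    return [max(xs[i:i + size]) for i in range(0, len(xs), size)]
```

In NLP the input positions are word (or character) embeddings rather than scalars, but the sliding-window mechanics are the same.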
- Transformers
  Vanishing gradient problem in RNNs/LSTMs.
  Attention.
  Transformer architecture.
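The attention operation above can be sketched in pure Python as scaled dot-product attention, softmax(QK^T / sqrt(d)) V, with toy matrices and no batching, masking, or multiple heads:

```python
import math

def attention(Q, K, V):
    """Scaled dot-product attention over lists of vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        m = max(scores)                       # subtract max for stability
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]   # softmax over key positions
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Unlike an RNN, every query attends to every position in one step, which is why gradients do not have to traverse the whole sequence.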
- Large Language Models
  Large Language Models: origin and evolution.
  Reinforcement Learning from Human Feedback.
  Foundational vs. instructed LLMs.
  Use of LLMs in NLP applications: zero-shot, few-shot, fine-tuning.
  Optimization and efficiency issues.
- Ethics: Limitations and Risks of LLMs
  Biases.
  Hallucinations.
  Security.
  Environmental costs.
  Social costs.
Activities
Activity / Assessment event
Distances and Similarities
Distances (and similarities) between linguistic units. Textual, semantic, and distributional distances. Semantic spaces (WordNet, Wikipedia, Freebase, DBpedia). Latent Semantic Analysis. Word embeddings.
Objectives: 4, 2
Contents:
Theory
2h
Problem sessions
0.5h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Sequence Prediction
These lectures will present sequence labeling models, an important set of tools used for sequential tasks. We will present them in the framework of structured prediction (later in the course we will see that the same framework is used for parsing and translation). We will focus on machine learning as well as algorithmic aspects, with special emphasis on Conditional Random Fields.
Objectives: 4, 2
Contents:
Theory
2h
Problem sessions
1h
Laboratory
4h
Guided learning
0h
Autonomous learning
8h
Syntax and Parsing
We will present statistical models for syntactic structure and, more generally, tree structures. The focus will be on probabilistic context-free grammars and dependency grammars, two standard formalisms. We will cover the relevant algorithms, as well as methods to learn grammars from data within the structured prediction framework.
Objectives: 4, 2
Contents:
Theory
3h
Problem sessions
1h
Laboratory
4h
Guided learning
0h
Autonomous learning
8h
Convolutional Neural Networks
CNNs for NLP. 1D vs. 2D kernels; stride, padding, pooling.
Objectives: 2
Contents:
Theory
2h
Problem sessions
1h
Laboratory
3h
Guided learning
0h
Autonomous learning
6h
Theory
2h
Problem sessions
1h
Laboratory
0h
Guided learning
0h
Autonomous learning
3h
Final exam
Week: 15 (outside class hours)
Theory
0h
Problem sessions
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Teaching methodology
The course is structured around four linguistic analysis levels: word level, phrase level, sentence level, and document level. Typical NLP tasks and solutions for each level are presented. The first half of the course is devoted to "classical" statistical and ML approaches; the second half revisits the same levels from a deep learning perspective.
Theoretical background and practical exercises will be developed in class.
Finally, students will develop a practical project in teams of two. The goal of the project is to put into practice the methods learned in class and to learn the experimental methodology used in the NLP field. Students must identify existing components (i.e., data and tools) that can be used to build a system, and run experiments in order to perform an empirical analysis of some statistical NLP method.
Assessment method
Final grade = 0.5 * FE + 0.5 * LP, where
FE is the grade of the final exam
LP is the grade of the lab project
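The weighting reads off directly as code; a one-line sketch (the 0-10 grading scale is an assumption, not stated above):

```python
def final_grade(fe: float, lp: float) -> float:
    """Final grade = 0.5 * FE + 0.5 * LP (grades assumed on a 0-10 scale)."""
    return 0.5 * fe + 0.5 * lp
```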
Bibliography
Basic
- Handbook of natural language processing. Dale, R.; Moisl, H.; Somers, H. (eds.). Marcel Dekker, 2000. ISBN: 0824790006.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002071619706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Handbook of natural language processing. Indurkhya, N.; Damerau, F.J. (eds.). Chapman and Hall/CRC, 2010. ISBN: 9781420085938.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001234699706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Speech and language processing: an introduction to natural language processing, computational linguistics, and speech recognition. Jurafsky, D.; Martin, J.H. Prentice Hall, 2008. ISBN: 9332518416.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003460299706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- The Oxford handbook of computational linguistics. Mitkov, R. (ed.). Oxford University Press, 2003. ISBN: 0198238827.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002689009706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Foundations of statistical natural language processing. Manning, C.D.; Schütze, H. MIT Press, 1999. ISBN: 0262133601.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001994779706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Linguistic structure prediction. Smith, N.A. Morgan & Claypool, 2011. ISBN: 9781608454051.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004001819706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Natural language processing with deep learning. Manning, C.; See, A. Stanford University.
  https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/
- Natural language processing. Collins, M. Columbia University.
  http://www.cs.columbia.edu/~cs4705/
- Natural language processing. Titov, I. Universiteit van Amsterdam.
  http://ivan-titov.org/teaching/nlp1-15/
- Syntactic analysis in language technology: syntactic parsing. Stymne, S.; Lhoneux, Miryam de. Uppsala Universitet, 2017.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001688389706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- The handbook of computational linguistics and natural language processing. Clark, A.; Fox, C.; Lappin, S. (eds.). Wiley-Blackwell, 2010. ISBN: 9781444324044.
  https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001686059706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Web links
- The course website includes lecture slides and links to relevant bibliography and resources. http://www.lsi.upc.edu/~ageno/anlp
Prior skills
- Although not mandatory, familiarity with basic concepts and methods of Natural Language Processing is strongly recommended.
- Good understanding of basic concepts and methods of Machine Learning.
- Advanced programming skills.