Credits
5
Types
- MAI: Elective
- MIRI: Elective
Requirements
This subject has no requirements, but it does assume some previous capacities.
Department
CS;TSC
Web
https://www.cs.upc.edu/~padro/ahlt/ahlt.html
Teachers
Person in charge
- Lluis Padro Cirera ( padro@cs.upc.edu )
Weekly hours
Theory
1.5
Problems
0.5
Laboratory
1
Guided learning
0
Autonomous learning
5.3333
Competences
Generic
Academic
Professional
Teamwork
Reasoning
Analysis and synthesis
Basic
Objectives
- Learn to apply statistical methods for NLP in a practical application
Related competences: CB6, CB8, CT3, CEA10, CEA3, CEA5, CEA7
- Understand statistical and machine learning techniques applied to NLP
Related competences: CT6, CT7, CEA3, CEP3, CG3, CB6
- Develop the ability to solve technical problems related to statistical and algorithmic problems in NLP
Related competences: CB6, CB8, CB9, CT7, CEA10, CEA3, CEA5, CEA7, CG3
- Understand fundamental methods of Natural Language Processing from a computational perspective
Related competences: CB6, CT7, CEA5, CEP4
Contents
- Statistical Models for NLP
Introduction to statistical modelling for language. Maximum likelihood models and smoothing. Maximum entropy estimation. Log-linear models.
- Distances and Similarities
Distances (and similarities) between linguistic units. Textual, semantic, and distributional distances. Semantic spaces (WordNet, Wikipedia, Freebase, DBpedia).
- Sequence Prediction
Prediction in word sequences: PoS tagging, NERC. Local classifiers, HMMs, global predictors, log-linear models.
- Syntactic Parsing
Parsing constituent trees: PCFGs, CKY vs. inside/outside. Parsing dependency trees: CRFs for parsing. Earley algorithm.
- Word Embeddings
Static word embeddings: Word2Vec, GloVe. Their limitations and the need for contextual embeddings.
- Recurrent Neural Networks
RNNs for language modeling and sequence labeling. Bottleneck problem. LSTMs. Vanishing gradient problem. LSTM-based word embeddings: ELMo.
- Convolutional Neural Networks
CNNs for NLP. 1D vs. 2D kernels. Stride, padding. Pooling. NLP tasks suitable for CNNs vs. RNNs.
- Transformers
Vanishing gradient problem in RNNs/LSTMs. Attention. Transformer architecture.
- Large Language Models
Large Language Models: origin and evolution. Reinforcement Learning from Human Feedback. Foundational vs. instructed LLMs. Use of LLMs in NLP applications: zero-shot, few-shot, fine-tuning. Optimization and efficiency issues.
- Ethics: Limitations and Risks of LLMs
Biases. Hallucinations. Security. Environmental costs. Social costs.
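The first content unit covers maximum-likelihood estimation and smoothing for language models. As an illustrative sketch only (the toy corpus and function names are invented for this example, not course material), a bigram model with add-one smoothing might look like:

```python
from collections import Counter

# Toy corpus, invented for illustration
corpus = "the cat sat on the mat the cat ate".split()

bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])   # counts of words in "preceding" position
vocab = set(corpus)               # simplification: V = observed vocabulary

def p_mle(w2, w1):
    # Maximum-likelihood estimate: count(w1 w2) / count(w1)
    return bigrams[(w1, w2)] / unigrams[w1]

def p_laplace(w2, w1):
    # Add-one (Laplace) smoothing: unseen bigrams get nonzero mass
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + len(vocab))

print(p_mle("cat", "the"))      # "the" is followed by "cat" in 2 of 3 cases
print(p_laplace("dog", "the"))  # unseen bigram, still > 0 after smoothing
```

The MLE assigns zero probability to any unseen bigram; smoothing trades a little probability mass from seen events to avoid that.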
Activities
Distances and Similarities
Distances (and similarities) between linguistic units. Textual, semantic, and distributional distances. Semantic spaces (WordNet, Wikipedia, Freebase, DBpedia). Latent Semantic Analysis. Word embeddings.
Objectives: 4, 2
Contents:
Theory
2h
Problems
0.5h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
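The activity above centers on similarity between linguistic units. As a minimal sketch (the sentences and function name are invented for illustration), cosine similarity over bag-of-words count vectors can be computed as:

```python
from collections import Counter
import math

def cosine(a, b):
    # Cosine similarity between two sparse count vectors (dict-like)
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

s1 = Counter("the cat sat on the mat".split())
s2 = Counter("the dog sat on the rug".split())
print(cosine(s1, s2))  # 0.75: high overlap despite different nouns
```

Distributional and semantic distances studied in the course replace these raw counts with embedding or knowledge-based representations, but the comparison step is often exactly this.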
Sequence Prediction
These lectures will present sequence labeling models, an important set of tools for sequential tasks. We will present them in the framework of structured prediction (later in the course we will see that the same framework is used for parsing and translation). We will focus on machine learning aspects as well as algorithmic aspects, with special emphasis on Conditional Random Fields.
Objectives: 4, 2
Contents:
Theory
2h
Problems
1h
Laboratory
4h
Guided learning
0h
Autonomous learning
8h
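One of the classical sequence models in this block is the HMM tagger, decoded with the Viterbi algorithm. A compact sketch, with an invented toy tag set and invented probabilities:

```python
import math

# Toy HMM for PoS tagging (all probabilities invented for illustration)
tags = ["DET", "NOUN", "VERB"]
trans = {("<s>", "DET"): 0.6, ("<s>", "NOUN"): 0.3, ("<s>", "VERB"): 0.1,
         ("DET", "NOUN"): 0.9, ("NOUN", "VERB"): 0.6, ("NOUN", "NOUN"): 0.3,
         ("VERB", "DET"): 0.5, ("VERB", "NOUN"): 0.4}
emit = {("DET", "the"): 0.9, ("NOUN", "dog"): 0.5,
        ("NOUN", "barks"): 0.1, ("VERB", "barks"): 0.6}

def viterbi(words):
    # delta[t] maps each tag to (best log-probability, backpointer)
    delta = [{t: (math.log(trans.get(("<s>", t), 1e-12))
                  + math.log(emit.get((t, words[0]), 1e-12)), None)
              for t in tags}]
    for w in words[1:]:
        delta.append({t: max(
            (delta[-1][p][0] + math.log(trans.get((p, t), 1e-12))
             + math.log(emit.get((t, w), 1e-12)), p)
            for p in tags) for t in tags})
    # Follow backpointers from the best final tag
    best = max(tags, key=lambda t: delta[-1][t][0])
    path = [best]
    for d in reversed(delta[1:]):
        path.append(d[path[-1]][1])
    return path[::-1]

print(viterbi(["the", "dog", "barks"]))  # ['DET', 'NOUN', 'VERB']
```

The dynamic program runs in O(n·|tags|²); the same chart shape reappears later in the course for CRF decoding.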
Syntax and Parsing
We will present statistical models for syntactic structure, and tree structures in general. The focus will be on probabilistic context-free grammars and dependency grammars, two standard formalisms. We will see relevant algorithms, as well as methods to learn grammars from data based on the structured prediction framework.
Objectives: 4, 2
Contents:
Theory
3h
Problems
1h
Laboratory
4h
Guided learning
0h
Autonomous learning
8h
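The central algorithm for the PCFG part of this activity is CKY. A minimal sketch over an invented toy grammar in Chomsky normal form, computing the probability of the best parse of a sentence:

```python
from collections import defaultdict

# Toy PCFG in Chomsky normal form (rules and probabilities invented)
lexical = {"dogs": {"NP": 0.4, "N": 0.3}, "bark": {"VP": 0.5, "V": 0.5}}
binary = {"S": [("NP", "VP", 1.0)], "VP": [("V", "NP", 0.3)]}

def cky(words):
    n = len(words)
    # chart[i][j]: best probability of each nonterminal spanning words[i:j]
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):
        for nt, p in lexical.get(w, {}).items():
            chart[i][i + 1][nt] = p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):            # split point
                for lhs, rules in binary.items():
                    for left, right, p in rules:
                        q = p * chart[i][k][left] * chart[k][j][right]
                        if q > chart[i][j][lhs]:
                            chart[i][j][lhs] = q
    return chart[0][n]["S"]

print(cky(["dogs", "bark"]))  # 1.0 * 0.4 * 0.5 = 0.2
```

Replacing the max with a sum gives the inside probability mentioned in the contents; adding backpointers recovers the best tree.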
Convolutional Neural Networks
CNNs for NLP. 1D vs. 2D kernels. Stride, padding, pooling.
Objectives: 2
Contents:
Theory
2h
Problems
1h
Laboratory
3h
Guided learning
0h
Autonomous learning
6h
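The kernel, stride, and padding notions listed for this activity can be shown with a bare-bones 1D convolution over a scalar sequence (a toy stand-in for an embedding sequence; all numbers invented):

```python
def conv1d(xs, kernel, stride=1, padding=0):
    # 1D convolution: slide the kernel over the (optionally padded) sequence
    xs = [0.0] * padding + list(xs) + [0.0] * padding
    k = len(kernel)
    return [sum(kernel[j] * xs[i + j] for j in range(k))
            for i in range(0, len(xs) - k + 1, stride)]

seq = [1.0, 2.0, 3.0, 4.0, 5.0]
print(conv1d(seq, [1.0, -1.0]))                # [-1.0, -1.0, -1.0, -1.0]
print(conv1d(seq, [1.0, -1.0], stride=2))      # stride 2 halves the output
print(conv1d(seq, [1.0, 1.0, 1.0], padding=1)) # "same" padding keeps length 5
```

In an NLP CNN the kernel is a matrix over embedding dimensions and a max-pooling step typically follows, but the sliding-window arithmetic is the same.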
Theory
2h
Problems
1h
Laboratory
0h
Guided learning
0h
Autonomous learning
3h
Final exam
Week: 15 (Outside class hours)
Theory
0h
Problems
0h
Laboratory
0h
Guided learning
0h
Autonomous learning
0h
Teaching methodology
The course is structured around four linguistic analysis levels: word level, phrase level, sentence level, and document level. Typical NLP tasks and solutions for each level will be presented. The first half of the course is devoted to "classical" statistical and ML approaches; the second half revisits the same levels from a deep learning perspective.
Theoretical background and practical exercises will be developed in class.
Finally, students will develop a practical project in teams of two. The goal of the project is to put into practice the methods learned in class and to learn the experimental methodology used in the NLP field. Students must identify existing components (i.e., data and tools) that can be used to build a system, and run experiments to carry out an empirical analysis of some statistical NLP method.
Evaluation methodology
Final grade = 0.5*FE + 0.5*LP
where
FE is the grade of the final exam
LP is the grade of the lab project
Bibliography
Basic
-
Handbook of natural language processing
- Dale, R.; Moisl, H.; Somers, H. (eds.),
Marcel Dekker,
2000.
ISBN: 0824790006
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002071619706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
Handbook of natural language processing
- Indurkhya, N.; Damerau, F.J. (eds.),
Chapman and Hall/CRC,
2010.
ISBN: 9781420085938
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001234699706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
Speech and language processing: an introduction to natural language processing, computational linguistics, and speech recognition
- Jurafsky, D.; Martin, J.H.,
Prentice Hall,
2008.
ISBN: 9332518416
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003460299706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
The Oxford handbook of computational linguistics
- Mitkov, R. (ed.),
Oxford University Press,
2003.
ISBN: 0198238827
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002689009706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
Foundations of statistical natural language processing
- Manning, C.D.; Schütze, H.,
MIT Press,
1999.
ISBN: 0262133601
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001994779706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
Linguistic structure prediction
- Smith, N.A.,
Morgan & Claypool,
2011.
ISBN: 9781608454051
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991004001819706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
Natural language processing with deep learning
- Manning, C.; See, A.,
Stanford University,
https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1194/ -
Natural language processing
- Collins, M.,
Columbia University,
http://www.cs.columbia.edu/~cs4705/ -
Natural language processing
- Titov, I.,
Universiteit van Amsterdam,
http://ivan-titov.org/teaching/nlp1-15/ -
Syntactic analysis in language technology: syntactic parsing
- Stymne, S.; Lhoneux, M. de,
Uppsala Universitet,
2017.
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001688389706711&context=L&vid=34CSUC_UPC:VU1&lang=ca -
The handbook of computational linguistics and natural language processing
- Clark, A.; Fox, C.; Lappin, S. (eds.),
Wiley-Blackwell,
2010.
ISBN: 9781444324044
https://discovery.upc.edu/discovery/fulldisplay?docid=alma991001686059706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
Web links
- The course website includes lecture slides and links to relevant bibliography and resources. http://www.lsi.upc.edu/~ageno/anlp
Previous capacities
- Although not mandatory, familiarity with basic concepts and methods of Natural Language Processing is strongly recommended.
- Good understanding of basic concepts and methods of Machine Learning.
- Advanced programming skills.