Credits
6
Types
Compulsory
Requirements
This subject has no formal prerequisites, but it does assume previously acquired competencies.
Department
MAT;TSC;FIS
The course has two main objectives: (1) to give students a rigorous introduction to the main results of information theory, including proofs of the two fundamental theorems on noiseless source coding and noisy channel coding; (2) to present several applications, including data compression and error-correcting codes, as well as others such as inference and cryptography.
Teachers
Person in charge
- Adrián Francisco Tauste Campo (adria.tauste@upc.edu)
Others
- Josep Vidal Manzano (josep.vidal@upc.edu)
Weekly hours
Theory
2
Problems
2
Laboratory
0
Guided learning
0
Autonomous learning
6
Competences
Technical competencies
Transversals
Basic
Generic
Objectives
Contents
- Discrete random variables and processes: probability, ensembles of random variables, stochastic processes, Markov processes
- Measures of information: information theory, entropy, joint entropy and mutual information, the data processing inequality, Fano's inequality, applications
- Information of data sources: codes, the asymptotic equipartition property, data compression, the high-probability set, non-independent sources
- Source coding: properties of codes, unique decodability, minimum average length, Huffman codes, dictionary codes
- Capacity of discrete channels: jointly typical sequences, the channel capacity theorem, separability of source and channel coding
- Channel codes: introduction to error-correcting codes, block codes
- Cryptography: Shannon's theory of secrecy systems, the main theorem, the one-time pad, symmetric cryptography in practice
- Estimation of information measures: estimation methods for entropy and mutual information from data
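As a taste of the last content topic, estimation of information measures from data, here is a minimal sketch of the plug-in (maximum-likelihood) estimator, which simply substitutes empirical frequencies into the definitions of entropy and mutual information. The function names are illustrative, not part of the course materials:

```python
from math import log2

def entropy(counts):
    """Plug-in entropy estimate in bits from a list of event counts."""
    total = sum(counts)
    # By convention, terms with zero probability contribute 0 to the sum.
    return -sum(c / total * log2(c / total) for c in counts if c > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint count table (list of rows)."""
    row_sums = [sum(row) for row in joint]          # marginal counts of X
    col_sums = [sum(col) for col in zip(*joint)]    # marginal counts of Y
    flat = [c for row in joint for c in row]        # joint counts
    return entropy(row_sums) + entropy(col_sums) - entropy(flat)

# A fair coin has H = 1 bit; two independent uniform bits have I = 0 bits.
print(entropy([50, 50]))                          # 1.0
print(mutual_information([[25, 25], [25, 25]]))   # 0.0
```

Note that the plug-in estimator is biased downward for small samples, which is one motivation for the more refined estimation methods covered in this topic.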
Activities
Theory   Problems   Laboratory   Guided learning   Autonomous learning
2h       2h         0h           0h                4h
5h       5h         0h           0h                8h
5h       5h         0h           0h                4h
6h       6h         0h           0h                8h
3h       4h         0h           0h                4h
6h       9h         0h           0h                4h
3h       0h         0h           0h                4h
2h       2h         0h           0h                0h
Teaching methodology
Lectures (50%) in which student participation is encouraged, followed by practical classes (50%) based on exercises and on programming algorithms, with the aim of connecting information theory to practical applications in data science engineering.
Evaluation methodology
There will be a two-hour mid-term test in the 8th week and a final exam. The grade is the maximum of the final exam grade and 0.6 * final exam grade + 0.4 * mid-term exam grade. The re-evaluation exam, open to students who failed the course but attended the lectures and the final exam, will be held in July and will count for 100% of the final grade.
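The grading rule above can be sketched as a small function (a hypothetical helper for illustration, not part of the course materials):

```python
def final_grade(final_exam, midterm):
    """Course grade: the better of the final exam alone and the
    weighted combination 0.6 * final + 0.4 * mid-term."""
    return max(final_exam, 0.6 * final_exam + 0.4 * midterm)

# The weighted mix (0.6*7 + 0.4*9 = 7.8) beats the final alone (7.0).
print(final_grade(7.0, 9.0))
# Here the final alone (8.0) beats the mix (0.6*8 + 0.4*5 = 6.8).
print(final_grade(8.0, 5.0))
```

In other words, the mid-term can only raise the grade, never lower it.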
Bibliography
Basic
- Cover, T.M.; Thomas, J.A. Elements of information theory. John Wiley & Sons, 2006. ISBN: 0471241954. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991003402919706711&context=L&vid=34CSUC_UPC:VU1&lang=ca
- Höst, S. Information and communication theory. Wiley IEEE Press, 2019. ISBN: 9781119433781. http://cataleg.upc.edu/record=99100491894860671~S1*cat
- MacKay, D.J.C. Information theory, inference, and learning algorithms. Cambridge University Press, 2003. ISBN: 9780521642989. https://discovery.upc.edu/discovery/fulldisplay?docid=alma991002876809706711&context=L&vid=34CSUC_UPC:VU1&lang=ca