School of Engineering in Physics, Applied Physics, Electronics & Materials Science
Introduction to Information Theory as founded by Claude Shannon in the context of communication.
Applications to achievable bounds for data compression (source coding) and for reliable
transmission over noisy channels (channel capacity).
1 - General tools of Information Theory: measures of information, entropy, mutual information, Kullback-Leibler divergence, information rate, introduction to
sources with memory and to Markov sources (key definitions are recalled after this outline)
2 - Source coding: properties, Kraft inequality, Shannon's first theorem, Shannon-Fano and Huffman coding techniques (a Huffman coding sketch follows the outline)
3 - Capacity and channel coding: capacity, redundancy, Shannon's second (fundamental) theorem, introduction to channel coding (see the capacity examples below)
4 - Information Theory for continuous random variables, channel capacity of the Additive White Gaussian Noise (AWGN) channel (see the closed-form capacity below)
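As a quick reference for the quantities listed in part 1, the standard definitions for discrete random variables X and Y with joint distribution p(x,y) (in bits) are:

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log_2 p(x \mid y),
\]
\[
I(X;Y) = H(X) - H(X \mid Y), \qquad
D(p \,\|\, q) = \sum_{x} p(x)\,\log_2 \frac{p(x)}{q(x)}.
\]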
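As an illustration of the Huffman technique from part 2, the following is a minimal sketch in Python; the symbol probabilities are made-up example values, not course data.

import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least probable nodes, prefixing their codewords.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Example (illustrative probabilities):
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

For this dyadic source the average codeword length, 1.75 bits, equals the entropy H(X), the lower bound given by Shannon's first theorem.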
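For part 3, the channel capacity is the maximum of the mutual information over input distributions; the binary symmetric channel (BSC) with crossover probability p gives a simple closed form:

\[
C = \max_{p(x)} I(X;Y), \qquad
C_{\mathrm{BSC}} = 1 - H_2(p), \qquad
H_2(p) = -p\log_2 p - (1-p)\log_2(1-p).
\]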
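For part 4, the AWGN channel capacity has the well-known closed form, per channel use with signal power P and noise variance N, and in the Shannon-Hartley form for a channel of bandwidth B:

\[
C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right)\ \text{bits/channel use},
\qquad
C = B\log_2\!\left(1 + \frac{S}{N}\right)\ \text{bits/s}.
\]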
Session 1:
If in-person courses are held: 2-hour supervised written examination
If distance learning (or a distance examination) is mandatory: homework report
Session 2:
If an in-person examination is possible: 2-hour supervised written examination
Otherwise: distance examination
T.M. Cover and J.A. Thomas, Elements of Information Theory, 2nd ed., Wiley & Sons, 2006.
G. Battail, Théorie de l'information : application aux techniques de communication, Collection pédagogique de télécommunication, Masson, 1997.
F. Auger, Introduction à la théorie du signal et de l'information : cours et exercices, Éditions Technip, 1999.
Date of update: March 18, 2019