Information Theory

Lecturer (assistant)
Duration: 5 SWS
Term: Winter semester 2020/21
Language of instruction: English
Position within curricula: See TUMonline
Dates: See TUMonline


Review of probability theory.
Information theory for discrete and continuous variables: entropy, informational divergence, mutual information, inequalities.
Coding of memoryless sources: rooted trees with probabilities, Kraft inequality, entropy bounds on source coding, Huffman codes, Tunstall codes.
Coding of stationary sources: entropy bounds, Elias code for the positive integers, Elias-Willems universal source coding, hidden finite-memory sources.
Channel coding: memoryless channels, block and bit error probability, random coding, converse, binary symmetric channel, binary erasure channel, symmetric channels, real and complex AWGN channels, parallel and vector AWGN channels, source and channel coding.
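Several of the listed topics fit together in a small worked example: the entropy of a memoryless source, a binary Huffman code for it, and the Kraft inequality and entropy bounds that the course covers. The sketch below (a hypothetical illustration, not course material; the distribution `p` is chosen for convenience) computes the entropy H(X), builds Huffman codeword lengths, and checks that the Kraft sum is at most 1 and that the average length satisfies H(X) <= L < H(X) + 1.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap items: (probability, unique tiebreaker, indices of merged symbols).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1               # merging adds one bit of depth
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Example dyadic source (assumed for illustration): probabilities are
# powers of 2, so the Huffman code meets the entropy bound with equality.
p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)
L = huffman_lengths(p)
avg = sum(pi * li for pi, li in zip(p, L))
kraft = sum(2 ** -li for li in L)

assert kraft <= 1.0 + 1e-12          # Kraft inequality
assert H - 1e-12 <= avg < H + 1      # entropy bounds on source coding
```

For this dyadic distribution the average codeword length equals the entropy (1.75 bits); for general distributions the Huffman code is optimal among prefix-free codes but only guaranteed to be within one bit of H(X).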