EE671

Title: INFORMATION THEORY


Credits: 3


Catalog Description: EE 671 Information Theory                    (3+0+0)3


Measuring information and randomness. Basic information-theoretic quantities such as entropy, relative entropy, and mutual information, together with the algebra relating them. Fundamental information-theoretic inequalities. AEP (asymptotic equipartition property). Lossless source coding and the entropy bound. Joint AEP and channel coding; achievability and converse proofs. Differential entropy. Gaussian channels. Rate-distortion theory and lossy source coding.
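For reference, the basic quantities above have the following standard definitions for discrete random variables X and Y with joint pmf p(x, y) (logarithms base 2, so all quantities are in bits):

H(X) = -\sum_x p(x) \log_2 p(x)

D(p \| q) = \sum_x p(x) \log_2 \frac{p(x)}{q(x)}

I(X; Y) = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)} = H(X) - H(X \mid Y)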

Prerequisites: EE 577 or consent of the instructor.


Coordinator: M. Kıvanç Mıhçak, Associate Professor of Electrical Engineering


Goals:
This course treats the fundamental concepts of information theory and shows how they arise naturally in questions concerning the limits of communication and signal processing systems. The goal is to enable students to analyze communication and signal processing systems of interest and to quantify their fundamental limits in the source and channel coding senses.

Learning Objectives:


At the end of this course, students will be able to:

 

  • Define the basic information-theoretic primitives mathematically and explain their practical meaning.
  • Analyze basic source and channel coding systems to characterize their fundamental limits, i.e., determine analytically how much a source can be compressed and how many bits per sample can be transmitted through a channel (the two limits are summarized after this list).
  • Prove fundamental limits of information coding and transmission systems.
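For concreteness, the two limits in the second objective take their standard forms, as developed in the course textbook: a discrete memoryless source X can be compressed at any rate R > H(X) bits per symbol, and at no rate below H(X); a discrete memoryless channel permits reliable transmission at any rate R < C bits per channel use, and at no rate above C, where

C = \max_{p(x)} I(X; Y).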


Textbook:
T. Cover and J. Thomas, “Elements of Information Theory”, Wiley.


Reference Texts:

 

  • R. G. Gallager, “Information Theory and Reliable Communication”, Wiley, 1968.
  • R. Ash, “Information Theory”, Wiley, 1965.
  • C. E. Shannon, “A Mathematical Theory of Communication”, The Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, July and October 1948. (Will be distributed via e-mail.)


Prerequisites by Topic:

 

  • Probability and random processes
  • Strong mathematical background and analytical understanding


Topics:

 

  • Entropy, relative entropy, mutual information, data processing inequality, Markov chains, fundamental mathematical primitives in information theory.
  • The asymptotic equipartition property (AEP) and the concept of typicality.
  • Entropy rates of stochastic processes.
  • Lossless data compression (Huffman, Lempel-Ziv, Arithmetic, Shannon-Fano codes): Kraft’s inequality, Shannon’s source coding theorem.
  • Channel capacity, jointly typical sequences, Fano’s inequality, Shannon’s channel coding theorem and its converse (a small numerical illustration follows this list).
  • Differential entropy.
  • Gaussian channels.
  • Rate-distortion theory: Shannon’s source coding theorem relative to a fidelity criterion.
  • Other topics in information theory (if time permits): Multiple Access Channels (MACs), Multiple-Input-Multiple-Output (MIMO) Channels, Broadcast Channels, Relay Channels, Basic Concepts in Network Information Theory. 
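As a minimal numerical sketch of a few of the quantities above (entropy, the Kraft inequality, BSC and Gaussian channel capacities); this is illustrative Python with function names of our own choosing, not part of any course material:

import math

def entropy(pmf):
    """Shannon entropy H(X) in bits of a discrete pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def kraft_sum(lengths):
    """Kraft sum of binary codeword lengths; a prefix code with these lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps: C = 1 - H_b(eps) bits per use."""
    return 1.0 - entropy([eps, 1.0 - eps])

def awgn_capacity(snr):
    """Capacity of a discrete-time Gaussian channel: C = (1/2) log2(1 + SNR) bits per use."""
    return 0.5 * math.log2(1.0 + snr)

pmf = [0.5, 0.25, 0.125, 0.125]
print(entropy(pmf))              # 1.75 bits; the pmf is dyadic, so a Huffman code meets this exactly
print(kraft_sum([1, 2, 3, 3]))   # 1.0; the Huffman codeword lengths for this pmf satisfy Kraft with equality
print(bsc_capacity(0.11))        # ~0.5 bits per channel use
print(awgn_capacity(1.0))        # 0.5 bits per channel use at SNR = 1 (0 dB)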


Course Structure: The class meets for three 50-minute lectures a week (with some extra lectures throughout the semester). Five or six homework sets are assigned per semester. There is one in-class midterm exam and a final exam, both consisting of open-ended problems.

  
Computer Resources: Minimal; a few homework problems require the use of Matlab.


Laboratory Resources:
None.


Grading:

 

  • Homework sets (25%)
  • Midterm (35%)
  • Final (40%)


Outcome Coverage:


(a)    Apply probability theory. This course requires a thorough understanding of probability theory, which underlies the law of large numbers and the AEP, the concepts at the heart of information-theoretic principles.
(b)    Carry out proofs and mathematical analysis. All concepts in this course are developed through proofs; students therefore need to be comfortable with a proof-based approach.
(c)    Quantify and analyze fundamental limits of information transmission and compression. The fundamental task in information theory is to quantify the asymptotically optimal rates of source coding (compression) and channel coding (information transmission). Although such analyses provide an understanding of the system at hand, they do not necessarily lead to practical algorithms.


Prepared By: M. Kıvanç Mıhçak
 

 
