Information Theory Course: Learn Entropy, Coding & Channel Capacity | IIT Kanpur
Course Details
| Detail | Value |
|---|---|
| Exam Registration | 125 |
| Course Status | Ongoing |
| Course Type | Elective |
| Language | English |
| Duration | 8 weeks |
| Categories | Electrical, Electronics and Communications Engineering, Communication and Signal Processing |
| Credit Points | 2 |
| Level | Undergraduate/Postgraduate |
| Start Date | 19 Jan 2026 |
| End Date | 13 Mar 2026 |
| Enrollment Ends | 02 Feb 2026 |
| Exam Registration Ends | 16 Feb 2026 |
| Exam Date | 28 Mar 2026 IST |
| NCrF Level | 4.5 — 8.0 |
An Introduction to Information Theory: Unlocking the Science of Communication
Information Theory is the mathematical bedrock of modern communication systems, answering two of the most profound questions in engineering: What is the ultimate limit of data compression? And what is the maximum reliable data rate over a noisy channel? This foundational field, pioneered by Claude Shannon, underpins everything from your smartphone's data connection to high-definition video streaming and deep space communication.
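Shannon's answer to the first question is the source entropy, H(X) = -Σ p(x) log₂ p(x), the minimum average number of bits per symbol needed for lossless compression. A minimal sketch (the example distributions are illustrative assumptions, not from the course):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased source is more predictable, so it compresses further.
print(entropy([0.9, 0.1]))   # ~0.469
```

The second question, the maximum reliable rate over a noisy channel, is answered by the channel capacity covered in Weeks 5-7.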
This article provides a detailed overview of an 8-week course on Information Theory, expertly instructed by Prof. Adrish Banerjee from the Indian Institute of Technology (IIT) Kanpur. Designed for undergraduate and postgraduate students, this course offers a rigorous yet accessible journey into the core principles that govern information processing.
Meet Your Instructor: Prof. Adrish Banerjee
Prof. Adrish Banerjee brings a wealth of knowledge and accolades to this course. He earned his B.Tech from IIT Kharagpur and his M.S. and Ph.D. from the University of Notre Dame, USA. He is currently the Next Generation Broadcasting Chair Professor in IIT Kanpur's Department of Electrical Engineering, and his expertise has been recognized with awards such as the Microsoft Research India Young Faculty Award and the IETE Prof. K. Sreenivasan Memorial Award. His research focuses on wireless communications, green communications, and error control coding, ensuring the course content is both theoretically sound and practically relevant.
Who Should Take This Course?
This course is meticulously structured for:
- 3rd or 4th-year Undergraduate students in Electronics & Communication (EC) streams.
- 1st-year Postgraduate students specializing in Communications and Signal Processing.
Prerequisites: A basic understanding of probability theory and digital communications is recommended to fully grasp the concepts.
Course Objectives & Industry Relevance
The primary goal is to characterize the fundamental limits of communication. You will learn to compute the maximum rate for reliable data transmission over a noisy channel (channel capacity) and the minimum rate for lossless data compression (entropy). The course covers practical algorithms such as Huffman and Lempel-Ziv coding, which underlie everyday compression formats (ZIP, GIF, PNG). This knowledge is valuable for careers in:
- Telecommunication companies (e.g., designing 5G/6G systems).
- Defense and aerospace laboratories.
- Data storage and multimedia technology firms.
- Any industry involved in efficient data transmission and processing.
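Huffman coding, mentioned above, builds an optimal prefix-free code by repeatedly merging the two least-probable symbols. A compact sketch using Python's `heapq` (the symbol probabilities are illustrative assumptions):

```python
import heapq

def huffman_code(freqs):
    """Return {symbol: bitstring} for an optimal prefix-free code."""
    # Each heap entry: (weight, tiebreak index, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees, prefixing 0/1 to their codes.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.15}
codes = huffman_code(probs)
# Frequent symbols get short codewords; here "a" gets 1 bit, "c"/"d" get 3.
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
```

The resulting average codeword length (1.85 bits here) always lies within 1 bit of the source entropy, the bound established in Week 2.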
Detailed 8-Week Course Layout
The course is divided into a logical progression, building from basic concepts to advanced theorems.
| Week | Topics Covered |
|---|---|
| Week 1 | Introduction: Entropy, Relative Entropy, Mutual Information; Information Inequalities. |
| Week 2 | Block-to-Variable Length Coding: Prefix-free codes, bounds on optimal codelength, Huffman coding. |
| Week 3 | Variable-to-Block Length Coding; Asymptotic Equipartition Property (AEP); Block-to-Block coding of DMS. |
| Week 4 | Universal Source Coding: Lempel-Ziv Algorithms (LZ77 and LZW). |
| Week 5 | Coding for Sources with Memory; Channel Capacity of Discrete Memoryless Channels. |
| Week 6 | Joint Typical Sequences; The Noisy Channel Coding Theorem; Differential Entropy. |
| Week 7 | Gaussian Channel; Parallel Gaussian Channel. |
| Week 8 | Rate Distortion Theory; Blahut-Arimoto Algorithm for capacity and rate-distortion computation. |
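The Blahut-Arimoto algorithm from Week 8 computes the capacity of an arbitrary discrete memoryless channel by alternating maximization over the input distribution and a posterior. A minimal sketch (the channel matrix and iteration count are assumptions for illustration), checked against the closed form C = 1 - H(p) for a binary symmetric channel:

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity (bits) of a DMC with transition matrix W[x][y] = P(y|x)."""
    nx, ny = len(W), len(W[0])
    r = [1.0 / nx] * nx                      # start from a uniform input
    for _ in range(iters):
        # q[x][y]: posterior P(x|y) under the current input distribution r.
        q = [[r[x] * W[x][y] for y in range(ny)] for x in range(nx)]
        for y in range(ny):
            z = sum(q[x][y] for x in range(nx))
            for x in range(nx):
                q[x][y] /= z
        # Update r(x) proportional to exp( sum_y P(y|x) log q[x][y] ).
        r_new = [math.exp(sum(W[x][y] * math.log(q[x][y])
                              for y in range(ny) if W[x][y] > 0))
                 for x in range(nx)]
        z = sum(r_new)
        r = [v / z for v in r_new]
    # Mutual information at the final (r, q), in bits.
    return sum(r[x] * W[x][y] * math.log2(q[x][y] / r[x])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

p = 0.1  # BSC crossover probability (an assumed example value)
C = blahut_arimoto([[1 - p, p], [p, 1 - p]])
# Agrees with the closed form 1 - H(0.1) ≈ 0.531 bits per channel use.
```

The same alternating structure, with a distortion constraint added, yields the rate-distortion variant also covered in Week 8.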
Essential Reading & Reference Materials
The course draws from classic and modern texts in the field. Key references include:
- Cover & Thomas: "Elements of Information Theory" - The definitive modern textbook.
- Gallager: "Information Theory and Reliable Communication" - A timeless classic.
- Massey: "Applied Digital Information Theory" - Excellent lecture notes.
- MacKay: "Information Theory, Inference, and Learning Algorithms" - Connects theory to machine learning.
- Csiszar & Korner: "Information Theory" - A rigorous mathematical treatment.
Why Study Information Theory?
Beyond its direct applications in communication engineering, Information Theory provides a powerful framework for understanding randomness, inference, and learning. Concepts like entropy and mutual information are now pivotal in machine learning, data science, quantum computing, and neuroscience. This course by Prof. Banerjee offers not just equations, but a deep conceptual toolkit to analyze and design efficient information processing systems, making it an invaluable investment for any aspiring engineer or researcher in the digital age.
Embark on this 8-week journey to grasp the principles that define the limits of what is possible in communication and data compression, guided by one of India's leading experts in the field.