Information theory arises from two important problems: How to represent information compactly, and how to transmit information reliably in the presence of noise. The concept of 'entropy' as a measure of information content is central to both of these problems, and is the basis for an elegant theory centred on two famous theorems proved by Claude Shannon over 50 years ago. Only in recent years has the promise of these theorems been fully realized, however. In this course, the elements of information theory will be presented, along with some of the practical techniques needed to construct the data compression and error-correction methods that play a large role in modern communications and storage systems.
Instructor: Radford Neal, Office: SS 6016A, Phone: (416) 978-4970, Email: email@example.com
Office hours: Tuesdays from 2:30pm to 3:30pm and Fridays from 10:10am to 11:00am.
Mondays and Wednesdays, 3:10pm to 4:00pm, from January 5 to April 8, except for Reading Week (February 16-20), in BA 1200.
Fridays, 3:10pm to 4:00pm, from January 16 to April 2, except for Reading Week (February 16-20), in SS 1084 or 1086. We've decided that two tutorial sections are enough. Students whose surnames fall in the first half of the alphabet should go to SS 1084, and the rest to SS 1086. (Or arrange yourselves as needed if one room or the other is crowded.)
David J. C. MacKay, Information Theory, Inference, and Learning Algorithms. See also the author's web site, which includes errata for the book. The copies in the bookstore appear to be from the first printing.
Computing will be done on the CDF computer system. Talk to me if you don't know your account name. If you are not familiar with this system, you should get A Student's Guide to CDF, available from the bookstore.
Final exam, scheduled by the Faculty, worth 48%.
Four assignments, worth 6%, 10%, 6%, and 10%.
A 50-minute in-class test, worth 20%, scheduled for February 27 (in tutorial).
Please read the important course policies.
The point of tutorials is to work through examples and supplementary material in an interactive manner. The process of figuring out these examples is as important as the result. You should not expect the tutorial instructor to just `give' you the answers, since this destroys the usefulness of the tutorial for you and the other students.
Nevertheless, although it's not really necessary, some students have asked for my notes on the tutorials, and some of them are posted below. They will definitely not be posted before the tutorials, since that would undermine the whole purpose!
tut2 tut3 tut4 tut5 tut6 tut7 tut8
No books or notes will be allowed for the final exam. Calculators will be allowed.
Assignment 2: Postscript, PDF.
To do assignment 2, you will need the package of C procedures for arithmetic coding. The package has changed slightly (in Makefile and code.h) from the version posted here earlier. Note that this package does arithmetic coding using finite-precision integers in a somewhat different (and more efficient) way from the algorithm presented in lecture.
You can get these procedures, plus documentation and example programs using them, as a single Unix tar file, or you can get the files individually by clicking below:
Documentation Makefile code.c decode.c decpic.c encmono.c freq.c testpages HiMom bit-io.c code.h decmono.c encode.c encpic.c freq.h
These files are also on CDF in the directory /u/radford/310.
You will also need the test images, which you can obtain from the same directory on CDF, or you can download them by clicking below:
page1 page2 page3 page4 page5
You can view these images in your browser by clicking on the .bmp files below. However, these are not the files you should try to compress, since they have extra header information in them.
page1.bmp page2.bmp page3.bmp page4.bmp page5.bmp
As a check on whether things are working properly for you, here is the output produced by testpages.
Note: A solution to the binary IO problem for Windows due to Cosmin Truta is available here.
Solution: Here are an encode program and a decode program for solving both parts. Here are the commands run for part 1, and the output for part 1, and the commands run for part 2, and the output for part 2. Both use the testpages2 script. Finally, here is the discussion.
Assignment 3: Postscript, PDF. Here is the solution, in Postscript, and PDF.
Assignment 4: Postscript, PDF. Here are the parity check matrix and the received data to use for the assignment. Here is the solution: program, output and discussion.
Week  Date    Slides (PDF)   Slides (Postscript)   Related sections in the textbook
 1    Jan 5   A B            A B                   Chapter 1
 2    Jan 12  A B            A B                   Sections 2.1, 4.1, 4.2, 5.1, 5.2
 3    Jan 19  A B            A B                   Sections 5.3, 5.4, 5.5, 5.6
 4    Jan 26  A B            A B                   Sections 4.3, 4.4, 6.1, 6.2
 5    Feb 2   A B            A B                   Sections 2.2, 2.3, 2.6, 3.1, 3.2
 6    Feb 9   A B            A B
 7    Feb 23  A B            A B                   Sections 6.4, 6.5
 8    Mar 1   - B            - B                   Chapter 8, Sections 9.1 - 9.4
 9    Mar 8   A B            A B                   Sections 9.5 - 9.7
10    Mar 15  A B            A B                   Sections 1.2, 11.4 - 11.8, 13.1
11    Mar 22  A B            A B                   Chapter 50
12    Mar 29  - B            - B                   Chapter 10
13    Apr 5   A B            A B                   Chapter 47
Spring 2003, by Ken Sevcik
Spring 2002, by Radford Neal