UNIVERSITY OF MINNESOTA 
SCHOOL OF MATHEMATICS

Math 5251: Error-Correcting Codes and Finite Fields

Spring 2007

Prerequisites: Single-variable calculus, and a solid background in linear algebra.
Some familiarity with modular arithmetic might help, but is not required.
We will eventually understand something about finite fields, their structure, and linear algebra/matrices over them.
For this reason, the course has a slightly higher mathematical level than Math 5248.
Instructor: Victor Reiner (You can call me "Vic"). 
Office: Vincent Hall 256
Telephone (with voice mail): (612) 625-6682
E-mail: reiner@math.umn.edu 
Classes: Monday, Wednesday 3:35-5:00pm
in Ford Hall 110
Office hours: Mon, Tues, Fri at 2:30pm, and by appointment.  
Course content: This is an introductory course in the mathematics of codes for communication
designed to achieve compression of information and error detection/correction.
We intend to cover much of the text by Garrett listed below, including treatment of
  • Elementary information theory and entropy
  • Simple compression schemes and noiseless coding: Kraft-McMillan inequality, Shannon's noiseless coding theorem, Huffman coding (a small illustrative sketch follows this list)
  • Error-detection/correction and Shannon's noisy coding theorem
  • Error-correcting codes, with an emphasis on linear codes, parity check matrices, syndrome decoding
  • Bounds on efficiency of error-correcting codes: Hamming, Singleton, Plotkin, Gilbert-Varshamov
  • Finite fields and their structure
  • Cyclic linear codes, such as Hamming, Reed-Solomon, BCH codes.
  • A few other codes, e.g. Golay, Reed-Muller codes.
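To make the first two bullets above concrete, here is a small illustrative sketch (my own, not from Garrett's text) in Python: it computes the entropy of a source and greedily builds a Huffman code for it. The function names and the example distribution are placeholders, not anything from the text.

    import heapq
    from math import log2

    def entropy(probs):
        # H(X) = -sum of p*log2(p), skipping zero-probability outcomes
        return -sum(p * log2(p) for p in probs if p > 0)

    def huffman_code(dist):
        # dist maps symbol -> probability; returns a prefix (instantaneous)
        # code as a dict symbol -> bitstring, built by repeatedly merging
        # the two least probable subtrees.
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(dist.items())]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            p1, _, code1 = heapq.heappop(heap)
            p2, _, code2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in code1.items()}
            merged.update({s: "1" + w for s, w in code2.items()})
            heapq.heappush(heap, (p1 + p2, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(dist)
    avg = sum(p * len(code[s]) for s, p in dist.items())
    print(entropy(dist.values()), code, avg)   # entropy 1.75, average length 1.75

For this particular (dyadic) distribution the average codeword length attains the entropy bound of Shannon's noiseless coding theorem; in general Huffman coding only comes within one bit of it.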
Please note that we will not discuss the error-correcting Goppa codes which come from algebraic curves,
even though they are sometimes included in the title of the course.
The topics covered are similar to those from past offerings of the course taught by Prof. Paul Garrett, the author of our text.
A useful resource is his collection of lecture transparencies and homework solutions from his crypto page.
Here is my syllabus from last spring, when I first taught this class.
This is not a course in
  • codes designed for secrecy (see Math 5248 Cryptology and number theory)
  • compression via wavelets (see Math 5467 Introduction to the Mathematics of wavelets)
  • abstract algebra of fields at a more theoretical level (see Math 5286H Fundamentals of abstract algebra)
  • the details of serious engineering applications/implementation
    (see EE5501 Digital communication, EE5581 Information theory and coding, EE5585 Data compression)
Text: The Mathematics of Coding Theory: Information, Compression, Error Correction and Finite Fields by Paul Garrett, Prentice Hall, 2004.
A (non-required) supplemental text, which has been used for part of this course in the past, is Introduction to coding and information theory by Steven Roman, Springer-Verlag, 1997. There are a small number of topics we are likely to touch on that appear in Roman's book but not in Garrett's: non-binary Huffman coding, the Plotkin bound, Reed-Muller codes, Golay codes, Latin squares.


Here is Richard Ehrenborg's parlor trick using the Hamming binary [7,4,3]-code
that was demonstrated on the first day of class.
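For the curious, here is a small sketch (my own Python illustration of the standard [7,4,3] Hamming code, not Ehrenborg's actual presentation) of the machinery the trick exploits: encode 4 bits, allow at most one bit to be flipped (one "lie"), and locate the flip from the syndrome. The matrices below are one common choice.

    # Parity-check matrix H whose columns are the numbers 1 through 7 in binary,
    # so the syndrome of a single-bit error spells out the position of that error.
    H = [[0, 0, 0, 1, 1, 1, 1],
         [0, 1, 1, 0, 0, 1, 1],
         [1, 0, 1, 0, 1, 0, 1]]

    # One standard generator matrix: data bits in positions 3, 5, 6, 7,
    # parity bits in positions 1, 2, 4.
    G = [[1, 1, 1, 0, 0, 0, 0],
         [1, 0, 0, 1, 1, 0, 0],
         [0, 1, 0, 1, 0, 1, 0],
         [1, 1, 0, 1, 0, 0, 1]]

    def encode(msg4):
        # codeword = message times G, over GF(2)
        return [sum(m * g for m, g in zip(msg4, col)) % 2 for col in zip(*G)]

    def correct(word7):
        # syndrome = H times the received word, over GF(2)
        s = [sum(h * w for h, w in zip(row, word7)) % 2 for row in H]
        pos = 4 * s[0] + 2 * s[1] + s[2]    # 0 means no error detected
        fixed = list(word7)
        if pos:
            fixed[pos - 1] ^= 1             # flip the offending bit back
        return fixed

    c = encode([1, 0, 1, 1])
    received = list(c)
    received[4] ^= 1                        # one lie
    print(correct(received) == c)           # True: the lie is located and undone

Since any two distinct codewords differ in at least 3 positions, a single flipped bit always stays closer to the codeword it came from than to any other, which is exactly why the trick succeeds.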
Homework and exams: There will likely be 6 homework assignments, due generally every other week, except that
  • the 6th homework will be due after only one week (see the schedule below),
  • 2 weeks will each be occupied by a week-long take-home midterm exam,
  • the final week will be occupied by a week-long take-home final exam.
Dates for the assignments and exams are in the schedule below, to give you an idea of what will happen. The take-home midterms and final exam are open-book, open-library, open-web, but no collaboration or consultation of human sources is allowed.

Late homework will not be accepted. Early homework is fine, and can be left in my mailbox in the School of Math mailroom near Vincent Hall 105. Collaboration is encouraged as long as everyone collaborating thoroughly understands the solution, you write up the solution in your own words, and you include a note at the top of the homework indicating with whom you've collaborated.

Homework solutions should be well explained -- the grader will be told not to give credit for an unsupported answer.

Grading:  Homework = 50% of grade
Each of 2 midterms = 15% of grade
Final exam = 20% of grade.

Complaints about the grading should be brought to me.

Policy on incompletes:  Incompletes will be given only in exceptional circumstances, where the student has completed almost the entire course with a passing grade, but something unexpected happens to prevent completion of the course. Incompletes will never be made up by taking the course again later. You must talk to me before the final exam if you think an incomplete may be warranted.  
Other expectations: This is a 4-credit course, so I would guess that the average student should spend about 8 hours per week outside of class to get a decent grade. Part of this time each week would be well spent making a first pass through the material in the book that we anticipate covering in class that week, so that you can bring your questions/confusions to class and ask about them.
Homework/exam schedule and assignments
Each entry below gives the assignment or exam, the due date, and the problems (mainly from Garrett's book).
Homework 1 (due Wed. Jan. 31). From Garrett's text:
1.28, 1.31, 1.33
2.03 (Note: I moved problem 2.04 to HW2)
3.02,3.05

Not from text:
A. Consider these three collections C1, C2, C3 of codewords:
C1={0,10,110,1110,1111}
C2={0,10,110,1110,1101}
C3={0,01,011,0111,1111}
Indicate for each (with explanation) whether or not it is
(a) uniquely decipherable,
(b) instantaneous.

B. Does there exist a binary code which is instantaneous and has code words with lengths (1,2,3,3)? If not, prove it. If so, construct one.

(Note: I am removing this problem from the HW's:
C. State and prove the precise conditions under which a random variable X on a finite probability space has entropy H(X) equal to zero.)
Homework 2 (due Wed. Feb. 14). From Garrett's text:
2.04
4.01, 4.02, 4.04, 4.05, 4.06, 4.11

Not from text:
Let p be a probability between 0 and 1.
Explain why a noisy channel with input and output alphabets both {0,1} and the following probabilities is called a useless channel:
P(0 received | 0 sent)=p
P(0 received | 1 sent)=p
P(1 received | 0 sent)=1-p
P(1 received | 1 sent)=1-p
Midterm exam 1 (Wed. Feb. 21). Midterm exam 1 in PostScript, PDF.
Homework 3 (due Wed. Mar. 7). From Garrett's text:
5.01, 02, 03, 04, 05, 08
6.01, 03, 07, 22, 49, 50, 52, 80 (Prob. 6.37 is moved to HW4)
Homework 4 (due Wed. Mar. 28). From Garrett's text:
6.30, 31, 37, 57, 81
8.17
9.11, 12
10.04, 08, 11
11.11
12.06, 12.10, 12.12, 12.14, 12.15
Midterm exam 2 (Wed. Apr. 4). Midterm exam 2 in PostScript, PDF.
Homework 5 (due Wed. Apr. 18). From Garrett's text:
12.01, 02, 04, 17, 19, 20
13.02, 05, 07, 09, 10
Homework 6 (due Wed. Apr. 25; note the 1-week due date!). From Garrett's text:
14.02, 05
11.04, 05
15.03, 13
Final exam (Wed. May 2). Final exam in PostScript, PDF.
