Uniquely Decodable Code

A code $C$ for an ensemble $X$ is uniquely decodable if no two strings from $\mathcal{A}_X^*$ have the same codeword. That is, for all $x, y \in \mathcal{A}_X^*$, $x \neq y \implies C^*(x) \neq C^*(y)$, where $C^*$ denotes the extension of $C$ to strings by concatenation of codewords.

e.g. $C(\mathsf{a}) = 0$, $C(\mathsf{b}) = 01$, $C(\mathsf{c}) = 10$ is not uniquely decodable since $C^*(\mathsf{ac}) = 010 = C^*(\mathsf{ba})$.
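To make the ambiguity concrete, here is a minimal sketch in Python, using a small hypothetical code (the symbol names and codewords are illustrative):

```python
# A hypothetical lossless but not uniquely decodable code:
C = {"a": "0", "b": "01", "c": "10"}

def encode(code, s):
    """Concatenate the codewords of each symbol in s."""
    return "".join(code[sym] for sym in s)

# Two different source strings produce the same bitstring:
assert encode(C, "ac") == encode(C, "ba") == "010"
```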

Proposition

Uniform (fixed-length) lossless codes are uniquely decodable; uniquely decodable codes are lossless.

Proof Let $C$ be a uniform lossless code of codeword length $\ell$. Then $C$ is injective. So for all $x, y \in \mathcal{A}_X^*$ of the same length such that $x \neq y$, there is some $i$ such that the $i$th characters $x_i \neq y_i$. By injectivity of $C$, $C(x_i) \neq C(y_i)$. Since $C$ is uniform, $C(x_i)$ and $C(y_i)$ occupy the same bit positions in $C^*(x)$ and $C^*(y)$, so $C^*(x) \neq C^*(y)$; if instead $x$ and $y$ differ in length, then so do $C^*(x)$ and $C^*(y)$. Hence $C$ is uniquely decodable. Conversely, if $C$ is uniquely decodable then $C^*$, and in particular its restriction $C$ to single symbols, is injective, so $C$ is lossless.
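The block-decoding idea behind this proof can be sketched as follows, with an illustrative 2-bit uniform code (the code itself is our choice, not from the notes):

```python
# With a fixed codeword length, the encoded bitstring splits back into
# blocks at known positions, so decoding is unambiguous.
C = {"a": "00", "b": "01", "c": "10", "d": "11"}   # illustrative uniform code
inv = {w: s for s, w in C.items()}                 # injective, hence invertible
L = 2                                              # the uniform codeword length

def decode_uniform(bits):
    """Split bits into L-sized blocks and invert each one."""
    return "".join(inv[bits[i:i + L]] for i in range(0, len(bits), L))

assert decode_uniform("00" "10" "11") == "acd"
```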

Prefix Codeword

A codeword $c$ is said to be a prefix of another codeword $c'$ if there exists a string $s$ such that $c' = cs$.

e.g. $0110$ has prefix $01$.

Prefix Free Code

A code $C$ with range $R$ is a prefix (free) code if no codeword is a prefix of another codeword. That is, for all $c \in R$ there is no prefix of $c$ in $R \setminus \{c\}$.

e.g. $\{0, 10, 110, 111\}$ is a prefix code.

Proposition

Prefix codes are uniquely decodable, but not vice versa.

e.g. The code $\{0, 01\}$ is uniquely decodable (every $1$ marks the end of the codeword $01$) but not a prefix code, since $0$ is a prefix of $01$.
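The prefix-free property is easy to check mechanically. The sketch below assumes a code given as a set of binary codeword strings (the function name is our own):

```python
def is_prefix_free(codewords):
    """True iff no codeword is a proper prefix of another.

    After sorting, any codeword that is a prefix of another is immediately
    followed by some extension of it, so checking adjacent pairs suffices.
    """
    words = sorted(codewords)
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

assert is_prefix_free({"0", "10", "110", "111"})
assert not is_prefix_free({"0", "01"})   # uniquely decodable, yet not prefix-free
```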

The Kraft-McMillan Inequality

For any uniquely decodable binary code $C$, its codeword lengths $\ell_1, \dots, \ell_n$ satisfy $\sum_{i=1}^{n} 2^{-\ell_i} \le 1$. Conversely, if the lengths $\ell_1, \dots, \ell_n$ satisfy the above inequality, then there exists a uniquely decodable (indeed prefix) code with codeword lengths $\ell_1, \dots, \ell_n$.

e.g. $\{0, 10, 110, 111\}$ is prefix and $2^{-1} + 2^{-2} + 2^{-3} + 2^{-3} = 1$.
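The converse direction of Kraft-McMillan is constructive: process the lengths in increasing order and give each codeword the first $\ell$ bits of the binary expansion of the running Kraft sum. A sketch of this standard construction (function names are our own):

```python
from fractions import Fraction

def kraft_sum(lengths):
    """Sum of 2^{-l} over the codeword lengths, computed exactly."""
    return sum(Fraction(1, 2 ** l) for l in lengths)

def prefix_code_from_lengths(lengths):
    """Build a prefix code with the given lengths (requires Kraft sum <= 1)."""
    assert kraft_sum(lengths) <= 1
    code, acc = [], Fraction(0)
    for l in sorted(lengths):
        # acc * 2^l is an integer here; its l-bit binary form is the codeword.
        code.append(format(int(acc * 2 ** l), f"0{l}b"))
        acc += Fraction(1, 2 ** l)
    return code

# Lengths (1, 2, 3, 3) recover the prefix code {0, 10, 110, 111}:
assert prefix_code_from_lengths([1, 2, 3, 3]) == ["0", "10", "110", "111"]
```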

Expected Code Length and Shannon Coding

Expected Code Length

The expected length $L(C, X)$ for a code $C$ of an ensemble $X$ is given by $L(C, X) = \sum_{x \in \mathcal{A}_X} P(x)\,\ell(x)$, where $\ell(x)$ is the length of the codeword for $x$.

Probabilities From Code Lengths

Given a code $C$ for $X$ with lengths $\ell_1, \dots, \ell_n$, such that $z = \sum_i 2^{-\ell_i} \le 1$, we define $\mathbf{q}$, the probability vector for $C$, by $q_i = 2^{-\ell_i} / z$.

The goal of compression is to answer the question: given an ensemble $X$ with probabilities $p_1, \dots, p_n$, how can we minimise the expected code length?

Limits of Compression

Given an ensemble $X$ with probabilities $\mathbf{p}$, and a prefix code $C$ with codeword length probabilities $\mathbf{q}$ and normalisation $z = \sum_i 2^{-\ell_i}$. Then $L(C, X) = H(X) + D_{\mathrm{KL}}(\mathbf{p} \,\|\, \mathbf{q}) + \log_2 \frac{1}{z} \ge H(X)$, with equality only when $\ell_i = \log_2 \frac{1}{p_i}$. Here $D_{\mathrm{KL}}$ is the KL divergence.

Proof By definition of $\mathbf{q}$ we have $2^{-\ell_i} = z q_i$, so $\ell_i = \log_2 \frac{1}{q_i} + \log_2 \frac{1}{z}$. Therefore
$$L(C, X) = \sum_i p_i \ell_i = \sum_i p_i \log_2 \frac{p_i}{q_i} + \sum_i p_i \log_2 \frac{1}{p_i} + \log_2 \frac{1}{z} = D_{\mathrm{KL}}(\mathbf{p} \,\|\, \mathbf{q}) + H(X) + \log_2 \frac{1}{z}.$$
Since $D_{\mathrm{KL}}(\mathbf{p} \,\|\, \mathbf{q}) \ge 0$ by Gibbs' inequality and $z \le 1$ by the Kraft-McMillan inequality, $L(C, X) \ge H(X)$, with equality iff $q_i = p_i$ and $z = 1$, i.e. $\ell_i = \log_2 \frac{1}{p_i}$.
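The decomposition of the expected length into entropy, KL divergence, and normalisation terms can be checked numerically. The ensemble and length assignment below are an illustrative choice, not from the notes:

```python
from math import log2

def expected_length(p, lengths):
    return sum(pi * l for pi, l in zip(p, lengths))

def entropy(p):
    return sum(pi * log2(1 / pi) for pi in p)

def kl(p, q):
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q))

# Hypothetical ensemble and prefix-code lengths:
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 3, 3, 3]
z = sum(2 ** -l for l in lengths)           # Kraft sum, here 7/8
q = [2 ** -l / z for l in lengths]          # implied probability vector

lhs = expected_length(p, lengths)
rhs = entropy(p) + kl(p, q) + log2(1 / z)
assert abs(lhs - rhs) < 1e-12               # L = H + D_KL + log2(1/z)
assert lhs >= entropy(p)                    # and hence L >= H
```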

Shannon Code

Given a finite ensemble $X$ with probabilities $p_1, \dots, p_n$. A code $C$ is called a Shannon code for $X$ if it has code lengths $\ell_i$ such that $\ell_i = \left\lceil \log_2 \frac{1}{p_i} \right\rceil$.

e.g. If $\mathbf{p} = (\frac{1}{2}, \frac{1}{4}, \frac{1}{4})$, then $C = \{0, 10, 11\}$ with $\boldsymbol{\ell} = (1, 2, 2)$ is a Shannon code.

Shannon Coding Theorem for Symbol Codes

For any ensemble $X$, there exists a prefix code $C$ such that $H(X) \le L(C, X) < H(X) + 1$. In particular, any Shannon code has expected code length within $1$ bit of the entropy.

Proof Clearly for any real $x$ we have $x \le \lceil x \rceil < x + 1$. Therefore, if we create a Shannon code for some ensemble $X$ with $\ell_i = \lceil \log_2 \frac{1}{p_i} \rceil$, it satisfies $\log_2 \frac{1}{p_i} \le \ell_i < \log_2 \frac{1}{p_i} + 1$. Thus $H(X) \le \sum_i p_i \ell_i < H(X) + 1$. Since we have $\ell_i \ge \log_2 \frac{1}{p_i}$, it follows that $\sum_i 2^{-\ell_i} \le \sum_i p_i = 1$, so there is a prefix code with lengths $\ell_i$ by the Kraft-McMillan inequality.
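A sketch of the Shannon-length construction and the two bounds from the proof, for an illustrative distribution:

```python
from math import ceil, log2

def shannon_lengths(p):
    """Shannon code lengths: l_i = ceil(log2(1/p_i))."""
    return [ceil(log2(1 / pi)) for pi in p]

p = [0.5, 0.25, 0.25]                       # an illustrative ensemble
lengths = shannon_lengths(p)
H = sum(pi * log2(1 / pi) for pi in p)
L = sum(pi * l for pi, l in zip(p, lengths))

assert sum(2 ** -l for l in lengths) <= 1   # Kraft holds, so a prefix code exists
assert H <= L < H + 1                       # within one bit of the entropy
```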

Remark

The Shannon coding theorem for symbol codes is exactly the source coding theorem for symbol codes.

Proposition

Shannon codes do not necessarily have smallest expected length.

Proof Consider $\mathcal{A}_X = \{\mathsf{a}, \mathsf{b}\}$ and $\mathbf{p} = (0.99, 0.01)$. Notice that $\lceil \log_2 \frac{1}{0.01} \rceil = 7$. Then the Shannon code has $\boldsymbol{\ell} = (1, 7)$ and so the expected length is $0.99 \times 1 + 0.01 \times 7 = 1.06$. However, the code with $C(\mathsf{a}) = 0$ and $C(\mathsf{b}) = 1$ has expected length $1$.
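This comparison can be reproduced directly; the heavily skewed two-symbol distribution below is the standard kind of counterexample (the exact probabilities are an illustrative choice):

```python
from math import ceil, log2

p = [0.99, 0.01]                                 # illustrative skewed ensemble
shannon = [ceil(log2(1 / pi)) for pi in p]       # Shannon lengths (1, 7)
L_shannon = sum(pi * l for pi, l in zip(p, shannon))
L_uniform = sum(pi * 1 for pi in p)              # code {0, 1}: both lengths 1

assert shannon == [1, 7]
assert L_uniform < L_shannon                     # the Shannon code is not optimal
```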