University of Virginia, Department of Computer Science
CS588: Cryptography, Spring 2005

Manifest: Thursday 25 January 2005
Assignments Due

Optional readings:

Notes
Entropy: H(M) = Σ Prob[Mi] log2 (1 / Prob[Mi])
(Sum over all messages Mi.)

Entropy is a measure of information. The amount of new information a message conveys depends on how surprising it is.

If M is a source that produces n messages that are all equally probable, what is H(M)?
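The definition above can be checked with a short sketch. This is an illustration only; the function name `entropy` and the choice n = 8 are assumptions, not part of the notes:

```python
import math

def entropy(probs):
    """Shannon entropy: H(M) = sum of P(Mi) * log2(1 / P(Mi)) over all messages Mi."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A source with n equally probable messages has H(M) = log2(n).
n = 8
uniform = [1 / n] * n
print(entropy(uniform))  # prints 3.0, i.e. log2(8)
```

For the uniform case each term is (1/n) log2(n), and summing n of them gives log2(n), which matches the question above.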




Absolute Rate: R = the maximum amount of information that can be encoded per letter; for an alphabet of L symbols, R = log2 (L)

Rate: r = H(M) / N
    M is the source of N-letter messages
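As a sketch of the two rates, assuming a 26-letter English alphabet and a hypothetical source whose entropy we already know (the value 25 bits for a 20-letter message is an illustrative assumption, not a measurement):

```python
import math

# Absolute rate for a 26-letter alphabet.
R = math.log2(26)      # ~4.70 bits per letter
print(round(R, 2))     # prints 4.7

# Rate r = H(M) / N for a source M of N-letter messages.
def rate(H_M, N):
    return H_M / N

# Hypothetical source: 20-letter messages carrying 25 bits of information.
print(rate(25, 20))    # prints 1.25 (bits per letter)
```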

How many meaningful 20-letter messages are there in English?
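A back-of-envelope estimate: with rate r bits per letter, there are roughly 2^(rN) meaningful N-letter messages. The value r ≈ 1.5 bits/letter is a commonly cited estimate for English, used here as an assumption:

```python
# Estimate of the number of meaningful N-letter English messages,
# assuming a rate of r ~= 1.5 bits per letter (estimate, not exact).
r = 1.5
N = 20
meaningful = 2 ** (r * N)          # 2^30
print(f"{meaningful:.2e}")         # prints 1.07e+09
```

So only about a billion of the 26^20 ≈ 2 × 10^28 possible 20-letter strings are meaningful English.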




Redundancy: D = R - r

Unicity Distance: U = H(K) / D

Unicity distance is the expected minimum amount of ciphertext needed for a brute-force attack to succeed. With less ciphertext than this, you cannot determine whether a particular key guess is correct.
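Putting the definitions together for the standard textbook example of a monoalphabetic substitution cipher over English (the rate r ≈ 1.5 bits/letter is an assumed estimate):

```python
import math

# Unicity distance U = H(K) / D for a monoalphabetic substitution cipher.
H_K = math.log2(math.factorial(26))  # key entropy: log2(26!) ~= 88.4 bits
R = math.log2(26)                    # absolute rate ~= 4.70 bits/letter
r = 1.5                              # assumed rate of English (estimate)
D = R - r                            # redundancy ~= 3.2 bits/letter
U = H_K / D
print(round(U))                      # prints 28 (letters of ciphertext)
```

So after roughly 28 ciphertext letters, a brute-force attacker can expect only one key to yield meaningful plaintext.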

Questions
Information theory has perhaps ballooned to an importance beyond its actual accomplishments. — Claude Shannon

CS 588: Cryptology - Principles and Applications
cs588-staff@cs.virginia.edu