CS588: Cryptography, Spring 2005

Manifest: Thursday 25 January 2005
Assignments Due
 Now: CS588 Pledge
 Thursday, 3 February: Problem Set 1
Optional readings:
 Claude Shannon, Communication Theory of Secrecy Systems: Part 1, Part 2. Bell System Technical Journal, October 1949.
Notes
Entropy: H(M) = Σ Prob[M_{i}] log_{2} (1 / Prob[M_{i}])
(Sum over all messages M_{i}.)
Entropy is a measure of information. The amount of new information a message conveys depends on how surprising it is.
If M is a source that produces n messages that are all equally probable, what is H(M)?
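The definition above can be sketched directly in code. This is a minimal illustration (not part of the original notes): for a source of n equally probable messages, each term is (1/n) log_{2}(n), so the sum is log_{2}(n).

```python
import math

def entropy(probs):
    """Shannon entropy H(M) = sum of p_i * log2(1 / p_i), in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A source of n equally likely messages has entropy log2(n):
n = 8
uniform = [1 / n] * n
print(entropy(uniform))  # 3.0 bits

# A biased source conveys less information per message:
print(entropy([0.9, 0.1]))  # about 0.469 bits
```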
Absolute Rate: R = the maximum amount of information that can be encoded per letter (log_{2} of the alphabet size)
Rate: r = H(M) / N
M is the source of N-letter messages
How many meaningful 20-letter messages are there in English?
Redundancy: D = R − r
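As a quick worked sketch of these definitions (the true rate r of English is an assumption here; Shannon's experiments put it at roughly 1.0–1.5 bits/letter):

```python
import math

# Absolute rate over a 26-letter alphabet:
R = math.log2(26)  # about 4.70 bits/letter

# Assumed rate of English (hypothetical figure in Shannon's
# estimated range, not measured here):
r = 1.5            # bits/letter

# Redundancy: D = R - r
D = R - r          # about 3.2 bits/letter
print(round(R, 2), round(D, 2))
```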
Unicity Distance: U = H(K) / D
Unicity distance is the expected minimum amount of ciphertext needed for a brute-force attack to succeed. With less ciphertext than this, an attacker cannot determine whether a particular key guess is correct.
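Plugging in numbers for a standard textbook example, a monoalphabetic substitution cipher: the key is one of 26! permutations, and the redundancy of English is assumed to be about 3.2 bits/letter as estimated above.

```python
import math

# Key entropy H(K) for a substitution cipher: one of 26! keys,
# all equally likely.
H_K = math.log2(math.factorial(26))  # about 88.4 bits

# Assumed redundancy of English, D = R - r (hypothetical figure):
D = math.log2(26) - 1.5              # about 3.2 bits/letter

# Unicity distance: U = H(K) / D
U = H_K / D
print(round(U))  # about 28 letters of ciphertext
```

So after roughly 28 ciphertext letters, we expect only one key to yield a sensible plaintext; shorter ciphertexts leave multiple plausible decryptions.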
Questions
Information theory has perhaps ballooned to an importance beyond its actual accomplishments. — Claude Shannon
 Is there any source M' producing n messages that has higher entropy than M (the source described above, for which all n messages are equally likely)?
 Is Shakespearean English more or less redundant than your email?
 How could you determine the rate of English?
 What is the unicity distance of a one-time pad?
 Why is there no cipher that is both information-theoretically perfect and practical?
University of Virginia Department of Computer Science
CS 588: Cryptology - Principles and Applications
cs588-staff@cs.virginia.edu