CS588: Cryptography, Spring 2005
Manifest: Thursday 25 January 2005
- Claude Shannon, Communication Theory of Secrecy Systems: Part 1, Part 2. Bell System Technical Journal, October 1949.
Notes
Entropy: H(M) = Σ Prob[Mi] log2 (1 / Prob[Mi])
(Sum over all messages Mi.)
Entropy is a measure of information: the amount of new information a message conveys depends on how surprising it is.
If M is a source that produces n messages that are all equally probable, what is H(M)?
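The entropy definition above is easy to check numerically. A minimal sketch in Python (the `entropy` helper is my own, not from the notes); for n equally probable messages it gives H(M) = log2 n:

```python
import math

def entropy(probs):
    """Shannon entropy: H(M) = sum over Mi of Prob[Mi] * log2(1 / Prob[Mi])."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A source of n = 8 equally likely messages: H(M) = log2(8) = 3 bits.
uniform = [1 / 8] * 8
print(entropy(uniform))  # 3.0

# A skewed source is less surprising, so it conveys less information.
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print(entropy(skewed))
```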
Absolute Rate: R = the maximum amount of information each letter can encode; for an alphabet of L letters, R = log2 L.
Rate: r = H(M) / N, where M is the source of N-letter messages.
How many meaningful 20-letter messages are there in English?
Redundancy: D = R - r
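Plugging in rough figures for English illustrates the definitions. The absolute rate follows from the 26-letter alphabet; the rate r is an assumed estimate (Shannon's experiments put English at roughly 1.0 to 1.5 bits per letter), so the redundancy below is illustrative, not definitive:

```python
import math

R = math.log2(26)  # absolute rate for a 26-letter alphabet, ~4.70 bits/letter
r = 1.5            # assumed estimate of the true rate of English, bits/letter
D = R - r          # redundancy, ~3.2 bits/letter
print(f"R = {R:.2f}, r = {r}, D = {D:.2f}")
```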
Unicity Distance: U = H(K) / D
Unicity distance is the expected minimum amount of ciphertext needed for a brute-force attack to succeed. With less ciphertext than this, an attacker cannot determine whether a particular key guess is correct.
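A standard worked example of U = H(K) / D (the numbers are illustrative, not from the notes): a monoalphabetic substitution cipher on English has 26! keys, so H(K) = log2(26!) ≈ 88 bits, and with redundancy D ≈ 3.2 bits/letter the unicity distance comes out to about 28 letters of ciphertext:

```python
import math

# Key entropy for a monoalphabetic substitution cipher: 26! equally likely keys.
H_K = math.log2(math.factorial(26))

# Assumed redundancy of English, bits per letter (see the rate discussion above).
D = 3.2

U = H_K / D  # unicity distance in ciphertext letters
print(f"H(K) = {H_K:.1f} bits, U = {U:.1f} letters")
```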
Information theory has perhaps ballooned to an importance beyond its actual accomplishments. — Claude Shannon
- Is there any source M' that produces n messages that has higher entropy than M (the source for which all n messages are equally likely described above)?
- Is Shakespearean English more or less redundant than your email?
- How could you determine the rate of English?
- What is the unicity distance of a one-time pad?
- Why is there no cipher that is both information-theoretically perfect and practical?
University of Virginia
Department of Computer Science
CS 588: Cryptology - Principles and Applications