Entropy, the Shannon Bit and Perfect Secrecy

The unit of information entropy is the Shannon bit, or S-bit. An S-bit is the amount of uncertainty an observer has before receiving the answer to a YES or NO question when all that is known, a priori, is that each of the two answers is equally likely.

Alternatively, we can say that an S-bit is the amount of information gained after learning the answer to a YES or NO question when all that is known, a priori, is that each of the two answers is equally likely.

We can combine these two statements as follows:

A Shannon bit (S-bit) is the amount of information gained (or entropy removed) upon learning the answer to a question whose two possible answers were equally likely a priori.

Think of the surprise of an event happening as being inversely related to the probability of that event happening. Thus, learning that a high-probability event has taken place is much less of a surprise, and gives less information, than learning that a low-probability event has taken place.

It is tempting to define the uncertainty simply as the inverse of the probability. However, so that the uncertainties of independent events add up, the rigorous mathematical definition of entropy uses the logarithm of the inverse probability:

UNCERTAINTY = LOG2( 1 / PROBABILITY )

and

ENTROPY = AVERAGE UNCERTAINTY
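
As a small illustration (a sketch in plain Python; the helper names surprisal and entropy are mine, not from the original post), the two definitions above translate directly into code:

    import math

    def surprisal(p):
        """Uncertainty of a single outcome: log2(1/p), in S-bits."""
        return math.log2(1.0 / p)

    def entropy(probs):
        """Entropy = average uncertainty, weighted by probability."""
        return sum(p * surprisal(p) for p in probs if p > 0)

    # A fair YES/NO question: each answer has probability 1/2.
    print(entropy([0.5, 0.5]))  # 1.0, i.e. exactly one S-bit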

Maximal entropy is reached when the probability of guessing the next bit in a series is minimal, that is, when there is a 50-50 chance of guessing correctly. In that case the information the next bit adds is exactly 1 S-bit. If, on the contrary, one answer to a question (say, getting a 1) is more likely than the other (getting a 0), learning the answer conveys less than one S-bit. Note that if we transmit n independent bits, the total entropy is n times the entropy of a single bit.
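
A quick sketch (again plain Python, with my own variable names) showing both claims: a biased bit carries less than one S-bit, and n independent fair bits carry n S-bits:

    import math

    def entropy(probs):
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # A biased bit: a 1 is much more likely than a 0.
    print(entropy([0.9, 0.1]))  # ~0.469 S-bits, less than 1

    # n independent fair bits: 2**n equally likely outcomes,
    # so the entropy is exactly n S-bits.
    n = 8
    print(entropy([1.0 / 2**n] * 2**n))  # 8.0
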
In a cryptographic context, suppose A is sending a message m to B in the form of a binary string. Then the larger the entropy of the message m, the more guesses an attacker needs in order to recover m.

HIGH ENTROPY = GOOD SECURITY
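
To make the slogan concrete, here is a small simulation (not from the original post; the names are mine) of an attacker enumerating all n-bit messages: when the message is uniform, i.e. carries n S-bits of entropy, the attacker needs on average (2**n + 1) / 2 guesses.

    import itertools
    import random
    import statistics

    def guesses_needed(secret, n):
        # Enumerate all n-bit strings until the secret is found.
        for count, bits in enumerate(itertools.product("01", repeat=n), start=1):
            if "".join(bits) == secret:
                return count

    n = 8  # message entropy: 8 S-bits
    trials = [guesses_needed(format(random.getrandbits(n), "0{}b".format(n)), n)
              for _ in range(1000)]
    print(statistics.mean(trials))  # close to (2**8 + 1) / 2 = 128.5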

Using the idea of entropy, it is now easy to make the concept of unconditional security (perfect secrecy) precise. To communicate with B, A enciphers m under the key k and transmits the cipher-text c to B.

The cipher-text c is said to provide perfect secrecy if the entropy of the message m given c equals the entropy of m, that is, H(m | c) = H(m).

In other words, the entropy of the message is not diminished by knowledge of the cipher-text.

Unconditional security means that m and c are statistically independent. In other words, the probability of guessing the message after the cipher-text has been intercepted equals the probability of guessing it before the interception. The cipher-text therefore gives no extra information about the message.

Under unconditional security, an adversary E (Eve) who intercepts the cipher-text c obtains no information about the message that E did not already have at the outset. Thus, if the message m is completely random to begin with (as far as E is concerned), m remains just as random even after E intercepts the cipher-text.
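
The textbook example of a cipher achieving perfect secrecy is the one-time pad, where c = m XOR k with a uniformly random key k used only once. Below is a small sketch (plain Python; the setup and names are mine) that tabulates the joint distribution of (m, c) for uniform 3-bit messages and keys and checks that H(m | c) = H(m):

    import math
    from collections import Counter
    from itertools import product

    def entropy(probs):
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    n = 3  # message length in bits
    messages = ["{:03b}".format(i) for i in range(2**n)]

    # One-time pad: c = m XOR k, with k uniform and independent of m.
    def otp(m, k):
        return "{:03b}".format(int(m, 2) ^ int(k, 2))

    # Joint counts of (m, c) over all equally likely (m, k) pairs.
    joint = Counter((m, otp(m, k)) for m, k in product(messages, repeat=2))

    # For every intercepted c, the conditional distribution of m is uniform,
    # so the entropy of m given c is still n S-bits: perfect secrecy.
    for c in messages:
        cond = [joint[(m, c)] / sum(joint[(m2, c)] for m2 in messages)
                for m in messages]
        assert abs(entropy(cond) - n) < 1e-9
    print("H(m | c) = H(m) =", n, "S-bits")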
