Alan Turing


He deserved much better

National Post
14 Sep 2009

In the very distant future, the name of Alan Turing (1912-1954) will be among the very few for which the 20th century is remembered, long after most of the politicians, artists and celebrities have receded into confusion and oblivion. His stature is…


Authentication – Part III, Passwords

Passwords have long been the main tool for verifying identity and granting access to computer resources. In general, as FIPS 112 defines it, a password is a sequence of characters that can be used for several authentication purposes.

There are two security problems with a password: 1) somebody other than the legitimate user can guess it; or 2) it can be intercepted (sniffed) during transmission by a third party. Over the years, many different kinds of password generation methods and password protection protocols have been designed to address these two weaknesses.

Password Strength

The security (strength) of a password is determined by its Shannon entropy, which is a measure of the difficulty of guessing the password. This entropy is measured in Shannon bits. For example, a 10-letter passage of English text has an estimated entropy of around 15 Shannon bits (roughly 1.5 bits per letter, Shannon's classical estimate), meaning that on average we might have to try \frac{1}{2}(2^{15}) = 2^{14} = 16384 possibilities to guess it. In practice, the number of attempts needed would be considerably smaller because of available side information and redundancy (patterns and lack of randomness). Since most human users cannot remember long random strings, a major weakness of passwords is that their entropy is usually too small for security. Even if the user can construct long passwords using an algorithm or a fixed rule, the same rule may be known or guessed by hackers.
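
To make the arithmetic concrete, here is a minimal Python sketch (my own illustration, not taken from FIPS 112 or any document cited here) that computes the entropy of a uniformly random password and the expected number of guesses an attacker needs:

import math

def password_entropy_bits(length, alphabet_size):
    # Each uniformly random character contributes log2(alphabet_size) S-bits.
    return length * math.log2(alphabet_size)

def expected_guesses(entropy_bits):
    # On average the attacker searches half the space: (1/2) * 2^H = 2^(H-1).
    return 2 ** (entropy_bits - 1)

print(password_entropy_bits(10, 26))   # ~47 bits for a truly random 10-letter string
print(expected_guesses(15))            # 16384, matching the estimate above

Note the gap: a truly random 10-letter string carries about 47 bits, while meaningful English text of the same length carries only about 15.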

It is well known to hackers that users commonly select passwords that include variations of their user name, the make of the car they drive, the name of a family member, etc. Social engineering is one of the most powerful tools used by hackers.

Because of limitations in the underlying infrastructure, some authentication systems (notably banks) limit the number of characters and the alphabet from which the characters can be chosen. These kinds of limitations are susceptible to dictionary-type attacks. In a dictionary or brute-force attack, the hacker attempts to gain access using words from a list or dictionary. If the actual password is in the list, it will be found (on average) after about half of the total number of possibilities have been tried. Even in systems that limit the number of trials per user this is a potential security risk, because the hacker has a good chance of gaining access by randomly trying names and passwords until a valid combination is found.
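
To illustrate, the following toy Python sketch mounts a dictionary attack against a single unsalted SHA-256 password hash; the wordlist and target are hypothetical, and real attacks use far larger lists and much faster tooling:

import hashlib

def dictionary_attack(target_hash, wordlist):
    # Hash every candidate and compare; this succeeds only if the
    # real password happens to be in the list.
    for candidate in wordlist:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

wordlist = ["123456", "password", "qwerty", "letmein"]
target = hashlib.sha256(b"letmein").hexdigest()
print(dictionary_attack(target, wordlist))   # -> letmein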

It is commonly accepted that, with current tools, up to 30% of the passwords in a system can be recovered within hours. Moreover, it is predicted that even random (perfect) passwords of 8 characters will be routinely cracked with technology available to most users by the year 2016.

There are many documents that give rules and policies for good password selection, such as NIST Special Publication 800-63 and the SANS Security Project.

CrypTool (ver 1.4) has a very good tool for checking the strength of a password against several criteria, such as the amount of entropy and the resistance to dictionary attacks.


Related Posts:

Waiting for the Quantum Leap

For the longest time I have had the suspicion that quantum cryptography, although a neat idea, is overrated. I was keeping an eye on developments (see previous posts) just in case. Currently my impression is that, with current technology, QC is an expensive proposition for the added value it provides. It looks like I am in good company on this. In the October issue of Wired, Bruce Schneier writes a commentary piece where he asserts:

While I like the science of quantum cryptography — my undergraduate degree was in physics — I don’t see any commercial value in it. I don’t believe it solves any security problem that needs solving. I don’t believe that it’s worth paying for, and I can’t imagine anyone but a few technophiles buying and deploying it. Systems that use it don’t magically become unbreakable, because the quantum part doesn’t address the weak points of the system.

Security is a chain; it’s as strong as the weakest link. Mathematical cryptography, as bad as it sometimes is, is the strongest link in most security chains. Our symmetric and public-key algorithms are pretty good, even though they’re not based on much rigorous mathematical theory. The real problems are elsewhere: computer security, network security, user interface and so on.

Moreover, I have a nagging question about the fundamental tenet of quantum cryptography. The principle is that Alice and Bob will know for sure that Eve is eavesdropping on their channel because their bits will change, as required by the uncertainty principle. Eve may be out of luck in getting the secrets, since Alice and Bob will certainly decide not to exchange them in her presence. However, the mischievous Eve may decide that she is quite happy with merely preventing the exchange. I will call this a denial-of-channel attack, by which Eve can prevent Alice and Bob from exchanging any secret until the police figure out where she is tapping the quantum line and force her to stop. Eve the hacker can now start a cat-and-mouse chase that, judging from the internet police's record of netting hackers, is lopsided in Eve's favor.
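
The detection principle itself is easy to see in a toy model. The following Python sketch (a deliberately simplified BB84 with an intercept-and-resend Eve, not the protocol analysis from our paper) shows that Eve's measurements push the error rate of the sifted key from 0 to about 25%, which is what gives her away:

import random

def measure(bit, prep_basis, meas_basis):
    # If the bases match, the bit is read faithfully; otherwise the
    # outcome is a fair coin, modeling the quantum disturbance.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def sifted_error_rate(n=100000, eve_listens=False):
    errors = sifted = 0
    for _ in range(n):
        a_bit = random.randint(0, 1)
        a_basis = random.randint(0, 1)
        channel_bit, channel_basis = a_bit, a_basis
        if eve_listens:
            e_basis = random.randint(0, 1)
            channel_bit = measure(a_bit, a_basis, e_basis)  # Eve measures...
            channel_basis = e_basis                         # ...and resends
        b_basis = random.randint(0, 1)
        b_bit = measure(channel_bit, channel_basis, b_basis)
        if a_basis == b_basis:            # sifting: keep matching bases only
            sifted += 1
            errors += (b_bit != a_bit)
    return errors / sifted

print(sifted_error_rate(eve_listens=False))  # 0.0
print(sifted_error_rate(eve_listens=True))   # ~0.25: Eve reveals herself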

As a mathematical aside, Schneier mentions in his article the Bennett-Brassard and key reconciliation algorithms used by quantum cryptography. In a paper written with A. Bruen and D. Wehlau we gave a rigorous proof of convergence for the Bennett-Bessette-Brassard-Salvail-Smolin (BBBSS92) method. These results, and more about quantum cryptography, also appear in our book.

Authentication – Part I, the Achilles' heel.

Most practitioners will agree that state-of-the-art encryption systems (including quantum encryption) provide an adequate level of protection for the information entrusted to them. In fact, other than exploiting programming errors in the encryption functions, the only hope an attacker has of gaining access to encrypted information is to fool the authentication measures and impersonate a legitimate user.
The fact that there is no bulletproof authentication system (still an open problem) is indeed the Achilles' heel of modern data security systems.
In a broad sense, authentication has been associated with identification; however, a more stringent criterion can be applied if we define it as:

Authentication: the process of verifying that the user has the credentials that authorize him/her to access a certain service.

The difference between Identification and Authorization is important and has been analyzed at length by Jim Harper in his book Identity Crisis.

Traditionally, user authentication is based on one or more of the following:

  • Something you know, for example a password or PIN
  • Something you have, for example, a smart card or an ATM card
  • Something you are, that is, physically connected to the user, such as biometrics (fingerprints, voice, handwriting, etc.)

A fourth factor to be considered is somebody you know, which has recently been added to the list of factors for electronic authentication, although it has always been a very common form of identification within social networks.

Identifiers

You can hear about this subject on Steve Gibson's podcast

Information theory goes to Hollywood.

My colleague Aiden Bruen sent me an e-mail comment on a recent news article (“Burning down the house,” Globe and Mail, March 8, 2008) concerning the newly released movie 21.
He says:

[the article] is somewhat misleading, as is the plot of the movie.
Counting cards in blackjack goes way back to a paper entitled “Fortune's Formula: a winning strategy for blackjack,” presented by mathematician Ed Thorp in January 1961. The paper explained how card counting improves the odds and how much should be bet. Many backers offered to finance Thorp, who went to Reno to try out the system during MIT's Spring Break in 1961. The system worked perfectly. Eventually, however, Thorp and his associates would be asked to leave. Most casinos then also adopted the “professor stopper,” which allowed dealers to shuffle multiple decks together, thereby sharply reducing the edge afforded by card counting. Details are nicely described in the book by W. Poundstone.
The mathematics is based on the work of the great Claude Shannon on information theory [see our book]. The ideas, fundamental in communications, are still used prominently in finance and gambling using the so-called Kelly formula.

A quick search on the internet turned up this other article (“Getting a hand”) and many references to the Four Horsemen, who were pioneers in devising an optimal strategy for beating blackjack. Their insights were later formalized and corrected by Edward Thorp in his book “Beat the Dealer.” John Kelly made an important contribution to the information-theoretical aspects of the optimal betting strategy problem.
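
The Kelly formula itself fits in one line. Here is a minimal Python sketch; the 51% winning probability is a made-up card-counting edge for illustration, not a figure from Thorp's paper:

def kelly_fraction(p, b=1.0):
    # Optimal fraction of bankroll to bet when the win probability is p
    # and a win pays b-to-1: f* = (b*p - (1 - p)) / b.
    return (b * p - (1 - p)) / b

print(kelly_fraction(0.51))   # ~0.02: with a 51% edge on even money, bet 2%

Betting more than this fraction increases risk without increasing the long-run growth rate of the bankroll, which is the information-theoretic content of Kelly's result.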

One Time Pads

There is a certain fascination with the One Time Pad (or Vernam cipher) among people interested in cryptography.
One of the reasons is the famous fact that it is the only provably secure cipher. Shannon's theoretical insight cemented the fame of the OTP as the only truly unbreakable cipher. For those who don't already know how it works and why it is unbreakable, the following links will give a very good intro:

Dirk Rijmenants' website, “A one-time pad isn't a cryptosystem: it's a state of mind,” and of course the entry at Wikipedia
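
For a hands-on feel, here is a minimal Python sketch of the cipher itself (my own illustration; see the links above for the full operational rules):

import secrets

def otp(message: bytes, pad: bytes) -> bytes:
    # XOR each message byte with the corresponding pad byte. Because
    # XOR is its own inverse, the same function encrypts and decrypts.
    assert len(pad) >= len(message), "pad must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, pad))

message = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(message))   # truly random, used once, then destroyed
ciphertext = otp(message, pad)
print(otp(ciphertext, pad))               # b'ATTACK AT DAWN'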

Another reason is the spy-vs-spy aspect of the OTP. The NSA's VENONA pages abound in details about the successful deciphering of many KGB documents enciphered with OTPs between 1942 and 1980. The KGB's cryptographic material manufacturing center apparently reused some of the pages from one-time pads, breaking one of the fundamental rules of the OTP: use the random material only once.

Keep those old files rolling

Looking at my collection of 3.5″ floppy disks, carefully backed up when DOS was still around, I wondered how much work it would take me to get at the useful information they may contain.
There is a fundamental problem with the longevity of digital media, and solving it would cost us many dollars.
This article by Julian Jackson (read the whole thing) summarizes the main issues around digital media.

In the headlong rush to put photographic images into digital form, little thought has been given to the problem of the longevity of digital files. There is an assumption that they will be lasting, but that is under question.

“There is growing realisation that this investment and future access to digital resources, are threatened by technology obsolescence and to a lesser degree by the fragility of digital media. The rate of change in computing technologies is such that information can be rendered inaccessible within a decade. Preservation is therefore a more immediate issue for digital than for traditional resources. Digital resources will not survive or remain accessible by accident: pro-active preservation is needed.” Joint Information Systems Committee: Why Digital Preservation?

The 1086 Domesday Book, instigated by William the Conqueror, is still intact and available to be read by qualified researchers in the Public Record Office. In 1986 the BBC created a new Domesday Book about the state of the nation, costing £2.5 million. It is now unreadable. It contained 25,000 maps, 50,000 pictures, 60 minutes of footage, and millions of words, but it was made on special disks which could only be read in the BBC micro computer. There are only a few of these left in existence, and most of them don’t work. This Domesday Book Mark 2 lasted less than 16 years.

For most of us the solution is to keep moving the files as we upgrade software and hardware. It would have been nice to have a plan and the time to do it.

Entropy, the Shannon Bit and Perfect Secrecy

The unit of information entropy is the Shannon Bit or S-bit. An S-bit is the amount of uncertainty that an observer has before receiving the answer to a YES or NO question when all that is known, a priori, is that each of the two answers is equally likely.

Alternatively we can say that an S-bit is the amount of information gained after learning the answer to a YES or NO question when all that is known, a priori, is that each of the two answers is equally likely.

We can combine these two statements as follows:

A Shannon bit (S-bit) is the amount of information gained (or entropy removed) upon learning the answer to a question whose two possible answers were equally likely a priori.

Think of the surprise of an event happening as inversely related to the probability of that event happening. Thus, learning that a high-probability event has taken place is much less of a surprise (and gives less information) than learning that a low-probability event has taken place.

It is tempting to think of the uncertainty as simply the inverse of the probability. For functional compatibility reasons (uncertainties of independent events should add, while their probabilities multiply), the rigorous mathematical definition of entropy uses the logarithm of the inverse probability:

UNCERTAINTY = LOG OF THE INVERSE OF PROBABILITY

and

ENTROPY = AVERAGE UNCERTAINTY
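
In symbols, if an outcome x_i occurs with probability p_i, these read u(x_i) = \log_2 \frac{1}{p_i} for the uncertainty of a single outcome, and H = \sum_i p_i \log_2 \frac{1}{p_i} for the entropy; these are Shannon's standard definitions.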

The maximal entropy is reached when the probability of guessing the next bit in a series is minimal, that is, when there is a 50-50 chance of guessing correctly. We can say that the information the next bit adds is 1 S-bit.
If, on the contrary, one answer to a question (say, getting a 1) is more likely than the other (getting a 0), learning the answer conveys less information than one S-bit.
Note that if we transmit n independent bits, the entropy becomes n times the entropy of a single bit in S-bits.
In a cryptographic context, suppose A is sending a message m to B in the form of a binary string. Then, the bigger the entropy of the message m, the more guesses an attacker needs to find m.

HIGH ENTROPY = GOOD SECURITY
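
The 50-50 maximum is easy to check numerically; here is a quick Python sketch of the binary entropy function (my own illustration):

import math

def binary_entropy(p):
    # Entropy, in S-bits, of a YES/NO answer with P(YES) = p.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0  -> one full S-bit
print(binary_entropy(0.9))   # ~0.47 -> a predictable answer tells us less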

Using the idea of entropy, it is now very easy to make precise the concept of unconditional security (perfect secrecy). In order to communicate with B, A enciphers m and transmits the cipher-text c to B using the key k.

The cipher-text c is said to provide perfect secrecy if the entropy of the message m, given c, equals the entropy of m.

In other words, the entropy of the message is not diminished by knowledge of the cipher-text.
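
In Shannon's notation the condition reads H(m \mid c) = H(m): conditioning on the cipher-text removes no uncertainty about the message.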

Unconditional security means that m and c are statistically independent. In other words, the probability of getting the message before the cipher-text has been intercepted equals the probability of getting the message after the cipher-text has been intercepted. So, the cipher-text gives no extra information about the message.

For unconditional security, an adversary E (Eve) who intercepts the cipher c obtains no information about the message that E did not have at the outset. No extra information on m is gained by intercepting the cipher c. Thus, if the message m is completely random to begin with (as far as E is concerned), m will remain random even if E intercepts the cipher-text.
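
This independence can be verified exhaustively for a toy one-time pad on a single bit. In the Python sketch below (the 3/4–1/4 prior is an arbitrary choice for illustration), the posterior probability of each message given the cipher-text always equals the prior:

from fractions import Fraction

p_m = {0: Fraction(3, 4), 1: Fraction(1, 4)}   # Eve's prior on the message
p_k = {0: Fraction(1, 2), 1: Fraction(1, 2)}   # uniform one-time-pad key

# Joint distribution of (message, cipher-text) for c = m XOR k.
joint = {}
for m, pm in p_m.items():
    for k, pk in p_k.items():
        joint[(m, m ^ k)] = joint.get((m, m ^ k), Fraction(0)) + pm * pk

for c in (0, 1):
    p_c = sum(p for (m, cc), p in joint.items() if cc == c)
    for m in (0, 1):
        print(m, c, joint[(m, c)] / p_c == p_m[m])   # True in every case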