Page on authentication added
February 22, 2011
I’ve added a page on authentication that sums up several posts and puts them in one place.
Check it out on the right side under “Look Inside”
Cryptography, Information Theory and Codes
February 9, 2010
My prediction is that we are going to see more and more of these privacy commissioner orders as the guys in charge get more serious about not being sued.
December 31, 2009
Although it has been known for a few years, the weakness of the encryption schemes used by GSM phones is in the spotlight again, this time thanks to a group of hackers that made the whole business of listening in easy and cheap.
GSM has been known to be hackable for years, but the problem is not being fixed as proactively as it should be.
Could a parallel be drawn with the situation of the Enigma machines being sold around the world after WWII?
November 16, 2009
An “Authentication Gap” was discovered in the latest version of the SSL/TLS protocol. This could potentially be a huge problem. The gap is not due to some erroneous implementation; it is a property of the protocol itself.
Here is a list of links to websites where the issue is being followed:
August 22, 2009
An article stating the need to protect biometric data appeared in IEEE Spectrum magazine. Not a lot of new information, but a good summary of the threats as biometrics are used more and more as authenticators.
March 26, 2009
ENIGMA crackers reunite at Bletchley Park
I had the honour of meeting one of them, now an emeritus math professor.
Check this article for pictures of the Turing Bombe, the electro-mechanical code-breaking machine used by the British to crack 3,000 Enigma messages a day during the Second World War.
CrypTool version 1.4 has a very well done simulator of the ENIGMA machine encryption.
February 21, 2009
The trouble with the use of MD5 in digital signatures, recently uncovered by Sotirov et al., is common to other hash functions.
NIST has been discouraging the use of MD5, and even SHA-1, for many years. A good account of this was posted by Dustin Trammell here.
Because the output of a hash function is of a fixed length, usually smaller than the input, there will necessarily be collisions. The collision-free property for a hash is thus defined by:
A function that maps an arbitrary-length message to a fixed-length message digest is a collision-free hash function if:
1. It is a one-way hash function.
2. It is hard to find two distinct messages that hash to the same result.
Cryptographers talk about “relatively collision free” hash functions. A good hash function should be designed with the Avalanche Criterion in mind.
The Avalanche Criterion (AC) is used in the analysis of S-boxes, or substitution boxes. S-boxes take a string as input and produce an encoded string as output.
The avalanche criterion requires that if any one bit of the input to an S-box is changed, about half of the bits output by the S-box should change their values. Therefore, even though collisions are unavoidable, there should be no way to generate two strings with the same hash value other than brute force.
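The avalanche effect is easy to observe for yourself. Here is a small sketch (my own illustration, using SHA-256 and a made-up message, not code from any of the papers mentioned): flip a single input bit and count how many of the 256 output bits change. A well-designed hash flips roughly half of them.

```python
# Illustrative sketch: measuring the avalanche effect of SHA-256.
# Flipping one input bit should change roughly half of the 256 output bits.
import hashlib

def hash_bits(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a 256-character bit string."""
    digest = hashlib.sha256(data).digest()
    return "".join(f"{byte:08b}" for byte in digest)

msg1 = b"attack at dawn"          # hypothetical sample message
msg2 = bytearray(msg1)
msg2[0] ^= 0x01                   # flip the lowest bit of the first byte

bits1 = hash_bits(bytes(msg1))
bits2 = hash_bits(bytes(msg2))
flipped = sum(b1 != b2 for b1, b2 in zip(bits1, bits2))
print(f"{flipped} of 256 output bits changed")   # typically close to 128
```

Running this with different messages gives a count that hovers around 128, exactly the behaviour the avalanche criterion asks for.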
December 5, 2008
The latest stable version of CrypTool is 1.4.21. I highly recommend downloading both 😉
November 14, 2008
For the longest time I have had the suspicion that quantum cryptography, although a neat idea, is overrated. I was keeping an eye on developments (see previous posts) just in case. Currently my impression is that, with current technology, QC is an expensive proposition for the added value it provides. It looks like I am in good company on this. In the October issue of Wired, Bruce Schneier writes a commentary piece where he asserts:
While I like the science of quantum cryptography — my undergraduate degree was in physics — I don’t see any commercial value in it. I don’t believe it solves any security problem that needs solving. I don’t believe that it’s worth paying for, and I can’t imagine anyone but a few technophiles buying and deploying it. Systems that use it don’t magically become unbreakable, because the quantum part doesn’t address the weak points of the system.
Security is a chain; it’s as strong as the weakest link. Mathematical cryptography, as bad as it sometimes is, is the strongest link in most security chains. Our symmetric and public-key algorithms are pretty good, even though they’re not based on much rigorous mathematical theory. The real problems are elsewhere: computer security, network security, user interface and so on.
Moreover, I have a nagging question about the fundamental tenet of quantum cryptography. The principle is that Alice and Bob will know for sure that Eve is eavesdropping on their channel, because their bits will be changing as required by the uncertainty principle. Eve may be out of luck in getting the secrets, since Alice and Bob will certainly decide not to exchange them in her presence. However, the mischievous Eve may decide that she is quite happy with only preventing the exchange. I will call this a denial-of-channel attack, by which Eve can prevent Alice and Bob from exchanging any secret until the police figure out where she is tapping the quantum line and force her to stop. Eve the hacker can now start a cat-and-mouse chase that, judging from the record of the internet police in netting hackers, is lopsided in Eve’s favor.
As a mathematical aside, Schneier mentions in his article the Bennett–Brassard and key reconciliation algorithms used by quantum cryptography. In a paper written with A. Bruen and D. Wehlau we gave a rigorous proof of convergence for the Bennett, Bessette, Brassard, Salvail and Smolin (BBBSS92) method. These results, and more about quantum cryptography, also appear in our book.
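The eavesdropper-detection principle is simple enough to simulate classically. The sketch below is my own toy intercept-and-resend simulation of a BB84-style exchange (it is not the BBBSS92 reconciliation algorithm, and all parameters are invented for illustration): when Eve measures every qubit in a randomly chosen basis and resends it, about 25% of the bits that survive basis sifting disagree between Alice and Bob, which is exactly how her presence is detected.

```python
# Toy intercept-and-resend simulation of a BB84-style exchange (illustrative).
# Without Eve the sifted key has no errors; with Eve measuring in random
# bases, about a quarter of the sifted bits disagree.
import random

def bb84_sift(n: int, eavesdrop: bool, rng: random.Random) -> float:
    """Return the error rate over the sifted key for an n-qubit exchange."""
    errors = matches = 0
    for _ in range(n):
        bit = rng.randint(0, 1)            # Alice's raw bit
        alice_basis = rng.randint(0, 1)    # Alice's preparation basis
        sent_bit, sent_basis = bit, alice_basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            # Measuring in the wrong basis yields a random result,
            # and Eve resends the qubit in her own basis.
            sent_bit = bit if eve_basis == alice_basis else rng.randint(0, 1)
            sent_basis = eve_basis
        bob_basis = rng.randint(0, 1)
        bob_bit = sent_bit if bob_basis == sent_basis else rng.randint(0, 1)
        if bob_basis == alice_basis:       # kept after basis sifting
            matches += 1
            if bob_bit != bit:
                errors += 1
    return errors / matches

rng = random.Random(42)
rate_clean = bb84_sift(20000, eavesdrop=False, rng=rng)
rate_tapped = bb84_sift(20000, eavesdrop=True, rng=rng)
print(rate_clean, rate_tapped)   # no disturbance vs. roughly 0.25
```

Note that the simulation only shows detection; it says nothing about what Alice and Bob can do once Eve simply refuses to go away, which is the denial-of-channel point above.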
November 13, 2008
A total stranger, talking from behind a screen, promises you that he/she/it (you have no way of knowing) will perform some service after you reveal sensitive private information about yourself and your financial situation by shouting it in a public place.
Strangely enough, you do as he/she/it says, because there is the underlying promise that everything will be perfectly fine, as demonstrated by the millions of similar transactions done daily by millions of people all over the world. Moreover, everybody agrees that transactions like the one described are the minimum standard of performance required from any protocol for authentication and encryption over public channels such as the internet that hopes to have any chance at success.
When we look at the problem from this angle, we should ask not why there are security breaches, but how it is possible that there are any successful (meaning secure) transactions at all.
The problem of exchanging secure messages over a public channel is relatively easy to solve; since the 1960s we have had encryption algorithms that give us a relatively comfortable advantage in the race between coders and hackers. These algorithms are the result of interesting mathematical discoveries. However, mathematical discoveries alone cannot prevent people from seeking an advantage through cheating and lying. The Internet was born as a very naive environment in which everybody was trusting and trustworthy, but as soon as valuable information started to be exchanged, mechanisms to avoid misrepresentation needed to be put in place.
It is well known that absolutely secure communications can be had over a public channel, provided that at some point in time there was a secure exchange between the parties over a private channel. Thus if I want to communicate with my lawyer with absolute security, I will meet him at his office and exchange a set of encryption keys that we will keep confidential (it is in both parties’ best interest). Next time we need to communicate, we will encrypt the messages using this set of keys, which also ensures the identity of the other party.
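As a toy illustration of that idea (my own sketch, with a made-up message): with a pre-shared secret key as long as the message, the two parties can even achieve perfect secrecy with a one-time pad, simply XOR-ing the message with a key that is never reused.

```python
# Illustrative one-time pad: a pre-shared random key, exchanged earlier over
# a private channel, gives perfect secrecy on the public channel -- as long
# as the key is as long as the message and is never reused.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

message = b"meet me at the office"               # hypothetical message
key = secrets.token_bytes(len(message))          # the pre-shared secret

ciphertext = xor_bytes(message, key)             # travels over the public channel
recovered = xor_bytes(ciphertext, key)           # decrypted with the same key
assert recovered == message
```

The catch, of course, is the one the post is about: the key had to be exchanged privately first, which is exactly what Diffie and Hellman set out to avoid.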
Enter W. Diffie and M. Hellman, who developed a secure and very elegant way to exchange a piece of secret information (say, an encryption key) over a public channel, in which the resident evil eavesdropper (Eve) cannot guess the secret number unless she knows how to solve the discrete logarithm problem (an open problem in mathematics).
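The exchange fits in a few lines. Here is a toy version (parameters chosen for illustration only; real deployments use vetted groups with primes of 2048 bits or more): Eve sees p, g, A and B on the wire, but recovering the shared secret from them is the discrete logarithm problem.

```python
# Toy Diffie-Hellman exchange. The modulus below is a Mersenne prime chosen
# for illustration; it is far too small for real use.
import secrets

p = 2**127 - 1    # prime modulus (public)
g = 3             # generator (public)

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent

A = pow(g, a, p)   # Alice sends A over the public channel
B = pow(g, b, p)   # Bob sends B over the public channel

alice_secret = pow(B, a, p)   # Alice computes (g^b)^a mod p
bob_secret = pow(A, b, p)     # Bob computes (g^a)^b mod p
assert alice_secret == bob_secret   # both now hold g^(ab) mod p
```

Both sides end up with the same number without it ever crossing the channel; that number can then seed a symmetric cipher.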
Without strong authentication, however, Eve can succeed by intercepting the communications and impersonating my lawyer to me and me to my lawyer. This way Eve ends up with two secret keys, one to communicate with me and the other to communicate with my lawyer, and the power to snoop on the conversation or even tamper with it without raising suspicion (the famous man-in-the-middle attack).
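The two-key trick is easy to sketch (again with toy parameters of my own choosing): Eve runs one Diffie–Hellman exchange with each party, substituting her own public value in both directions, and ends up holding both session keys while neither end notices anything amiss.

```python
# Sketch of a man-in-the-middle against unauthenticated Diffie-Hellman
# (toy parameters; for illustration only).
import secrets

p = 2**127 - 1
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent
e = secrets.randbelow(p - 2) + 2   # Eve's private exponent

A, B, E = pow(g, a, p), pow(g, b, p), pow(g, e, p)

# Eve intercepts A and B and forwards her own value E in both directions.
key_alice_side = pow(E, a, p)   # Alice believes she shares this with Bob
key_bob_side = pow(E, b, p)     # Bob believes he shares this with Alice

# Eve can derive both session keys herself from the intercepted A and B:
assert key_alice_side == pow(A, e, p)
assert key_bob_side == pow(B, e, p)
```

Eve decrypts with one key and re-encrypts with the other, so the traffic flows normally; only authentication of the exchanged values can rule this out.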
The man in the middle has an upside, though: in the case in which Eve is a benign entity trusted by the participants in the communication (who do not need to know nor trust each other), she can serve as the Trusted Server and provide authentication and encryption keys to both parties. This exchange can be done in such a way as to keep the Trusted Server powerless to snoop or tamper with the communication (see the Kerberos system).
The infrastructure of authentication most commonly used on the internet (often referred to as PKI) is based on certificates: special files that carry public encryption keys and data that identifies and authenticates the owner. These certificates are issued by a Certificate Authority (CA) that (in theory) checks the identity of the owner and, through a digital signature scheme, embeds its own encryption keys in the certificate.
Certificates are akin to a token in the sense that anybody who has control of the certificate (a computer file) can impersonate the owner. Also, certificates are only as trustworthy as the CA that issued them. If the procedures used by the CA to authenticate and identify the user are sloppy, the certificate itself is of little value.