Silk Road forums

Discussion => Security => Topic started by: dmc002 on August 24, 2013, 10:50 pm

Title: Encryption is less secure than originally thought
Post by: dmc002 on August 24, 2013, 10:50 pm
Found this on the clearnet and thought it was interesting.

Quote
Traditionally, information-theoretic analyses of secure schemes have assumed that the source files are perfectly uniform. In practice, they rarely are, but they're close enough that it appeared that the standard mathematical analyses still held.
The problem, Médard explains, is that information-theoretic analyses of secure systems have generally used the wrong notion of entropy. They relied on so-called Shannon entropy, named after the founder of information theory, Claude Shannon, who taught at MIT from 1956 to 1978. Shannon entropy is based on the average probability that a given string of bits will occur in a particular type of digital file. In a general-purpose communications system, that's the right type of entropy to use, because the characteristics of the data traffic will quickly converge to the statistical averages. Although Shannon's seminal 1948 paper dealt with cryptography, it was primarily concerned with communication, and it used the same measure of entropy in both discussions.
But in cryptography, the real concern isn't with the average case but with the worst case. A codebreaker needs only one reliable correlation between the encrypted and unencrypted versions of a file in order to begin to deduce further correlations. In the years since Shannon's paper, information theorists have developed other notions of entropy, some of which give greater weight to improbable outcomes. Those, it turns out, offer a more accurate picture of the problem of codebreaking.
When Médard, Duffy and their students used these alternate measures of entropy, they found that slight deviations from perfect uniformity in source files, which seemed trivial in the light of Shannon entropy, suddenly loomed much larger. The upshot is that a computer turned loose to simply guess correlations between the encrypted and unencrypted versions of a file would make headway much faster than previously expected.
"It's still exponentially hard, but it's exponentially easier than we thought," Duffy says. One implication is that an attacker who simply relied on the frequencies with which letters occur in English words could probably guess a user-selected password much more quickly than was previously thought. "Attackers often use graphics processors to distribute the problem," Duffy says. "You'd be surprised at how quickly you can guess stuff."



link (clearnet) http://phys.org/news/2013-08-encryption-thought.html#jCp
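To make the entropy distinction concrete, here's a quick Python sketch (my own toy example, not from the paper): Shannon entropy averages over all outcomes, while min-entropy is set entirely by the single most likely outcome, so even a slightly lopsided source scores lower on the worst-case measure.

```python
import math

def shannon_entropy(probs):
    """Average-case entropy in bits: -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Worst-case entropy in bits: -log2 of the most likely outcome."""
    return -math.log2(max(probs))

# A nearly uniform 4-symbol source: one symbol is slightly favoured.
probs = [0.28, 0.24, 0.24, 0.24]

print(shannon_entropy(probs))  # ~2.00 bits, barely below the uniform case
print(min_entropy(probs))      # ~1.84 bits, the skew matters much more here
```

A perfectly uniform source gives 2.0 bits on both measures; the small skew barely moves Shannon entropy but knocks a noticeable chunk off the min-entropy, which is the guesser's-eye view the article is talking about.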
Title: Re: Encryption is less secure than originally thought
Post by: Euphoric on August 25, 2013, 03:27 am
I'm a bit fucked up right now, but if I'm reading this correctly...they need an unencrypted file to make it easier to decrypt an encrypted file? Well trust me, I don't have unencrypted copies of anything I have encrypted. Everything I have is encrypted more than two times in different forms of encryption.

Anyways, with 4096-bit encryption it would take YEARS to decrypt any data.
Title: Re: Encryption is less secure than originally thought
Post by: dmc002 on August 25, 2013, 06:02 am
What I got from the article, and I may be wrong as I'm not a computer scientist, is that they don't actually need a copy of the unencrypted file. They just need an idea of what the unencrypted data would look like. They then use this information to refine their attack.

A very simple example would be if someone had encoded a written message with a Caesar cipher. Since we know that e is the most used letter in the English language we can usually assume that the most used letter in the encrypted message will correspond with the letter e in the unencrypted message.

Now this only works for the simplest Caesar cipher, but the idea behind it can be applied to much more complicated problems. Assuming the attacker knows quite a bit about the kind of data he is trying to decrypt, he can refine his attack to look for patterns that show up regularly in that type of data. And while more sophisticated encryption methods are still incredibly hard to crack, it turns out using these refinements can make it much easier.
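If anyone wants to see the frequency trick in action, here's a toy Python sketch (just an illustration I threw together, not anything from the paper) that breaks a Caesar cipher by assuming the most common ciphertext letter stands for 'e':

```python
from collections import Counter

def caesar_encrypt(text, shift):
    """Shift each lowercase letter by `shift` positions; leave everything else alone."""
    return ''.join(
        chr((ord(c) - ord('a') + shift) % 26 + ord('a')) if c.islower() else c
        for c in text.lower()
    )

def guess_shift(ciphertext):
    """Assume the most frequent ciphertext letter corresponds to plaintext 'e'."""
    most_common = Counter(c for c in ciphertext if c.isalpha()).most_common(1)[0][0]
    return (ord(most_common) - ord('e')) % 26

msg = "meet me near the eastern gate at seven in the evening"
ct = caesar_encrypt(msg, 13)
shift = guess_shift(ct)
print(caesar_encrypt(ct, -shift % 26))  # recovers the original message
```

It only works when the message is long enough (and English enough) for the letter frequencies to show through, which is exactly the point: the attack exploits how non-uniform the plaintext really is.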

Quote
Everything I have is encrypted more than two times in different forms of encryption.
Anyways, with 4096-bit encryption it would take YEARS to decrypt any data.

But yeah I don't think you have anything to worry about.
Title: Re: Encryption is less secure than originally thought
Post by: kmfkewm on August 25, 2013, 07:41 am
The news article reporting on this is too general to really figure out exactly what the hell is going on, and the paper it references is far too technical and mathematical for me to figure out either. The only thing I gather from this is that user-generated non-random passwords are probably indeed easier to guess than we previously thought. I am not convinced this has any implications for actual encryption, but I cannot figure it out either way.

Yay looks like Bruce Schneier has something to say on it:

Quote
There have been a bunch of articles about an information theory paper with vaguely sensational headlines like "Encryption is less secure than we thought" and "Research shakes crypto foundations." It's actually not that bad.

Basically, the researchers argue that the traditional measurement of Shannon entropy isn't the right model to use for cryptography, and that minimum entropy is. This difference may make some ciphertexts easier to decrypt, but not in ways that have practical implications in the general case. It's the same thinking that leads us to guess passwords from a dictionary rather than randomly -- because we know that humans both created the passwords and have to remember them.

This isn't news -- lots of cryptography papers make use of minimum entropy instead of Shannon entropy already -- and it's hard to see what the contribution of this paper is. Note that the paper was presented at an information theory conference, and not a cryptography conference. My guess is that there wasn't enough crypto expertise on the program committee to reject the paper.

So don't worry; cryptographic algorithms aren't going to come crumbling down anytime soon. Well, they might -- but not because of this result.