Hello everybody,

I'm considering the following: I have a data stream which I'd like to keep as secure as possible. Does it make any sense to apply, say, AES with some IV, then Blowfish with some IV, and finally AES again with some IV?

The encryption/decryption process will be hidden (even protected against debugging), so it won't be easy to guess which crypto methods and which IVs were used (although I'm aware that the strength of this crypto chain can't depend on that fact, since any protection against debugging can be broken given enough time).

I have the computing power for this (the amount of data isn't that big), so the only question is whether it's worth implementing. For example, TripleDES works very similarly, chaining DES three times in an encrypt/decrypt/encrypt scheme with three keys, so it probably isn't total nonsense. Another question is how much I weaken security if I use the same IV for the first and third passes, or even the same IV for all three passes.
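
To make it concrete, here is a minimal sketch of the cascade I have in mind, written against the JDK's javax.crypto (the CascadeSketch name and the pass helper are just mine for illustration; treat this as a sketch of the scheme, not a finished design):

    import javax.crypto.Cipher;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;

    public class CascadeSketch {
        // One CBC pass of the named cipher; helper invented for this sketch.
        static byte[] pass(String transform, String algo, byte[] key, byte[] iv,
                           int mode, byte[] data) throws Exception {
            Cipher c = Cipher.getInstance(transform);
            c.init(mode, new SecretKeySpec(key, algo), new IvParameterSpec(iv));
            return c.doFinal(data);
        }

        public static void main(String[] args) throws Exception {
            SecureRandom rnd = new SecureRandom();
            byte[] k1 = new byte[16], k2 = new byte[16], k3 = new byte[16];
            // Blowfish has an 8-byte block, so its IV is 8 bytes; AES needs 16.
            byte[] iv1 = new byte[16], iv2 = new byte[8], iv3 = new byte[16];
            for (byte[] b : new byte[][] { k1, k2, k3, iv1, iv2, iv3 }) rnd.nextBytes(b);

            byte[] plain = "some data stream".getBytes(StandardCharsets.UTF_8);

            // Encrypt: AES -> Blowfish -> AES, each pass with its own key and IV.
            byte[] s1 = pass("AES/CBC/PKCS5Padding", "AES", k1, iv1, Cipher.ENCRYPT_MODE, plain);
            byte[] s2 = pass("Blowfish/CBC/PKCS5Padding", "Blowfish", k2, iv2, Cipher.ENCRYPT_MODE, s1);
            byte[] s3 = pass("AES/CBC/PKCS5Padding", "AES", k3, iv3, Cipher.ENCRYPT_MODE, s2);

            // Decrypt: the same passes in reverse order.
            byte[] d2 = pass("AES/CBC/PKCS5Padding", "AES", k3, iv3, Cipher.DECRYPT_MODE, s3);
            byte[] d1 = pass("Blowfish/CBC/PKCS5Padding", "Blowfish", k2, iv2, Cipher.DECRYPT_MODE, d2);
            byte[] d0 = pass("AES/CBC/PKCS5Padding", "AES", k1, iv1, Cipher.DECRYPT_MODE, d1);
            System.out.println(new String(d0, StandardCharsets.UTF_8)); // "some data stream"
        }
    }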

I welcome any hints on this subject :)

Thanks in advance!

+1  A: 

Who are you trying to protect your data from? Your brother, your competitor, your government, or the aliens?

Each of these implies a different level at which you could consider the data to be "as secure as possible" within a meaningful budget (of time/cash).

Damien_The_Unbeliever
+10  A: 

I'm not sure about this specific combination, but it's generally a bad idea to mix things like this unless that specific combination has been extensively researched. It's possible the mathematical transformations would actually counteract one another and the end result would be easier to break. A single pass of either AES or Blowfish should be more than sufficient.

UPDATE: From my comment below…

Using TripleDES as an example: think of how much time and effort from the world's best cryptographers went into creating that combination (note that DoubleDES had a vulnerability), and the best they could do is 112 bits of security despite 168 bits of key.

UPDATE 2: I have to agree with Diomidis that AES is extremely unlikely to be the weak link in your system. Virtually every other aspect of your system is more likely to be compromised than AES.

UPDATE 3: Depending on what you're doing with the stream, you may want to just use TLS (the successor to SSL). I recommend Practical Cryptography for more details—it does a pretty good job of addressing a lot of the concerns you'll need to address. Among other things, it discusses stream ciphers, which may or may not be more appropriate than AES (since AES is a block cipher and you specifically mentioned that you had a data stream to encrypt).
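
For illustration, here's a rough sketch of running AES as a stream cipher via CTR mode in Java (the names are mine, and it assumes a stream-friendly mode suits your protocol; a sketch, not a vetted implementation):

    import javax.crypto.Cipher;
    import javax.crypto.CipherOutputStream;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;
    import java.io.ByteArrayOutputStream;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;

    public class CtrStreamSketch {
        public static void main(String[] args) throws Exception {
            byte[] key = new byte[16], iv = new byte[16];
            SecureRandom rnd = new SecureRandom();
            rnd.nextBytes(key);
            rnd.nextBytes(iv); // the initial counter block; never reuse it with the same key

            // CTR mode turns the AES block cipher into a stream cipher: no padding needed.
            Cipher ctr = Cipher.getInstance("AES/CTR/NoPadding");
            ctr.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));

            ByteArrayOutputStream sink = new ByteArrayOutputStream();
            try (CipherOutputStream out = new CipherOutputStream(sink, ctr)) {
                // Write the stream chunk by chunk, as it arrives.
                out.write("chunk 1 of the stream".getBytes(StandardCharsets.UTF_8));
                out.write("chunk 2 of the stream".getBytes(StandardCharsets.UTF_8));
            }
            System.out.println(sink.size() + " ciphertext bytes");
        }
    }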

Hank Gay
+1. Unless the combination of algorithms being applied is specifically meant to work together (a la the multiple transformations of Triple DES), you may be weakening the security. Using the same key more than once would be especially dubious, as patterns may be revealed which help expose the key.
Cowan
You can't actually make it _weaker_ by using distinct IVs and keys - if you could, that would be an effective attack against the single cipher. But it may not make it significantly stronger.
Nick Johnson
I strongly agree with the 'use SSL' update. Crypto is rife with pitfalls. The SSL guys deal with these, so you don't have to.
slim
@Arachnid is right: you can't make it weaker by encrypting the cyphertext with a second cypher. The question is: why would you want to? Or perhaps: how much security is enough?
AJ
You can make a block cipher into a stream cipher depending on how you use it; see http://en.wikipedia.org/wiki/Block_cipher_modes_of_operation
Arc the daft
re: transforming a block cipher to a stream cipher. .NET includes a CryptoStream class that does exactly that, for AES and other block ciphers.
Cheeso
A: 

Also, don't waste time obfuscating the algorithm - apply Kerckhoffs's principle, and remember that AES, in and of itself, is used (and acknowledged to be used) in a large number of places where the data needs to be "secure".

Damien_The_Unbeliever
+3  A: 

I don't think you have anything to lose by applying one encryption algorithm on top of another that is very different from the first one. I would, however, be wary of running a second round of the same algorithm on top of the first, even if you've run another one in between. The interaction between the two runs may open a vulnerability.

Having said that, I think you're agonizing too much over the encryption part. Most exposures of data do not happen through breaking an industry-standard encryption algorithm like AES, but through other weaknesses in the system. I would suggest spending more time looking at key management, the handling of unencrypted data, weaknesses in the algorithm's implementation (the possibility of leaking data or keys), and wider system issues, for instance, what you are doing about data backups.

Diomidis Spinellis
A: 

Damien: you're right, I should have written it more clearly. I'm talking about a competitor; it's for commercial use. So there's a meaningful budget available, but I don't want to implement this without being sure I know why I'm doing it :)

Hank: yes, this is what I'm scared of too. The most supportive source for the idea was the TripleDES I mentioned. On the other hand, when I encrypt some data with one algorithm and then apply another one, it would be very strange if the 'power' of the whole encryption were less than that of a standalone algorithm. But that doesn't mean it can't be merely equal... This is the reason I'm asking for hints; this isn't my area of expertise...

Miro Kropacek
A: 

Diomidis: this is basically my point of view, but my colleague is trying to convince me it really 'boosts' security. My proposal would be to use a stronger encryption key instead of chaining one algorithm after another without any deep knowledge of what I'm doing.

Miro Kropacek
That is definitely the way to go. Using TripleDES as an example: think of how much time and effort from the world's best cryptographers went into creating that combination (note that DoubleDES had a vulnerability), and the best they could do is 112 bits of security despite 168 bits of key.
Hank Gay
A: 

I wouldn't rely on obscuring the algorithms you're using. This kind of "security by obscurity" doesn't work for long. Decompiling the code is one way of revealing the crypto you're using, but usually people don't keep secrets like this for long anyway. That's why we have private/public key crypto in the first place.

Mendelt
A: 

@Miro Kropacek - your colleague is trying to add security through Voodoo. Instead, try to build something simple that you can analyse for flaws - such as just using AES.

I'm guessing it was he (she?) who suggested enhancing the security through protection from debugging too...

Damien_The_Unbeliever
A: 

You can't actually make things less secure if you encrypt more than once with distinct IVs and keys, but the gain in security may be much less than you anticipate: in the example of 2DES, the meet-in-the-middle attack means the result is only twice as hard to break, rather than squaring the difficulty.
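
To make the meet-in-the-middle idea concrete, here is a toy Java demo against double encryption (the 16-bit-key cipher and all names are invented purely for illustration, and the cipher is deliberately weak):

    import java.util.HashMap;
    import java.util.Map;

    public class MeetInTheMiddleToy {
        static final int KEY_SPACE = 1 << 16; // 16-bit toy keys

        // Invented toy cipher on 32-bit blocks; it exists only so the
        // attack's work factor can be counted, not to be secure.
        static int enc(int key, int block) {
            int x = block ^ key;
            x = Integer.rotateLeft(x, 7);
            return x + key;
        }

        static int dec(int key, int block) {
            int x = block - key;
            x = Integer.rotateRight(x, 7);
            return x ^ key;
        }

        public static void main(String[] args) {
            int k1 = 0x1234, k2 = 0xBEEF;         // the secret key pair
            int p1 = 0xCAFEBABE, p2 = 0x12345678; // two known plaintexts
            int c1 = enc(k2, enc(k1, p1));        // double encryption
            int c2 = enc(k2, enc(k1, p2));

            // Forward half: tabulate enc(a, p1) for every possible first key.
            // (Table collisions are simply overwritten; fine for a toy.)
            Map<Integer, Integer> middle = new HashMap<>();
            for (int a = 0; a < KEY_SPACE; a++) {
                middle.put(enc(a, p1), a);
            }
            // Backward half: peel off the second key and meet in the middle.
            // Total work is roughly 2 * 2^16 toy operations, not 2^32.
            for (int b = 0; b < KEY_SPACE; b++) {
                Integer a = middle.get(dec(b, c1));
                if (a != null && enc(b, enc(a, p2)) == c2) { // confirm on the 2nd pair
                    System.out.printf("recovered k1=%#x k2=%#x%n", a, b);
                }
            }
        }
    }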

In general, though, it's much safer to stick with a single well-known algorithm and increase the key length if you need more security. Leave composing cryptosystems to the experts (and I don't count myself among them).

Nick Johnson
+4  A: 

A hacker will always attack the weakest element in a chain, so it helps little to make a strong element even stronger. Cracking an AES encryption with a 128-bit key is already impossible. The same goes for Blowfish. Choosing even bigger key lengths makes it even harder, but a 128-bit key has never been cracked up to now (and probably won't be within the next 10 or 20 years). So this encryption is probably not the weakest element; why make it stronger? It is already strong.

Think about what else might be the weakest element. The IV? Actually, I wouldn't waste too much time on selecting a great IV or hiding it. The weakest element is usually the encryption key. E.g. if you are encrypting data stored to disk, but this data needs to be read by your application, your application needs to know the IV and the encryption key, hence both need to be within the binary. This is actually the weakest element. Even if you take 20 encryption methods and chain them on your data, the IVs and encryption keys of all 20 need to be in the binary, and if a hacker can extract them, the fact that you used 20 instead of 1 encryption method provides zero additional security.

Since I still don't know what the whole process is (who encrypts the data, who decrypts it, where it is stored, how it is transported, who needs to know the encryption keys, and so on), it's very hard to say what the weakest element really is, but I doubt that the AES or Blowfish encryption itself is it.

Mecki
Good cryptographers never say "impossible". "Computationally infeasible given current cryptanalysis techniques", perhaps. ;)
Nick Johnson
We cannot say for sure that an AES 128-bit encryption can be cracked by brute force at all. We assume that it can be cracked, we can mathematically prove that it can be cracked, but we only know for sure the day it has been cracked ;-) "I have only proved it correct, not tried it." - Donald Knuth
Mecki
Ummm... Yeah, we can say for sure that an AES-128 encryption can be cracked by brute force. Enumerate all possible keys (that's what "brute force" means) and one of them will decrypt. This is trivial to prove. It is, of course, utterly non-trivial to actually DO, but that's a different issue.
JUST MY correct OPINION
@ttmrichter: On the other hand, if AES is ever broken by a method that is much easier than a brute-force attack, I doubt that a longer key will protect you. If there is a trivial way to break AES-128, I'm pretty sure it can also be applied to AES-256.
Mecki
A: 

Encrypting twice is more secure than encrypting once, even though this may not be clear at first.

Intuitively, it appears that encrypting twice with the same algorithm gives no extra protection because an attacker might find a key which decrypts all the way from the final cyphertext back to the plaintext. ... But this is not the case.

E.g. I start with plaintext A and encrypt it with key K1 to get B. Then I encrypt B with key K2 to get C.

Intuitively, it seems reasonable to assume that there may well be a key, K3, which I could use to encrypt A and get C directly. If that were the case, then an attacker using brute force would eventually stumble upon K3 and be able to decrypt C, with the result that the extra encryption step had not added any security.

However, it is highly unlikely that such a key exists (for any modern encryption scheme). (When I say "highly unlikely" here, I mean what a normal person would express using the word "impossible").

Why? Consider the keys as functions which provide a mapping from plaintext to cyphertext. If our keys are all KL bits in length, then there are 2^KL such mappings. However, if I use 2 keys of KL bits each, this gives me (2^KL)^2 mappings, and not all of these can be equivalent to a single-stage encryption.

Another advantage of encrypting twice, if 2 different algorithms are used, is that if a vulnerability is found in one of the algorithms, the other algorithm still provides some security.

As others have noted, brute forcing the key is typically a last resort. An attacker will often try to break the process at some other point (e.g. using social engineering to discover the passphrase).

Another way of increasing security is to simply use a longer key with one encryption algorithm.

...Feel free to correct my maths!

AJ
I'm sorry, but that's just not true.
Tnilsson
Which bit? That encrypting with 2 algorithms improves security? It introduces an extra step. An attacker has to be able to crack both steps in order to decrypt. It is more secure.
AJ
I was operating under an incorrect assumption regarding DES. I take my comment back and have deleted the post that contained the incorrect material. My apologies.
Tnilsson
Cool. Thanks. :)
AJ
> However, if I use 2 keys of KL bits each, this gives me (2^KL)^2 mappings. <--- this is not known *a priori*. Consider the case of the Caesar shift cipher: by adding K2 to (P + K1), all you get is (P + K) where K = K1 + K2. So the two encryptions could be done as one encryption with key K.
Jonas Kölker
That's right. Specific encryption schemes such as the Caesar Cypher are vulnerable. However, that's a special case because of the linear nature of the mappings. This is explained better than I can manage in episode 125 of "Security Now" (http://www.grc.com/securitynow.htm), in case that helps.
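
A toy sketch of that collapse (class name invented): two Caesar shifts always amount to a single one, so the second pass adds nothing.

    public class CaesarCompose {
        // Caesar shift over the lowercase alphabet.
        static String enc(int key, String s) {
            StringBuilder out = new StringBuilder();
            for (char c : s.toCharArray()) {
                out.append((char) ('a' + (c - 'a' + key) % 26));
            }
            return out.toString();
        }

        public static void main(String[] args) {
            String p = "attackatdawn";
            System.out.println(enc(5, enc(3, p))); // two passes: keys 3, then 5
            System.out.println(enc(8, p));         // one pass with key 3 + 5 = 8
            // Both lines print the same cyphertext: the degenerate case above.
        }
    }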
AJ
A: 

Yes, it can be beneficial, but it is probably overkill in most situations. Also, as Hank mentions, certain combinations can actually weaken your encryption.

TrueCrypt provides a number of combination encryption algorithms like AES-Twofish-Serpent. Of course, there's a performance penalty when using them.

Gordon Bell
A: 

Changing the algorithm does not improve the quality (unless you expect an algorithm to be broken); what matters is the key/block length, plus some advantage in obfuscation. Encrypting several times is interesting, though, since even if the first key leaks, the resulting data is not distinguishable from random data. Also, some block sizes are processed better on a given platform (e.g. matching the register size).

Attacking quality encryption algorithms only works by brute force, and thus depends on the computing power the attacker can spend. This means you can ultimately only increase the average time somebody will probably need to decrypt it.

If the data is of real value, attackers would do better to go after the key holder rather than the data...

Oli
A: 

I agree with what has been said above. Multiple stages of encryption won't buy you much. If you are using a 'secure' algorithm, then it is practically impossible to break; use AES in some standard streaming mode. See http://csrc.nist.gov/groups/ST/toolkit/index.html for accepted ciphers and modes. Anything recommended on that site should be sufficiently secure when used properly. If you want to be extra secure, use AES-256, although 128 should still be sufficient anyway.

The greatest risks are not attacks against the algorithm itself, but rather attacks against key management, or side-channel attacks (which may or may not be a risk depending on the application and usage). If your application is vulnerable to key-management attacks or to side-channel attacks, then it really doesn't matter how many levels of encryption you apply. This is where I would focus your efforts.