I've been working on a system that uses asymmetric encryption across a large number of files. I'm currently using RSA with 4096-bit keys to encrypt a randomly generated 256-bit AES key for each file, but performance is lacking: one required operation is to scan through all the files (an estimated 10,000 once the system is in use) and identify which ones can be decrypted using a specific private key. While I don't expect this operation to be instant, it is taking too long at the moment (~2 files processed per second). I considered reducing the key length, but even taking it down to 2048 bits doesn't provide the level of performance I need. 512 bits would just about cut it, but since such keys can now be cracked trivially, that is out of the question.

Can anybody point me in the direction of a system that is faster but of similar cryptographic strength? It would need to be available through a Java JCA provider (e.g. Bouncy Castle) in order to plug into my existing application neatly. I know Bouncy Castle supports ElGamal, but I can't find any details on how strong that algorithm is, or whether it is even likely to be any faster than RSA. I also hear about elliptic curve systems that need only relatively short keys (384 bits or so), but I don't know where to find an implementation of one of these.
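For reference, a minimal JCA sketch of the setup described above. The class name and the OAEP padding choice are my own illustrative assumptions; the post does not specify a padding scheme:

    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;

    public class RsaWrapSketch {
        public static void main(String[] args) throws Exception {
            // Long-term 4096-bit RSA key pair.
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
            kpg.initialize(4096);
            KeyPair rsa = kpg.generateKeyPair();

            // Fresh 256-bit AES key for one file.
            KeyGenerator aesGen = KeyGenerator.getInstance("AES");
            aesGen.init(256);
            SecretKey fileKey = aesGen.generateKey();

            // Wrap (encrypt) the AES key under the RSA public key. Unwrapping
            // with the private key is the slow per-file step during the scan.
            Cipher wrapper = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
            wrapper.init(Cipher.WRAP_MODE, rsa.getPublic());
            byte[] wrappedKey = wrapper.wrap(fileKey);
            System.out.println("wrapped key: " + wrappedKey.length + " bytes");
        }
    }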

+2  A: 

Why don't you calculate a cryptographically strong hash of each key, and then store that in the clear alongside each filename? Then, given a key that you need to match against all the files, you can simply hash the key and look it up.
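A minimal sketch of that idea in Java; the class and method names here are illustrative, not part of the original suggestion:

    import java.security.MessageDigest;
    import java.util.HashMap;
    import java.util.Map;

    public class KeyIndexSketch {
        // Fingerprint -> filename, built once as files are created.
        private final Map<String, String> fileByFingerprint = new HashMap<>();

        // SHA-256 over the encoded key bytes, hex-encoded.
        static String fingerprint(byte[] encodedKey) throws Exception {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(encodedKey);
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }

        void register(String filename, byte[] encodedKey) throws Exception {
            fileByFingerprint.put(fingerprint(encodedKey), filename);
        }

        // An O(1) lookup replaces trial decryption of every file.
        String lookup(byte[] encodedKey) throws Exception {
            return fileByFingerprint.get(fingerprint(encodedKey));
        }
    }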

Mox
Or you could store the public key the file was encrypted with (or a hash of it).
cobbal
One of my requirements is that you can't tell which key a file was encrypted with unless you have the password to decrypt that key; I think that rules out both of these solutions.
A: 

I'd go for an approach that requires fewer RSA operations. SSL/TLS, although it uses RSA (or similar) to exchange an AES (or similar) key, does not use RSA for the bulk data, simply because public-key operations at secure key sizes are too computationally expensive to perform on a per-packet, or in your case per-file, basis.

Another public-key system is ElGamal: http://en.wikipedia.org/wiki/ElGamal_encryption. Security-wise I believe it has yet to be broken, but I would personally put my trust in RSA for now. I do not know whether any elliptic-curve encryption implementations are currently available; I know they are being researched, but I understand they may not be ready for production use, and I have heard there were patent issues.

Ninefingers
+2  A: 

For your question as asked, try Diffie-Hellman over elliptic curves, also known as "ECDH". Estimating security is a bit difficult once we deal with sizes that cannot be cracked with current technology, since this depends on how we bet on future technological evolution. Yet one can say that ECDH over the P-256 curve provides "128 bits" of security, a level comparable to what you would get from 3072-bit RSA (2048-bit RSA is usually rated at about 112 bits). That level is amply sufficient for all current usages, or, more appropriately said: if P-256 is not enough for you, then your problem has very special needs and cryptographic strength is likely to be the least of your worries.

On my PC (a 2.4 GHz Intel Core 2, 64-bit mode, running Linux), OpenSSL claims to crunch out about 900 ECDH instances per second, using a single core.
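For completeness, a minimal sketch of an ECDH key agreement through the standard JCA API; it assumes a provider that registers the EC and ECDH algorithms (the JDK's built-in providers and Bouncy Castle both do), and the class and variable names are illustrative:

    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.spec.ECGenParameterSpec;
    import javax.crypto.KeyAgreement;

    public class EcdhSketch {
        public static void main(String[] args) throws Exception {
            // Two key pairs on the P-256 curve: the recipient's long-term pair
            // and an ephemeral pair generated when a file is encrypted.
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
            kpg.initialize(new ECGenParameterSpec("secp256r1")); // P-256
            KeyPair recipient = kpg.generateKeyPair();
            KeyPair ephemeral = kpg.generateKeyPair();

            // Encryption side: combine the ephemeral private key with the
            // recipient's public key to derive a shared secret.
            KeyAgreement ka = KeyAgreement.getInstance("ECDH");
            ka.init(ephemeral.getPrivate());
            ka.doPhase(recipient.getPublic(), true);
            byte[] sharedSecret = ka.generateSecret();

            // In practice the shared secret is run through a KDF (e.g. SHA-256)
            // to produce the 256-bit AES key that encrypts the file; the
            // ephemeral public key is stored alongside the file so the
            // recipient can later derive the same secret.
            System.out.println("shared secret: " + sharedSecret.length + " bytes");
        }
    }

Each file stores its ephemeral public key in the clear, and checking a file still costs one ECDH operation, but at roughly 900 per second rather than 2.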

Edit: for estimation of key security, depending on the length, for several algorithms, see this site.

Thomas Pornin
Sounds ideal, and checking it with openssl on my machine I get slightly slower results, but still adequate. Now to find an implementation I can use. :)
Well, OpenSSL *is* also a library, open source and all, ready to be integrated into various applications...
Thomas Pornin
Unfortunately, there don't appear to be any actively maintained Java bindings for it. I used Bouncy Castle in the end, which is slower than OpenSSL but performed well enough.