views: 34

answers: 1
In Java, I'm generating and serializing a symmetric key for encryption purposes:

    KeyGenerator keyGen = KeyGenerator.getInstance(algorithm);
    SecretKey symmetricKey = keyGen.generateKey();
    Base64.encode(symmetricKey.getEncoded(), new FileOutputStream(filename));    

where Base64 is from the Bouncy Castle cryptography package and algorithm is AES.
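The code that reads the key back is not shown in the question; a minimal sketch of the loading side, assuming the same Bouncy Castle Base64 helper and that the file contains nothing but the Base64-encoded key, might look like this (SecretKeySpec is from javax.crypto.spec, the stream classes from java.io):

    // Loading side (sketch): read the Base64 text back and rebuild the key.
    // SecretKeySpec only needs the raw key bytes and the key algorithm name.
    InputStream in = new FileInputStream(filename);
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    byte[] chunk = new byte[4096];
    for (int n; (n = in.read(chunk)) != -1; ) {
        buf.write(chunk, 0, n);
    }
    in.close();
    SecretKey secretKey = new SecretKeySpec(Base64.decode(buf.toByteArray()), "AES");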

The key, when used with the Oracle (Sun) JVM 1.6.0_21, works perfectly when moved from, e.g., Windows to Linux (even between 32-bit and 64-bit OSs).

On OS X (Intel), with Apple's JVM, the key loads without exception, but decrypting any string that was encrypted on Windows or Linux throws a BadPaddingException.

A string is encrypted with the following code:

    Cipher cipher = Cipher.getInstance(algorithm, "BC");
    cipher.init(Cipher.ENCRYPT_MODE, secretKey);
    encryptedString = new String(Base64.encode(cipher.doFinal(string.getBytes())));

where algorithm is AES.
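The decryption side is not shown; presumably it mirrors the code above, roughly as in the sketch below (the variable names are guesses), and the BadPaddingException would be thrown by the doFinal call:

    Cipher cipher = Cipher.getInstance(algorithm, "BC");
    cipher.init(Cipher.DECRYPT_MODE, secretKey);
    // A mode/padding mismatch between the two machines surfaces here
    String decryptedString = new String(cipher.doFinal(Base64.decode(encryptedString)));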

Any clues?

+1  A: 

Padding has nothing to do with the key.

What padding algorithm are you specifying when creating the Cipher?


If you are literally using just "AES" as the cipher algorithm, you should be explicit about the mode and padding. Otherwise, the crypto provider is free to choose some default of its own, and that's likely to vary from machine to machine.
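Concretely, being explicit might look like the sketch below. It is only an illustration: the CBC/PKCS5Padding choice, the prepended-IV convention, the UTF-8 charset, and the class name are assumptions rather than code from the question, and it assumes the Bouncy Castle provider is registered.

    import java.security.SecureRandom;

    import javax.crypto.Cipher;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;

    import org.bouncycastle.util.encoders.Base64;

    public class ExplicitAes {

        // Spell out mode and padding so every JVM/provider agrees on them.
        private static final String TRANSFORMATION = "AES/CBC/PKCS5Padding";

        static String encrypt(SecretKey key, String plaintext) throws Exception {
            Cipher cipher = Cipher.getInstance(TRANSFORMATION, "BC");

            // CBC needs an IV; generate one and prepend it to the ciphertext
            // so the decrypting side can recover it.
            byte[] iv = new byte[16];
            new SecureRandom().nextBytes(iv);
            cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));

            // Fix the charset too; the platform default also varies per OS.
            byte[] ct = cipher.doFinal(plaintext.getBytes("UTF-8"));

            byte[] out = new byte[iv.length + ct.length];
            System.arraycopy(iv, 0, out, 0, iv.length);
            System.arraycopy(ct, 0, out, iv.length, ct.length);
            return new String(Base64.encode(out), "UTF-8");
        }

        static String decrypt(SecretKey key, String encoded) throws Exception {
            Cipher cipher = Cipher.getInstance(TRANSFORMATION, "BC");

            byte[] in = Base64.decode(encoded);
            cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(in, 0, 16));
            return new String(cipher.doFinal(in, 16, in.length - 16), "UTF-8");
        }
    }

Whatever convention is used for transporting the IV, the important part is that both ends construct the Cipher from the same fully specified transformation string.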

erickson
I added the encryption code to my original post.
Alessandro Baldoni