I'm working with an older version of OpenSSL, and I've run into behavior in some cross-platform code that has stumped me for days.

I have code that calls OpenSSL to sign something. My code is modeled after ASN1_sign, found in a_sign.c in OpenSSL, which exhibits the same issue when I use it directly. Here is the relevant line of code (found and used exactly the same way in a_sign.c):

EVP_SignUpdate(&ctx,(unsigned char *)buf_in,inl);

ctx is a structure that OpenSSL uses, not relevant to this discussion
buf_in is a char* of the data that is to be signed
inl is the length of buf_in

EVP_SignUpdate can be called repeatedly in order to read in data to be signed before EVP_SignFinal is called to sign it.
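For context, here is a minimal sketch of the surrounding call sequence, assuming an already-loaded EVP_PKEY, SHA-1 as the digest, and the 0.9.x-era API where EVP_MD_CTX can live on the stack (the wrapper name sign_buffer and the omitted error checks are just illustrative):

#include <openssl/evp.h>

int sign_buffer(EVP_PKEY *pkey, const unsigned char *buf_in,
                unsigned int inl, unsigned char *sig, unsigned int *siglen)
{
  EVP_MD_CTX ctx;

  EVP_SignInit(&ctx, EVP_sha1());     /* choose the digest */
  EVP_SignUpdate(&ctx, buf_in, inl);  /* may be called repeatedly */
  return EVP_SignFinal(&ctx, sig, siglen, pkey);  /* returns 1 on success */
}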

Everything works fine when this code is used on Ubuntu and Windows 7; both produce exactly the same signatures given the same inputs.

On OS X, if inl is 64 or less (that is, buf_in holds at most 64 bytes), then it too produces the same signatures as Ubuntu and Windows. However, once inl exceeds 64, it produces its own internally consistent signatures that differ from the other platforms'. By internally consistent, I mean that the Mac will read its own signatures and verify them as valid, while it rejects the signatures from Ubuntu and Windows, and vice versa.
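Incidentally, 64 bytes is the internal block size of both MD5 and SHA-1, so the divergence seems to begin as soon as the digest has to process a complete block. A quick way to narrow this down is to hash the same buffer directly on each platform and diff the raw digests; here is a minimal sketch, assuming SHA-1 and a release new enough to have EVP_Digest (dump_digest is just an illustrative name):

#include <stdio.h>
#include <openssl/evp.h>

static void dump_digest(const unsigned char *data, unsigned int len)
{
  unsigned char md[EVP_MAX_MD_SIZE];
  unsigned int md_len, i;

  /* Print the raw digest in hex so the output can be diffed across platforms */
  EVP_Digest(data, len, md, &md_len, EVP_sha1(), NULL);
  for (i = 0; i < md_len; i++)
    printf("%02x", md[i]);
  printf("\n");
}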

I managed to fix this issue and get identical signatures on all platforms by changing the line above to the following, which feeds the buffer one byte at a time:

const unsigned char *p = (const unsigned char *)buf_in;
size_t i;
for (i = 0; i < (size_t)inl; i++) {
  EVP_SignUpdate(&ctx, p + i, 1);  /* one byte per call */
}

With only the signing side changed, OS X rejects its own signatures of data > 64 bytes as invalid, so I tracked down a similar line in the verification path that needed to be broken up in the identical manner.
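For completeness, the verification-side change is the direct counterpart; a sketch of that identical byte-at-a-time workaround around EVP_VerifyUpdate:

const unsigned char *p = (const unsigned char *)buf_in;
size_t i;
for (i = 0; i < (size_t)inl; i++) {
  EVP_VerifyUpdate(&ctx, p + i, 1);  /* one byte per call, as above */
}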

This fixes the signature creation and verification, but something is still going wrong, as I'm encountering other problems, and I really don't want to go traipsing (and modifying!) much deeper into OpenSSL.

Surely I'm doing something wrong, as I'm seeing the exact same issues when I use stock ASN1_sign. Is this an issue with the way that I compiled OpenSSL? For the life of me I can't figure it out. Can anyone educate me on what bone-headed mistake I must be making?

+1  A: 

This is likely a bug in the Mac OS X build of OpenSSL. I recommend filing a bug report by sending the text above to the developers, as described at http://www.openssl.org/support/faq.html#BUILD17

levis501
A: 

The most common issue when porting code between Windows and Linux is the content of uninitialized memory. I think Windows debug builds fill it with a sentinel pattern like 0xDEADBEEF, while Linux tends to hand it back zeroed.

Dustin
A: 

There are known issues with OpenSSL on the Mac (you have to jump through a few hoops to ensure it links with the correct library instead of the system library). Did you compile it yourself? The PROBLEMS file in the distribution explains the details of the issue and suggests a few workarounds. (Or, if you are running with shared libraries, double-check that your DYLD_LIBRARY_PATH is correctly set.) No guarantee, but this looks like a likely place to start...
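One quick sanity check along those lines: compare the version your binary was compiled against with the version it actually loads at run time. A minimal sketch using the 0.9.x-era API (newer releases spell SSLeay_version as OpenSSL_version):

#include <stdio.h>
#include <openssl/opensslv.h>
#include <openssl/crypto.h>

int main(void)
{
  /* Header version baked in at compile time... */
  printf("built against: %s\n", OPENSSL_VERSION_TEXT);
  /* ...versus the library the dynamic loader actually bound. A mismatch
     suggests the system libcrypto was picked up instead of yours. */
  printf("running with : %s\n", SSLeay_version(SSLEAY_VERSION));
  return 0;
}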

jsegal