I've been working on a project where I need to transfer large volumes of binary data. Some of it is structured (structs, enums, lists, etc.) and some is raw binary. I'm trying to develop a strategy for dealing with the issues that arise when transferring the data between a 64-bit build and a 32-bit build.

Presently I mark the assembly to target only 32-bit to avoid the issue, but I want to open it up to both 64-bit and 32-bit by targeting AnyCPU.

I know I "could" use serialization, but it adds quite a lot of extra data to the transfer, and because these transfers can travel across a VPN, etc., it was decided not to use plain .Net serialization due to the added overhead.

Therefore, when you do this, would it be best to:

  1. Add one byte to EACH block of data transferred, specifying the bit size used to encode the data on the sending end
  2. Add one byte to the initial handshake between the systems, specifying the bit size that will be used for all communication
  3. Don't use default data types (integer, short, etc.) and instead encode values explicitly, so the transfer does not depend on data type sizes that vary by system (see the sketch below)
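
For illustration, here is a minimal sketch of option 3, using a hypothetical SampleRecord type (the type and field names are made up for this example). BinaryWriter and BinaryReader write and read each field through an explicitly typed overload, so the byte layout is the same whichever bitness the writing process was compiled for:

using System;
using System.IO;

// Hypothetical record used for illustration only.
struct SampleRecord
{
    public int Id;       // always encoded as 4 bytes
    public long Offset;  // always encoded as 8 bytes
}

static class SampleCodec
{
    // Encode with explicitly sized types so the layout never
    // depends on the bitness of the writing process.
    public static byte[] Encode(SampleRecord r)
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(r.Id);      // Int32 -> 4 bytes
            w.Write(r.Offset);  // Int64 -> 8 bytes
            return ms.ToArray();
        }
    }

    public static SampleRecord Decode(byte[] data)
    {
        using (var ms = new MemoryStream(data))
        using (var r = new BinaryReader(ms))
        {
            return new SampleRecord { Id = r.ReadInt32(), Offset = r.ReadInt64() };
        }
    }
}

BinaryWriter also always writes little-endian, so that part of the layout stays consistent between the two ends as well.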
A:

.Net guarantees that an Int32 will always be 32 bits:

The Int32 value type represents signed integers with values ranging from negative 2,147,483,648 through positive 2,147,483,647.

And int is simply an alias for the System.Int32 type. The same holds true for the other value types in .Net (byte, short, long, etc).

But if you'd like proof, here's the output of:

Console.WriteLine(int.MaxValue);
Console.WriteLine(sizeof(int));

running on my machine, under 64-bit Windows Server 2008, targeting x64:

2147483647
4
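
The same check can be extended to the other value types; sizeof for these is a compile-time constant in C#, so the output does not change with the target platform:

Console.WriteLine(sizeof(byte));   // always 1
Console.WriteLine(sizeof(short));  // always 2
Console.WriteLine(sizeof(long));   // always 8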
Michael Petrotta
Thanks, just what I needed – Paul Farry