I've wrapped a DLL method that has an integer as an out parameter in a web service. In testing, I found that where I expected -1 I was getting 65,535 instead. I realised that the DLL was using 16-bit integers, while I was specifying the standard .NET 32-bit integer when referencing the external DLL in my code. This was quickly fixed by specifying a 16-bit integer, and all is well.
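
For reference, the declaration looked roughly like this (a minimal sketch; "NativeLib.dll" and "GetValue" are placeholders, not the real DLL name or entry point):

```csharp
using System.Runtime.InteropServices;

static class NativeMethods
{
    // What I originally had: the out parameter declared as a 32-bit int,
    // even though the native code only writes 16 bits into it.
    // [DllImport("NativeLib.dll")]
    // static extern void GetValue(out int result);

    // The fix: declare the parameter with the 16-bit type the DLL actually uses.
    [DllImport("NativeLib.dll")]
    static extern void GetValue(out short result);
}
```
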
My question is: why did this happen? I could understand an overflow occurring if I were trying to fit a 32-bit integer into a 16-bit integer, but I'm not sure why it happens the other way round. Clearly my understanding of casting between these types is a little lacking, so any guidance would be greatly appreciated.
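
For what it's worth, I can reproduce the value I was seeing without the DLL at all, just by reading the 16-bit pattern for -1 as if it were unsigned (again, only an illustration of the symptom, not my actual code):

```csharp
using System;

// -1 as a 16-bit value has the two's-complement bit pattern 0xFFFF.
short sixteenBit = -1;

// Reading those same 16 bits without a sign gives 65,535.
int reinterpreted = unchecked((ushort)sixteenBit);

Console.WriteLine(reinterpreted); // prints 65535
```
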