views: 128

answers: 4
We are trying to settle on a client and server standard, and there is a big debate. One school of thought is a C# client and Java servers, using some type of proprietary message library to share data objects (think of an XML-like structure).

The issue with this model is that there is a lot of code that needs to be duplicated (validation, parsing) that could simply be reused if you went with C# on the server as well. If there is a big push to use Linux machines, wouldn't Mono support that goal?

Anyone else have this dilemma?

A: 

I would try to keep the client and the server on the same runtime (Java or the CLR). We have had plenty of success using a mix of Mono and .NET clients and servers.

Greg Dean
+4  A: 

There are frameworks for this: ICE (from ZeroC), Protocol Buffers, etc.

For example, my Protocol Buffers implementation (protobuf-net) works on Mono, MS .NET, the Compact Framework, Silverlight, etc., and the binary format is compatible with a range of languages (Java, etc.). If you start from a .proto (a bespoke definition language), you can use it to generate the object layer in each language you need.
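
As a rough illustration (not from the answer itself), the C# side of such a contract might look something like the sketch below. The Person type and its fields are invented for the example; the attributes and Serializer calls follow protobuf-net's usual pattern:

    using System.IO;
    using ProtoBuf;

    // Hypothetical message type; the field numbers (1, 2) are what tie
    // this C# contract to the matching message on the Java side.
    [ProtoContract]
    public class Person
    {
        [ProtoMember(1)]
        public int Id { get; set; }

        [ProtoMember(2)]
        public string Name { get; set; }
    }

    public static class WireExample
    {
        public static byte[] ToBytes(Person person)
        {
            // Serializer.Serialize writes the compact protobuf wire format,
            // which the Java protobuf runtime can read back as long as the
            // field numbers and types match.
            using (var ms = new MemoryStream())
            {
                Serializer.Serialize(ms, person);
                return ms.ToArray();
            }
        }

        public static Person FromBytes(byte[] data)
        {
            using (var ms = new MemoryStream(data))
            {
                return Serializer.Deserialize<Person>(ms);
            }
        }
    }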

Marc Gravell
+1 for using the word bespoke
Greg Dean
+1  A: 

In any case I would use Protocol Buffers, or something else defined in a language-neutral form, for the comms, to ensure that you aren't restricted in the future.

Once you've done that, you can start with C# on Mono. If that proves to be unworkable, you can switch to a different language.
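
To make the "language-neutral form" concrete, a minimal sketch of such a definition might look like the following (the message name and fields are invented for the example; proto2-style syntax). Both the Java protoc compiler and protobuf-net can generate their respective object layers from the same file:

    // order.proto - hypothetical wire contract shared by client and server.
    message Order {
        required int32  id       = 1;  // field numbers, not names, go on the wire
        required string customer = 2;
        repeated string items    = 3;
    }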

Douglas Leeder
+1  A: 

I would suggest you use some old-but-good standard for transferring data between the two; SOAP and XML-RPC come to mind as examples. If neither of these is possible or feasible, you may want to try JSON or write your own XML format. Protocol Buffers has been mentioned around quite a lot lately, but I haven't looked into it, so I really can't say anything about it.
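
If you do go the plain-XML route, a minimal C#-side sketch using the standard XmlSerializer might look like this (the OrderMessage type is hypothetical; the Java side would simply parse the same XML with whatever parser it prefers):

    using System.IO;
    using System.Xml.Serialization;

    // Hypothetical DTO; XmlSerializer turns it into plain XML text.
    public class OrderMessage
    {
        public int Id { get; set; }
        public string Customer { get; set; }
    }

    public static class XmlExample
    {
        public static string ToXml(OrderMessage message)
        {
            var serializer = new XmlSerializer(typeof(OrderMessage));
            using (var writer = new StringWriter())
            {
                serializer.Serialize(writer, message);
                return writer.ToString();
            }
        }

        public static OrderMessage FromXml(string xml)
        {
            var serializer = new XmlSerializer(typeof(OrderMessage));
            using (var reader = new StringReader(xml))
            {
                return (OrderMessage)serializer.Deserialize(reader);
            }
        }
    }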

Esko