I have been working to create WCF services that will operate independently of .NET clients. Thanks to Google and StackOverflow, I have been able to create both simple XML and JSON services without SOAP wrappers and a bunch of fancy WCF stuff that I just don't need. It has been a painful experience, hence the subject line of this question. WCF is mad buggy on the client side when you use WebGet and WebInvoke with an automatically added service reference.
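For context, this is roughly the shape of the contracts I'm talking about (the names `IOrderService` and `Order` are just placeholders for this question, not my actual service):

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class Order
{
    [DataMember] public string Id { get; set; }
    [DataMember] public decimal Total { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    // Plain XML over GET -- no SOAP envelope.
    [OperationContract]
    [WebGet(UriTemplate = "orders/{id}", ResponseFormat = WebMessageFormat.Xml)]
    Order GetOrder(string id);

    // JSON in and out over POST.
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "orders",
        RequestFormat = WebMessageFormat.Json,
        ResponseFormat = WebMessageFormat.Json)]
    Order SubmitOrder(Order order);
}
```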
To inspect the communication, I've been building a WCF client locally and routing everything through Fiddler. That way, whether the call works or not, I can at least see what the client is trying to send. And once it finally does work, I can see the data going both ways and duplicate the exchange in a non-.NET client.
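In case it matters, this is how I'm pushing the client's traffic through Fiddler, assuming Fiddler is listening on its default port 8888 (the HTTP bindings pick up the default proxy because `useDefaultWebProxy` is true by default):

```xml
<!-- Client app.config: route outgoing HTTP through Fiddler for inspection. -->
<system.net>
  <defaultProxy>
    <proxy proxyaddress="http://127.0.0.1:8888" bypassonlocal="false" />
  </defaultProxy>
</system.net>
```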
My current problem is that when I change the service to expect the POST data as JSON (the enableWebScript behavior), the client has no idea and still tries to send the objects as XML. I've had a ton of issues with the client's config not being set up properly when using Add Service Reference, so I'm hoping it's something simple I can add to the app.config on the client. When using XML, the objects I create and use in the service are automatically XML-serialized by the client (which is very convenient). Is the same even possible with JSON in the current version of WCF?
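Something like the following is what I was hoping would be enough on the client side to match the service (the address and contract name are placeholders, and I may well be missing pieces, which is really what I'm asking):

```xml
<!-- Client app.config sketch: webHttpBinding endpoint with the same
     enableWebScript behavior the service uses. -->
<system.serviceModel>
  <behaviors>
    <endpointBehaviors>
      <behavior name="jsonBehavior">
        <enableWebScript />
      </behavior>
    </endpointBehaviors>
  </behaviors>
  <client>
    <endpoint address="http://localhost/MyService.svc"
              binding="webHttpBinding"
              behaviorConfiguration="jsonBehavior"
              contract="ServiceReference1.IOrderService" />
  </client>
</system.serviceModel>
```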
It should be noted that I was able to figure out what I need to do manually and get it to work in raw form with Fiddler's request builder, so I can serialize my objects in code and send the data myself via HTTP POST... that's how I'm doing it in my non-.NET clients anyway. This is more a question about understanding the WCF side better: why am I missing so many attributes and settings on the client, and why is there little to no documentation that addresses these issues?
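To be concrete, the manual approach I fell back on looks more or less like this, reusing the placeholder `Order` data contract from the first sketch (URL and names are made up for the example):

```csharp
using System.IO;
using System.Net;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract]
public class Order
{
    [DataMember] public string Id { get; set; }
    [DataMember] public decimal Total { get; set; }
}

public static class ManualJsonClient
{
    // Serialize the data contract to JSON by hand and POST it over plain HTTP.
    public static string PostOrder(string url, Order order)
    {
        var serializer = new DataContractJsonSerializer(typeof(Order));
        byte[] body;
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, order);  // JSON-serialize the object
            body = stream.ToArray();
        }

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "application/json";
        request.ContentLength = body.Length;
        using (var requestStream = request.GetRequestStream())
        {
            requestStream.Write(body, 0, body.Length);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();  // raw JSON back from the service
        }
    }
}
```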