Hello,
I'm having a problem with a WCF call from a Windows service to my WCF service running on my web server. The call had been working for a number of weeks, but then suddenly stopped working and has not worked since.
The exception I'm getting is:
"General Error Occurred System.ServiceModel.CommunicationException: An error occurred while making the HTTP request"
and then it says "This could be due to the fact that the server certificate is not configured properly with HTTP.SYS in the HTTPS case. This could also be caused by a mismatch of the security binding between the client and the server."
The binding I'm using on both ends is wsHttpBinding, without any security or encryption. It's also plain HTTP, not HTTPS, so I'm not sure why the error mentions HTTPS.
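For reference, the binding configuration on both sides looks roughly like this (the binding name here is just a placeholder, not the actual name from our config):

    <system.serviceModel>
      <bindings>
        <wsHttpBinding>
          <!-- Plain HTTP: no transport or message security -->
          <binding name="PlainHttpBinding">
            <security mode="None" />
          </binding>
        </wsHttpBinding>
      </bindings>
    </system.serviceModel>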
The rest of the inner exception stack is:
"SystemNet.WebException: The underlying connection was closed: An unexpected error occurred on a send. ---> System.IO.IOException: Unable to write data to the transport connection: An invalid argument was supplied. ---> System.Net.Sockets.SocketException: An invalid argument was supplied at System.Net.Sockets.Socket.MultipleSend(BufferOffsetSize[] buffers, SocketFlags socketFlags) at System.Net.Sockets.NetworkStream.MultipleWrite(BufferOffsetSize[] buffers)"
I should also note that the exception is thrown on the "Execute" line of the call to the web service - that is, right as soon as I call the web service and pass it the wrapped-up DataContract object, it blows up.
All this service is doing is getting passed a large amount of XML (passed as a .NET object to the call on the client side), which it then does some work with. Probably about 100-200 KB of XML is being transmitted. I've raised the data size limits on both ends to over 6 MB (see the binding settings below), but that didn't seem to help.
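By "raised the limits" I mean roughly the following attributes on the binding (the values here are approximate, not the exact numbers from our config):

    <wsHttpBinding>
      <!-- 6291456 bytes = 6 MB; the default maxReceivedMessageSize is 65536 -->
      <binding name="PlainHttpBinding"
               maxReceivedMessageSize="6291456"
               maxBufferPoolSize="6291456">
        <readerQuotas maxStringContentLength="6291456" maxArrayLength="6291456" />
        <security mode="None" />
      </binding>
    </wsHttpBinding>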
Any ideas?
Some more information on this issue:
When we duplicate the client environment locally, we find that we cannot upload large amounts of XML unless we make the following changes (config snippets below):
1. On the server, set "maxRequestLength" to 100 MB (way higher than we are sending).
2. On the client, set maxItemsInObjectGraph under the dataContractSerializer tag to "2147483646".
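For reference, those two changes look roughly like this (the behavior name is just an example; the client endpoint references it via behaviorConfiguration):

Server web.config:

    <system.web>
      <!-- maxRequestLength is in KB, so 102400 = 100 MB -->
      <httpRuntime maxRequestLength="102400" />
    </system.web>

Client app.config:

    <behaviors>
      <endpointBehaviors>
        <behavior name="LargeGraphBehavior">
          <dataContractSerializer maxItemsInObjectGraph="2147483646" />
        </behavior>
      </endpointBehaviors>
    </behaviors>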
With these changes, our local installation uploads successfully. However, the client's install on their server still fails. What's interesting to note is that once we made the maxRequestLength change on the server, our test installation started throwing an error specifically relating to the maxItemsInObjectGraph setting, whereas on our client's server the original "HTTP.sys" error is still occurring.
As I noted before, we are not using SSL at all, and there are two other web service calls that execute and upload XML in the same way. However, since the non-working service call transmits more data, this appears to be a size issue.
However, if the issue the client is having were the same one our test install had, I don't understand why the client's error message isn't the ObjectGraph error.
Is it possible that we're just getting the generic "invalid parameter" / "HTTP.sys" error for every possible failure on the client (i.e., it's really hitting the objectGraph error too, but just isn't reporting it)?
Thanks!