I have an out-of-process COM server written in C++, which is called by some C# client code. A method on one of the server's interfaces returns a large BSTR to the client, and I suspect that this is causing a memory leak. The code works, but I am looking for help with marshalling out BSTRs.
Simplifying a bit, the IDL for the server method is
HRESULT ProcessRequest([in] BSTR request, [out] BSTR* pResponse);
and the implementation looks like:
HRESULT MyClass::ProcessRequest(BSTR request, BSTR* pResponse)
{
    USES_CONVERSION;
    // OLE2A converts into a stack buffer (via _alloca), so its result
    // must not be deleted
    char* pszRequest = OLE2A(request);
    char* pszResponse = BuildResponse(pszRequest);
    *pResponse = A2BSTR(pszResponse);
    delete pszResponse;
    return S_OK;
}
A2BSTR internally allocates the BSTR using SysAllocStringLen().
In the C# client I simply do the following:
string request = "something";
string response = "";
myserver.ProcessRequest(request, out response);
DoSomething(response);
This works, in that request strings get sent to the COM server and correct response strings come back to the C# client. But every round trip to the server leaks memory in the server process. The CRT leak-detection support shows no significant leaks on the CRT heap, so I suspect the leaked memory was allocated with IMalloc.
Am I doing anything wrong here? I have found vague comments that 'all [out] parameters must be allocated with CoTaskMemAlloc, otherwise the interop marshaller won't free them', but no details.
Andy