There are a few tutorials on the web that describe consuming a web service using SQL Server 2005's CLR integration. For the most part, the process seems pretty convoluted. I've run into several issues, including the need to change my database's trust level and to use the sgen tool to create a static XmlSerializer assembly, and I still haven't gotten it working right... (I'm sure I just need to put a little more time and energy into it.)

What are the security, performance, and maintenance implications when going to this type of architecture? This would likely be a fairly heavily-used process, and ease of maintenance is relatively important.

I do have the freedom to choose whether to integrate this into SQL Server as a UDF or to have it be a stand-alone .NET library for console/web applications. Is the SQL CLR integration with external assemblies worth the trouble?
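
For context, the shape of what I've been attempting looks roughly like the sketch below; the proxy class, method, and URL are only placeholders for whatever wsdl.exe actually generates in my case.

    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public partial class UserDefinedFunctions
    {
        // Hypothetical SQLCLR wrapper around a wsdl.exe-generated proxy class.
        // The assembly has to be cataloged with EXTERNAL_ACCESS, and the
        // sgen-generated XmlSerializers assembly deployed alongside it.
        [SqlFunction(DataAccess = DataAccessKind.None)]
        public static SqlString GetQuoteFromService(SqlString symbol)
        {
            // "StockQuoteService" and "GetQuote" stand in for the generated proxy.
            using (StockQuoteService svc = new StockQuoteService())
            {
                svc.Url = "http://example.com/StockQuoteService.asmx"; // placeholder
                return new SqlString(svc.GetQuote(symbol.Value));
            }
        }
    }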

+1  A: 

I think you have answered your own question. I personally find that anything calling a web service is more than likely better suited to exist OUTSIDE of SQL Server. The complications, the elevated trust levels, and, as you mentioned, the overall convoluted process make for a hard-to-document and hard-to-maintain solution.

Mitchel Sellers
+2  A: 

The short answer is, no, SQL CLR Integration is probably not worth the trouble.

The longer answer has several points, beginning with programming the CLR in the database. It's a fine tool when used correctly, but it does increase memory consumption and can lead to performance issues if done carelessly. I use it in my database for very specialized functionality, such as adding RegEx ability, but it's used sparingly, with well-tested code, to prevent as many issues as possible from cropping up.
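
As an illustration (not my exact code), the RegEx piece amounts to little more than a thin wrapper like this sketch:

    using System.Data.SqlTypes;
    using System.Text.RegularExpressions;
    using Microsoft.SqlServer.Server;

    public partial class UserDefinedFunctions
    {
        // Exposes Regex.IsMatch to T-SQL; it touches no external resources,
        // so the assembly can stay under the SAFE permission set.
        [SqlFunction(IsDeterministic = true, IsPrecise = true)]
        public static SqlBoolean RegexIsMatch(SqlString input, SqlString pattern)
        {
            if (input.IsNull || pattern.IsNull)
                return SqlBoolean.Null;

            return new SqlBoolean(Regex.IsMatch(input.Value, pattern.Value));
        }
    }

It can then be called from T-SQL like any other scalar UDF.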

A second point is that, as you pointed out, you have to modify security settings, which opens up potential risks.

Use a stand-alone application to load the data into your server. You'll have more control, less risk, and a much easier time of it.
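
A rough sketch of that kind of loader, assuming a wsdl.exe-generated proxy and SqlBulkCopy (the service, connection string, table, and column names are all placeholders):

    using System.Data;
    using System.Data.SqlClient;

    class Loader
    {
        static void Main()
        {
            // Hypothetical wsdl.exe-generated proxy; names are placeholders.
            CustomerService svc = new CustomerService();
            CustomerRecord[] records = svc.GetAllCustomers();

            // Shape the results into a DataTable and bulk-load them.
            DataTable table = new DataTable();
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Name", typeof(string));
            foreach (CustomerRecord r in records)
                table.Rows.Add(r.Id, r.Name);

            using (SqlConnection conn = new SqlConnection(
                "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI")) // placeholder
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                conn.Open();
                bulk.DestinationTableName = "dbo.Customers"; // placeholder
                bulk.WriteToServer(table);
            }
        }
    }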

Josef
A: 

I have been writing CLR procedures that call web services on both Exchange and AD, and I agree with the posts above. It works, but we quickly ran into out-of-memory issues because of the special way memory is handled for the CLR inside SQL Server. As you can imagine, performance is OK for small queries, but it does not scale at all.

Generally, your database's performance determines the performance of your application, and I think putting this kind of logic in your database is a no-no unless you have complete control over what you are doing.

Use the CLR for simple text manipulations and other calculations that do not depend on external resources.

TT
A: 

I think this is the right place to put this question.

I followed those tutorials to develop code that consumes the web service and INSERTs the data into a SQL Server table. And, as you guys mentioned... I am running into memory issues now.

I just want to change my architecture to get rid of this CLR approach, but I don't know how. Below is what I have done so far.

  • Used wsdl.exe to generate source code (proxy classes) from the WSDL files
  • Developed my own code that calls methods on those proxy classes to retrieve the data
  • Compiled my own code and the proxy classes into a single DLL
  • Generated what they call the serialization assembly for that DLL
  • Deployed both DLLs as ASSEMBLIES in SQL Server

In my code I store the extracted data in an ArrayList and then traverse that ArrayList to insert the records into SQL Server tables using SqlConnection and SqlCommand objects.

When I try to extract more than 9000 records from the web service and load them into the SQL Server table, I get the following error:

.NET Framework execution was aborted by escalation policy because of out of memory.

Please help me get rid of this issue, either by a change in architecture or by some tweak... whatever it takes. I have read about this issue; they say it's due to how CLR memory is managed inside SQL Server.
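
One tweak I have been considering is to drop the ArrayList and write each record through the context connection as it arrives, roughly like the sketch below (the proxy class and table/column names are placeholders), though from the answers above it sounds like the real fix is to move this out of SQL Server entirely.

    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.SqlServer.Server;

    public partial class StoredProcedures
    {
        // Writes each record as it arrives instead of buffering the whole
        // result set in an ArrayList first. "DataService" and "DataRecord"
        // stand in for the wsdl.exe-generated proxy types.
        [SqlProcedure]
        public static void LoadFromService()
        {
            DataService svc = new DataService();

            using (SqlConnection conn = new SqlConnection("context connection=true"))
            {
                conn.Open();
                SqlCommand cmd = new SqlCommand(
                    "INSERT INTO dbo.TargetTable (Id, Name) VALUES (@Id, @Name)", conn);
                cmd.Parameters.Add("@Id", SqlDbType.Int);
                cmd.Parameters.Add("@Name", SqlDbType.NVarChar, 200);

                foreach (DataRecord rec in svc.GetRecords())
                {
                    cmd.Parameters["@Id"].Value = rec.Id;
                    cmd.Parameters["@Name"].Value = rec.Name;
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }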

Please pass on some pointers, guys.