I'm using SQLCLR in my project and, for the record, I'm very pleased with it. However, I can't find any good sources of information on data access patterns for it.

Usually I handle communication between .NET and SQL Server through stored procedures, because I always want an API for my database. In this case, however, the .NET code is itself part of the API, so stored procedures seem like a poor fit.

LINQ to SQL does not exist inside SQL Server (although it can be installed by doing things DBAs won't like), so it's not an option.

What I currently have is my code cluttered with standard ADO.NET code like

// c is assumed to be an open SqlConnection; inside SQLCLR this is typically
// the context connection ("context connection=true").
using (SqlCommand cmd = c.CreateCommand()) {
    cmd.CommandText = "SELECT ... FROM ...";
    using (SqlDataReader rdr = cmd.ExecuteReader()) {
        DoSomething(rdr);
    }
}

and, although it works, it just seems very much like the wrong way to do it.

How do other people do it?

+1  A: 

I use XSLT to generate the C# code for the DAL. I usually load the XML from the database definition itself (e.g. some form of SELECT name, type, length, ... FROM sys.columns JOIN sys.tables JOIN sys.types FOR XML PATH) and over time I have developed custom XSLT transformations for the code generation. They are part of the Visual Studio build process itself, similar to this blog: http://rusanu.com/2009/04/11/using-xslt-to-generate-performance-counters-code/ (the blog is about performance counter generation, but it applies just as well to DAL code). I also build types for the result sets using XSLT code generation.
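
For illustration, a minimal sketch of that pipeline, assuming a simplified metadata query and placeholder file names (DalCodeGen.xslt, GeneratedDal.cs); the actual stylesheets are custom to the project:

// Hedged sketch: pull table/column metadata as XML, then run a custom
// XSLT over it to emit C# DAL source. File names below are placeholders.
using System.Data.SqlClient;
using System.IO;
using System.Xml;
using System.Xml.Xsl;

class DalCodeGen {
    static void Main() {
        const string metadataQuery = @"
            SELECT t.name  AS [table/@name],
                   c.name  AS [table/column/@name],
                   ty.name AS [table/column/@type],
                   c.max_length AS [table/column/@length]
            FROM sys.columns c
            JOIN sys.tables t  ON c.object_id = t.object_id
            JOIN sys.types ty  ON c.user_type_id = ty.user_type_id
            FOR XML PATH(''), ROOT('schema')";

        // Load the schema description straight from the database definition.
        XmlDocument schema = new XmlDocument();
        using (SqlConnection conn = new SqlConnection("Data Source=.;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(metadataQuery, conn)) {
            conn.Open();
            using (XmlReader rdr = cmd.ExecuteXmlReader())
                schema.Load(rdr);
        }

        // Transform the schema XML into C# source with the custom stylesheet.
        XslCompiledTransform xslt = new XslCompiledTransform();
        xslt.Load("DalCodeGen.xslt");
        using (StreamWriter output = File.CreateText("GeneratedDal.cs"))
            xslt.Transform(schema, null, output);
    }
}

Hooking the transform into the Visual Studio build, as in the linked post, keeps the generated DAL in sync with the schema.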

While on heavy clients this approach overlaps with ORM and ADO.NET DataSet functionality (I still use it on those clients, but that is my problem...), in SQLCLR it is, imho, the best fit, due to the SQLCLR-specific limitations.

This approach is first and foremost very flexible, allowing me to quickly make global changes that affect every DAL entry point, while I keep absolute control over the code base (no external dependencies, no externally introduced bugs). It is extremely lightweight compared to ADO.NET DataSets.

I would only trade off this approach for LINQ, because of the added value of passing IQueryable objects around. But in SQLCLR, as you know, this is not yet a viable option.

Remus Rusanu
I actually do pretty much the same myself in non-SQLCLR, but my solution is dependent on expression trees (which are defined in System.Core) to generate where clauses. How do you generate your where clauses?
erikkallen
So far I have never needed to generate complex WHERE expressions. I always look up items by one of the keys, and simple lookup methods (WHERE key = @value) are generated from the XSLT. Judging from your answer to Jonathan, you need something at runtime, not at compile time.
Remus Rusanu
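
For illustration, a generated lookup method of the kind described here might look roughly like this (the Customer type, table, and key column are invented for the example, not actual generator output):

using System.Data;
using System.Data.SqlClient;

public class Customer { public int CustomerID; public string Name; }

public static class GeneratedDal {
    // Hypothetical shape of a generated single-key lookup (WHERE key = @value).
    public static Customer GetCustomerById(SqlConnection c, int customerId) {
        using (SqlCommand cmd = c.CreateCommand()) {
            cmd.CommandText =
                "SELECT CustomerID, Name FROM dbo.Customers WHERE CustomerID = @CustomerID";
            cmd.Parameters.Add("@CustomerID", SqlDbType.Int).Value = customerId;
            using (SqlDataReader rdr = cmd.ExecuteReader()) {
                if (!rdr.Read()) return null;
                return new Customer { CustomerID = rdr.GetInt32(0), Name = rdr.GetString(1) };
            }
        }
    }
}
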
No, in this case I'm interested in those simple queries, to look up the metadata I use in my complex logic.
erikkallen
This is what I ended up doing (well, not XSLT, but T4 templating). I hope I will open-source my solution some day. For the time being, I guess the appropriate response is "there is no good, standard way", which is essentially what this answer says.
erikkallen
A: 

For the limited places where SQLCLR actually provides a performance gain over proper set-based TSQL, I do data access exactly the way you have shown above. You'd have to be doing some pretty heavy loop-based processing that can't be done set-based, XML parsing, or extremely complex math to really need SQLCLR. If you are using SQLCLR just for data access, you are doing so at the cost of performance. If you'd like some demonstrations of this, let me know and I'll pull out my AdventureWorks examples from last year's presentations.

Jonathan Kehayias
My primary use for the SQLCLR is to generate SQL from a DSL using metadata in some tables. This would be incredibly cumbersome to do in T-SQL. My issues are not performance-related but related to maintainability and general good style.
erikkallen
If you are building your resulting TSQL dynamically in SQLCLR, you could do the same in TSQL and use sp_executesql to call it with parameterization as well. The performance of sp_executesql would be better than that of equivalent SQLCLR for the same operation, and the code should be just as manageable. SQLCLR is a great tool for adding performance, but it can just as easily cost you performance for tasks that only do data access, despite having what you might consider "friendlier" code from a maintainability standpoint.
Jonathan Kehayias
See Erland Sommarskog's article on dynamic SQL, which covers how to use dynamic code in TSQL to handle complexity like this: http://www.sommarskog.se/dynamic_sql.html
Jonathan Kehayias
@Jonathan: If you happen to have an ANTLR runtime for T-SQL I could perhaps consider it.
erikkallen