I have a situation where a certain LINQ query that runs against a multi-million row table is taking far too long. I dissected the LINQ query output and found that it was creating the parameters for the where clause using the wrong data type. For instance, one field is defined as Char(12) in the database, but the parameter it was compared against was declared as NVarChar(12). Once I changed the query to use Char instead of NVarChar, it ran sub-second as it should.

Is there a way to get LINQ to SQL to use the correct data type as defined in the .dbml file for that column? I double-checked, and it's defined as DbType="Char(12)" in the data context .dbml file.
You can get the command and reset the types of the parameters on it directly (to AnsiString in your case).
http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.getcommand.aspx
http://msdn.microsoft.com/en-us/library/system.data.dbtype.aspx
Then you might call ExecuteReader on that command, yielding a DbDataReader. You can hand this DbDataReader to the Translate method of your DataContext, and it will give you the IEnumerable&lt;T&gt; that you'd expect from LINQ.
http://msdn.microsoft.com/en-us/library/bb534213.aspx
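Here's a minimal sketch of that whole workflow. It assumes a data context db with a Customers table whose Code column is Char(12); the entity and column names are illustrative, not taken from your schema:

using System.Collections.Generic;
using System.Data;
using System.Data.Common;
using System.Linq;

var query = db.Customers.Where(c => c.Code == "ABC123456789");

DbCommand cmd = db.GetCommand(query);
foreach (DbParameter p in cmd.Parameters)
{
    // DbType.String maps to nvarchar; DbType.AnsiString maps to varchar/char,
    // which matches the Char(12) column. For an exact fixed-length match you
    // could use DbType.AnsiStringFixedLength instead.
    if (p.DbType == DbType.String)
        p.DbType = DbType.AnsiString;
}

db.Connection.Open();
try
{
    using (DbDataReader reader = cmd.ExecuteReader())
    {
        // Translate materializes the reader's rows into the mapped entity type.
        List<Customer> results = db.Translate<Customer>(reader).ToList();
    }
}
finally
{
    db.Connection.Close();
}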
The performance problem is caused by the query parameter having a different type than the indexed column. Because nvarchar has higher data type precedence than char in SQL Server, the column side of the comparison gets implicitly converted: every key in the index must be converted to the parameter's type, so a potential index seek degrades into a scan. This happens every time the query is issued - the converted values don't hang around for later querying.
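For illustration, here's the same comparison issued through plain ADO.NET against the same hypothetical Customers table (connectionString is assumed to be defined elsewhere). Declaring the parameter as Char matches the column and allows a seek; declaring it as NVarChar triggers the conversion described above:

using System;
using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT Name FROM Customers WHERE Code = @code", conn))
{
    // SqlDbType.Char matches the column's declared type, so an index on Code
    // can be seeked. SqlDbType.NVarChar would instead convert the column side
    // of the comparison and scan the index.
    cmd.Parameters.Add("@code", SqlDbType.Char, 12).Value = "ABC123456789";

    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetString(0));
    }
}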
I usually see this behavior when sending a collection of strings into the database:
// this query will get the correct parameter type
var byName = db.Customers.Where(c => c.Name == "Bob");

// this query can get an incorrect parameter type
List<string> names = new List<string> { "Amy", "Bob" };
var byNames = db.Customers.Where(c => names.Contains(c.Name));
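In the Contains case, each element of the list typically becomes its own string parameter in the generated IN clause, and string parameters default to nvarchar, so the mismatch against a char column reappears there. The GetCommand/Translate workaround above applies to that shape of query as well.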