I have the following code snippet:

    var matchingAuthors = from authors in DB.AuthorTable
                          where m_authors.Keys.Contains(authors.AuthorId)
                          select authors;

    foreach (AuthorTableEntry author in matchingAuthors)
    {
        ....
    }

where m_authors is a Dictionary containing the "Author" entries, and DB.AuthorTable is a SQL table. When the size of m_authors grows beyond a certain value (somewhere around the 3000-entry mark), I get an exception:

System.Data.SqlClient.SqlException: The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. 
Too many parameters were provided in this RPC request. The maximum is 2100.

Is there any way I can get around this and work with a larger size dictionary? Alternatively, is there a better way to get all rows in a SQL table where a particular column value for that row matches one of the dictionary entries?

+2  A: 

LINQ to SQL translates a local Contains() into a parameterized IN clause:

...
WHERE AuthorId IN (@p0, @p1, @p2, ...)
...

So the error you're seeing means your query exceeded SQL Server's limit of 2100 parameters per request, since it sends one parameter per key. I can think of two options:

  1. Select the whole table and filter using LINQ to Objects.

  2. Generate an expression tree from your keys: see Option 2 here.
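A minimal sketch of option 2, assuming the AuthorTableEntry type and int keys from the question: instead of Contains(), build a chain of OR'd equality checks, which LINQ to SQL renders as literals rather than parameters, so the 2100-parameter limit never comes into play.

    // Builds "row => row.AuthorId == k1 || row.AuthorId == k2 || ..."
    static Expression<Func<AuthorTableEntry, bool>> BuildKeyFilter(IEnumerable<int> keys)
    {
        ParameterExpression row = Expression.Parameter(typeof(AuthorTableEntry), "row");
        Expression body = Expression.Constant(false);
        foreach (int key in keys)
        {
            // Each key becomes a constant in the expression tree,
            // emitted as a literal in the generated SQL.
            body = Expression.OrElse(
                body,
                Expression.Equal(
                    Expression.Property(row, "AuthorId"),
                    Expression.Constant(key)));
        }
        return Expression.Lambda<Func<AuthorTableEntry, bool>>(body, row);
    }

    // Usage:
    // var matchingAuthors = DB.AuthorTable.Where(BuildKeyFilter(m_authors.Keys));

Note that a very long OR chain has its own costs (query compilation time, plan cache churn), so this trades one limit for another.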

dahlbyk
A: 

Another option is to consider how you populate m_authors and whether you can include that in the query as a query element itself so it turns into a server-side join/subselect.

DamienG
A: 

Depending on your requirements, you could break the work into multiple smaller chunks (the first thousand keys, the second thousand, etc.). This runs certain risks if your data is read-write and changes frequently, but it may scale better than pulling back thousands of rows in one big gulp. And if your data can be processed in parts (i.e. without the entire set in memory), you could hand each chunk off to a separate thread for processing while you pull back the next one.
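A sketch of that chunking approach, reusing the query from the question; the batch size of 2000 is an arbitrary choice kept safely under the 2100-parameter limit:

    const int BatchSize = 2000;
    List<int> keys = m_authors.Keys.ToList();

    for (int i = 0; i < keys.Count; i += BatchSize)
    {
        // Each batch issues its own IN query with at most BatchSize parameters.
        List<int> batch = keys.Skip(i).Take(BatchSize).ToList();
        var matchingAuthors = from author in DB.AuthorTable
                              where batch.Contains(author.AuthorId)
                              select author;

        foreach (AuthorTableEntry author in matchingAuthors)
        {
            // process each author as before
        }
    }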

JP Alioto