views:

75

answers:

4

I have a SQL Server table without any indexes, and I cannot add them. There are millions of records in that table, and I cannot fetch all of them with a single query because of insufficient memory. How can I get all the records in small portions - for example, 100 records per portion?

A: 

One solution is to select the whole table I need to dump into a temporary table with an identity column. It is an acceptable solution, since I have separate application and database servers, and the DB server has enough memory for this. But perhaps there is a more effective solution?
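
The approach described above can be sketched in T-SQL roughly like this (table, column, and variable names are illustrative, not taken from the original question):

    -- Copy the table into a temp table, adding a sequential key
    SELECT IDENTITY(INT, 1, 1) AS RowNum, *
    INTO #Dump
    FROM MyTable;

    -- Then fetch one portion (here 100 rows) at a time
    SELECT * FROM #Dump
    WHERE RowNum BETWEEN @Start AND @Start + 99;

The identity column gives you a dense, ordered key to page on, which the original table lacks without indexes.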

+1  A: 

How are you attempting to read the records? If you use a SqlDataReader, then you should not have any out of memory problems.


const string QUERY = "SELECT * FROM MyTable";
using (var conn = new SqlConnection(CONNECTION_STRING))
{
    using (var cmd = new SqlCommand(QUERY, conn))
    {
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Use properties and methods of the reader to access the current row
            }
        }
    }
}

This passes you one row at a time across the database connection. The connection will do the buffering for you, bringing multiple rows from the database, and passing them to you one at a time.

John Saunders
This does not work for big tables (with some millions of records); an exception saying "the table is too big" is thrown.
Please post the complete exception. Post ex.ToString().
John Saunders
+1  A: 

This is one of the rare cases where a cursor can be used for its actual purpose.

You can create a static cursor (which will create a copy of the table data, but in the server's tempdb rather than on the client side, and sort it) and browse this cursor.
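
A sketch of what that looks like in T-SQL (cursor and table names are illustrative):

    DECLARE big_cursor CURSOR STATIC READ_ONLY FOR
        SELECT * FROM MyTable;

    OPEN big_cursor;  -- materializes the snapshot in tempdb

    FETCH NEXT FROM big_cursor;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- process the current row here
        FETCH NEXT FROM big_cursor;
    END

    CLOSE big_cursor;
    DEALLOCATE big_cursor;

Because the cursor is STATIC, the whole result set is copied into tempdb when the cursor is opened, and fetches read from that stable snapshot rather than from the base table.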

Quassnoi
A: 

One solution is to use a server-side cursor, as Quassnoi already suggested. The drawback is that it would have to be a static cursor, which means a copy of the entire table will be created in tempdb.

Another solution is to process the client side in a stream oriented fashion, row by row (ie. client side cursor):

 using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
 {
     while (rdr.Read())
     {
         // process one row here
     }
 }

This way you can process a big row set without exhausting your client-side memory.
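
SequentialAccess is particularly useful when rows contain large columns, because it lets you stream a column's bytes in chunks instead of buffering the whole value. A sketch under the assumption that column 0 is a large binary column (the column ordinal and buffer size are illustrative):

    using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        var buffer = new byte[8192];
        while (rdr.Read())
        {
            // With SequentialAccess, columns must be read in ordinal order.
            long offset = 0;
            long read;
            // GetBytes returns the number of bytes copied into the buffer;
            // keep reading until the column is exhausted.
            while ((read = rdr.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                // write buffer[0..read) to a file or output stream here
                offset += read;
            }
        }
    }

Note that with SequentialAccess the reader does not buffer the row, so columns (and the bytes within a column) can only be read forward, never revisited.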

Remus Rusanu