I'm using ASP.NET MVC and Azure Table Storage in the local development fabric. My pagination code is very slow when working with a large result set:
var PageSize = 25;

var qResult2 = from c in svc.CreateQuery<SampleEntity>(sampleTableName)
               where c.PartitionKey == "samplestring"
               select c;

TableStorageDataServiceQuery<SampleEntity> tableStorageQuery =
    new TableStorageDataServiceQuery<SampleEntity>
        (qResult2 as DataServiceQuery<SampleEntity>);

// First full retrieval: pull everything, then slice out one page
var result = tableStorageQuery.ExecuteAllWithRetries()
                              .Skip((page - 1) * PageSize)
                              .Take(PageSize);

// Second full retrieval: pull everything again just to count it
var numberOfEntities = tableStorageQuery.ExecuteAllWithRetries().Count();

ViewData["TotalPages"] = (int)Math.Ceiling((double)numberOfEntities / PageSize);
ViewData["CurrentPage"] = page;

return View(result);
The ViewData values are used by the view to render paging links, using code from Steven Sanderson's ASP.NET MVC book. For an Azure table with 1000+ entities, this is very slow. For starters, Count() takes quite a long time to compute the total number of entities. If I'm reading my LINQ book (Pro LINQ by Joseph Rattz) correctly, that's because the query result doesn't implement ICollection<T>, so Count() has to enumerate the entire result set.
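To illustrate the ICollection<T> point, here's a standalone snippet (names like "numbers" and "filtered" are just for the example, not from my code):

using System.Collections.Generic;
using System.Linq;

var numbers = new List<int> { 1, 2, 3 };
int fast = numbers.Count;            // O(1): List<T> implements ICollection<T>

IEnumerable<int> filtered = numbers.Where(n => n > 1);
int slow = filtered.Count();         // O(n): Where() returns a plain iterator,
                                     // so Count() must walk the whole sequence

My understanding is that the Azure case is worse still, because the enumeration happens over the wire rather than in memory.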
Even if I hard-code numberOfEntities to the known total (e.g. 1500), paging is still slow for pages beyond 10, so I'm guessing .Skip and/or .Take are slow. I also call ExecuteAllWithRetries() twice, which can't be helping if Azure really is queried both times (see the sketch below).
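One obvious cleanup would be to materialize the result set once and reuse it, something like this (my own sketch; allEntities is just a name I made up):

// Single round trip: materialize everything once
var allEntities = tableStorageQuery.ExecuteAllWithRetries().ToList();

var result = allEntities
    .Skip((page - 1) * PageSize)
    .Take(PageSize);

var numberOfEntities = allEntities.Count;  // O(1): Count property on List<T>

But that still pulls all 1000+ entities from table storage on every request just to display 25 of them, so I assume it isn't the right long-term strategy.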
What strategy should I follow for paging through large datasets with ASP.NET MVC and Azure?
EDIT: I don't need to know the exact total number of pages.
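To illustrate what would satisfy me: even a Next/Previous-only scheme is fine, along these lines (a rough sketch against my current query; "HasNextPage" is a name I invented):

// Fetch one extra entity to detect whether a next page exists,
// instead of counting the entire table
var pagePlusOne = tableStorageQuery.ExecuteAllWithRetries()
    .Skip((page - 1) * PageSize)
    .Take(PageSize + 1)
    .ToList();

ViewData["HasNextPage"] = pagePlusOne.Count > PageSize;
ViewData["CurrentPage"] = page;

return View(pagePlusOne.Take(PageSize));

Of course this still retrieves everything server-side, so it doesn't fix the underlying slowness; it just shows that an exact page total isn't a requirement.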