I am looking at implementing a paging strategy for a domain model whose instances number in the hundreds of thousands. I am most interested in how performance-conscious websites achieve this.
A:
The answer would have to be server-side pagination if you're looking for a performant, scalable way of handling hundreds of thousands of items.
Think about how a Google search works; it's exactly the same sort of problem.
Did you have a more specific question?
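One minimal sketch of server-side paging in T-SQL is keyset ("seek") paging, where the client passes back the last key it saw and the server returns only the next page; the table and column names below are hypothetical, and TOP (@variable) plus inline DECLARE initialization assume SQL Server 2008 or later:
declare @PageSize   int = 50,
        @LastSeenID int = 0;   -- 0 on the first request
-- Seek past the previous page instead of scanning earlier rows.
select top (@PageSize)
       ProductID, Name, Price
from dbo.Products              -- hypothetical table
where ProductID > @LastSeenID
order by ProductID;
The client then sends the last ProductID of this page as @LastSeenID for the next request, so later pages cost no more than the first.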
Matt Ball
2010-05-05 20:39:09
+1
A:
Here is what I use on a SQL Server 2008 table that has 2+ billion rows of data (I changed the table and column names).
It takes between 6 and 10 milliseconds to return a page of 50 rows; 5,000 rows per page takes about 60 milliseconds.
-- Number every matching row, then return just the requested page,
-- plus the total row count (MaxRow) for the pager on the client.
;with cte as
(
    select ROW_NUMBER() over (order by Column1, Column2) as RowNumber,
           <<Other columns here>>
    from Table1 p
    join Table2 i
        on  i.ID  = p.ID
        and i.ID2 = p.ID2
    join dbo.Table3 c
        on  c.ID2 = i.ID2   -- the join predicate should reference c
    where Column2 between @StartDate and @EndDate
      and p.ID = @ID
)
select *,
       (select MAX(RowNumber) from cte) as MaxRow
from cte
where RowNumber between @StartRow and (@StartRow + @RowsPerPage) - 1
order by Col3
I use page-level compression on the DB, and narrow tables and indexes as well.
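A minimal sketch of turning on page compression, assuming hypothetical object names and a SQL Server 2008 edition that supports it:
-- Rebuild the table and one of its indexes with page-level compression
-- (hypothetical object names).
alter table dbo.Table1
    rebuild partition = all
    with (data_compression = page);

alter index IX_Table1_Column2
    on dbo.Table1
    rebuild with (data_compression = page);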
That is on the database side; on the website we use an ExtJS grid, and it just makes Ajax calls to the service, which calls the DB.
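A minimal sketch of wrapping a ROW_NUMBER() paging query in a stored procedure that such a service could call; the procedure, table, and column names are hypothetical:
create procedure dbo.GetProductPage
    @StartRow    int,
    @RowsPerPage int
as
begin
    set nocount on;

    -- Same ROW_NUMBER() pattern as the query above, over a hypothetical table.
    with cte as
    (
        select ROW_NUMBER() over (order by ProductID) as RowNumber,
               ProductID, Name, Price
        from dbo.Products
    )
    select *,
           (select MAX(RowNumber) from cte) as MaxRow   -- total rows for the pager
    from cte
    where RowNumber between @StartRow and (@StartRow + @RowsPerPage) - 1
    order by RowNumber;
end
The service passes @StartRow and @RowsPerPage from the grid's paging parameters and returns the rows plus MaxRow so the client can render the page count.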
SQLMenace
2010-05-05 20:42:38