Ok, here is the situation (using PHP/MySQL): you are getting results from a large MySQL table. Let's say your query returns 10,000 matching results and you have a paging script that shows 20 results per page. Your queries might look like this.

So the page 1 query:

SELECT `column`
FROM table_name 
WHERE userId=1 
AND somethingelse='something else'
LIMIT 0,20

And the page 2 query:

SELECT `column`
FROM table_name 
WHERE userId=1 
AND somethingelse='something else'
LIMIT 20,40

Now you are grabbing 20 results at a time, but there are a total of 10,000 rows that match your search criteria.

How can you return only 3,000 of the 10,000 results and still do your paging of 20 per page with a LIMIT of 20 in your query?

I thought this was impossible, but MySpace does it on their browse page somehow. I know they aren't using PHP/MySQL, but how can it be achieved?

UPDATE

I see some people have replied with a couple of methods, but it seems none of these would actually improve performance by limiting the number to 3,000?

A: 

You could achieve this with a subquery...

SELECT name FROM (
   SELECT name FROM tblname LIMIT 0, 3000
) `Results` LIMIT 20, 20

Or with a temporary table, whereby you select all 3,000 rows into a temp table and then page by the temporary row id, which will be sequential.
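
A minimal sketch of that temporary-table idea, assuming an existing mysqli connection in $db and the placeholder table/column names used above. A temporary table only lives for the current connection, so this suits cases where both steps run in the same request.

<?php
// Copy at most 3,000 matching rows into a temp table with a sequential id.
$db->query("CREATE TEMPORARY TABLE tmp_results (
    rowId INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name  VARCHAR(255) NOT NULL
)");
$db->query("INSERT INTO tmp_results (name)
            SELECT name FROM tblname
            WHERE userId = 1 AND somethingelse = 'something else'
            LIMIT 3000");

// Page by the sequential rowId (page and page size are illustrative).
$page    = 2;                                 // requested page, 1-based
$perPage = 20;
$start   = ($page - 1) * $perPage + 1;        // rowId starts at 1
$end     = $start + $perPage - 1;

$result = $db->query("SELECT name FROM tmp_results
                      WHERE rowId BETWEEN $start AND $end");
$rows = [];
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}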

Sohnee
+1  A: 

Firstly, the LIMIT parameters are offset and number of records, so the second parameter should always be 20 - you don't need to increment this.

Surely if you know the upper limit of rows you want to retrieve, you can just put this into the logic that runs the query, i.e. check that offset + limit <= 3000.
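
A rough sketch of that check, with illustrative variable names (a hypothetical $page holds the requested 1-based page number):

<?php
$perPage = 20;
$maxRows = 3000;
$offset  = ($page - 1) * $perPage;

if ($offset + $perPage > $maxRows) {
    // Past the 3,000-row cap: don't run the query at all.
    exit('No more results.');
}

$sql = "SELECT `column` FROM table_name
        WHERE userId = 1 AND somethingelse = 'something else'
        LIMIT $offset, $perPage";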

Tom Haigh
A: 

You can compute the LIMIT values from the page number (e.g. LIMIT 20*(p-1), 20 for page p) in your PHP code, and cap the page number at 150.
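
A possible sketch of that clamp, with illustrative names; it assumes the page number arrives in $_GET['page'] (150 pages x 20 rows = 3,000 rows at most):

<?php
$perPage = 20;
$maxPage = 150;

$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
$page = max(1, min($page, $maxPage));   // keep the page in 1..150

$offset = ($page - 1) * $perPage;
$sql = "SELECT `column` FROM table_name
        WHERE userId = 1 AND somethingelse = 'something else'
        LIMIT $offset, $perPage";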

David Rabinowitz
+1  A: 

As Sohnee said, or (depending on your requirements) you can get all 3,000 records with SQL and then use array_slice in PHP to get chunks of the array.
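
A small sketch of that variant, assuming a mysqli connection in $db and the placeholder column names from the question:

<?php
$perPage = 20;
$page    = 3;   // 1-based page number, illustrative

// Fetch the capped result set once (at most 3,000 rows in memory).
$result = $db->query("SELECT `column` FROM table_name
                      WHERE userId = 1 AND somethingelse = 'something else'
                      LIMIT 3000");
$allRows = [];
while ($row = $result->fetch_assoc()) {
    $allRows[] = $row;
}

// Cut out the 20 rows belonging to the requested page.
$pageRows = array_slice($allRows, ($page - 1) * $perPage, $perPage);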

Residuum
+4  A: 

Program your PHP so that when it finds itself ready to issue a query that ends with LIMIT 3000, 20 or higher, it just stops and doesn't issue the query.

Or am I missing something?

Update:

MySQL treats the LIMIT clause nicely.

Unless you have SQL_CALC_FOUND_ROWS in your query, MySQL just stops scanning, sorting, etc. as soon as it finds enough records to satisfy your query.

When you have something like this:

SELECT  `column`
FROM    table_name 
WHERE   userId=1 
        AND somethingelse='something else'
LIMIT 0, 20

MySQL will fetch the first 20 records that satisfy the criteria and stop.

It doesn't matter how many records match the criteria, 50 or 1,000,000: the performance will be the same.

If you add an ORDER BY to your query and don't have an index, then MySQL will of course need to browse all the records to find the first 20.

However, even in this case it will not sort all 10,000: it keeps a "running window" of the top 20 records and sorts only within this window, whenever it finds a record with a value large (or small) enough to get into the window.

This is much faster than sorting the whole myriad.

MySQL, however, is not good at pipelining recordsets. This means that this query:

SELECT  `column`
FROM    (
        SELECT  `column`
        FROM    table_name 
        WHERE   userId=1 
                AND somethingelse='something else'
        LIMIT 3000
        ) q
LIMIT 0, 20

is worse performance-wise than the first one.

MySQL will fetch 3,000 records, cache them in a temporary table (or in memory) and apply the outer LIMIT only after that.

Quassnoi
A: 

Or you could get the 3,000 records and then, using jQuery tabs, split the records into 20 per page.
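
A sketch of the server side of that idea: PHP splits the rows (already fetched into a hypothetical $allRows array, as in the earlier examples) into 20-row panes whose markup a jQuery tabs widget, e.g. jQuery UI tabs, can then page through. The ids and the column key are illustrative.

<?php
// Break the (at most 3,000) rows into panes of 20.
$panes = array_chunk($allRows, 20);

echo '<div id="results"><ul>';
foreach ($panes as $i => $pane) {
    printf('<li><a href="#page-%d">Page %d</a></li>', $i + 1, $i + 1);
}
echo '</ul>';

foreach ($panes as $i => $pane) {
    printf('<div id="page-%d"><ul>', $i + 1);
    foreach ($pane as $row) {
        printf('<li>%s</li>', htmlspecialchars($row['column']));
    }
    echo '</ul></div>';
}
echo '</div>';
// Then something like $('#results').tabs(); in JavaScript handles the paging.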