I need some help optimizing the following query:

select * from transaction where id < 7500001 order by id desc limit 16

When I run EXPLAIN on this query, the type is "range" and rows is "7500000". According to some online references, this means the query scanned 7,500,000 rows to get the data.

Is there any way I can optimize it so it scans fewer rows to get the data? Also, id is the primary key column.

+1  A: 

According to some online references, this means the query scanned 7,500,000 rows to get the data

Not exactly. It is the approximate number of rows that could potentially be scanned (in many cases the optimizer cannot determine the exact number). But since you specified LIMIT, only the first 16 rows are actually read when the query executes.

PS: I assume the key shown in EXPLAIN is id?
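
If you want to verify how much work is actually done, a quick sanity check (a minimal sketch using the standard Handler_* session counters) is to reset the counters, run the query, and see how many rows the storage engine actually fetched:

-- reset the session status counters
flush status;

select * from transaction where id < 7500001 order by id desc limit 16;

-- Handler_read_* shows how many rows the engine actually read;
-- with the LIMIT in place this should be in the tens, not millions
show session status like 'Handler_read%';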

zerkms
The key used in EXPLAIN should be `PRIMARY`
newtover
Yep, and it actually changes nothing: as long as a key that contains `id` as its leftmost part is used, this query will be fast.
zerkms
A: 

I ran EXPLAIN with your query on an 8-million-row table:

id  select_type  table        type   possible_keys  key      key_len  ref   rows     Extra
1   SIMPLE       transaction  range  PRIMARY        PRIMARY  8        NULL  4079100  Using where

The actual execution was fast: execution time was 00:00:00:044.
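
If you want to reproduce that timing yourself, a rough sketch (assuming the session profiler is available; it is deprecated in newer MySQL versions) is:

set profiling = 1;

select * from transaction where id < 7500001 order by id desc limit 16;

-- lists the recent statements in this session with their total duration
show profiles;

-- per-stage breakdown for the first profiled statement
show profile for query 1;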

ceteras
It's not about time, but about the number of rows it read to get the data. That's the only concern.
vamsivanka
@vamsivanka: Have you read my answer?
zerkms