Hi,

I have a table of 50k keywords and I am providing an auto-complete feature for these keywords based on a count mechanism, but retrieving the keywords still takes time.

How would the database have to be partitioned for fast retrieval?

Any help is appreciated.

+3  A: 

A table with 50k rows is very small. There should be no need for (and no benefit from) partitioning it.

You need to look at the query execution plan and your algorithm in general. Maybe you just need an index. Or an in-memory cache.
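
For example, a minimal sketch in Oracle SQL (the table name keywords and the columns keyword and cnt are assumptions, not taken from the question): a plain index on the keyword column supports a prefix search, and ordering by the count column returns the most popular matches first.

create index keywords_kw_idx on keywords (keyword);

-- top 10 most frequent keywords starting with the typed prefix;
-- the LIKE 'pre%' predicate can use an index range scan
select keyword
  from (select keyword
          from keywords
         where keyword like 'pre%'
         order by cnt desc)
 where rownum <= 10;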

Thilo
+1  A: 

Some thoughts:

  • 50k keywords is not that big a table; partitions won't help, but a smart index might.
  • You might fare best by loading a suitable data structure into memory first (see the sketch after this list).
  • If the data stays in the DB, your auto-complete will likely be slow and unresponsive, as every keypress results in a round trip to the DB.
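
A hedged sketch of the prefetch idea (table and column names are assumptions): the application runs this query once at startup, keeps the result in a sorted in-memory structure, and answers prefix lookups locally instead of hitting the DB on every keypress.

-- load all keywords once, sorted so the client can binary-search prefixes
select keyword, cnt
  from keywords
 order by keyword;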
lexu
As a next step I want to grow the keywords from 50k to 500k; what partitioning would then give a fast response? Thanks, Murali
murali
@murali: In that case you should come up with an algorithm that queries the DB only after 1 (or 2 or more) characters have been entered, thus reducing the amount of data to retrieve. But 500k rows is still not a big table: assuming an average of 20 bytes in memory per entry, you would still 'only' need ca. 10 MB of RAM if you prefetch the entire table.
lexu
A: 

Perhaps the table statistics are stale, so the optimizer may choose a wrong plan.

Try this from a user with the DBA role:

-- refresh the optimizer statistics for the table
exec dbms_stats.gather_table_stats(ownname => 'YOUR_OWNER', tabname => 'YOUR_TABLE');
-- discard cached execution plans so queries are re-parsed
alter system flush shared_pool;

Then test the time of getting the keywords again.

P.S. The statistics should be gathered regularly.
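
One way to automate this, as a sketch (the job name and the schedule are placeholders, not from the answer), is an Oracle scheduler job that regathers the statistics every night:

begin
  dbms_scheduler.create_job(
    job_name        => 'GATHER_KEYWORD_STATS',   -- placeholder name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin dbms_stats.gather_table_stats(ownname => ''YOUR_OWNER'', tabname => ''YOUR_TABLE''); end;',
    repeat_interval => 'FREQ=DAILY;BYHOUR=3',    -- run daily at 03:00
    enabled         => TRUE);
end;
/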

dba.in.ua