I have a table containing the cities of the world; it holds more than 70,000 cities.

I also have an auto-suggest input on my home page, which is used intensively, and it issues a SQL query (a LIKE search) for every keystroke after the second letter.

I am afraid of the heavy load this creates, so I am looking for any solution or technique that can help in such a situation.

A: 

If you have an index on the city name, it should be handled efficiently by the database. (This statement is wrong; see the comments below.)

To lower the demands on your server resources, you can offer autocompletion only after n or more characters. Also allow for a timeout, i.e. don't issue a request while the user is still typing. Once the user has stopped typing for a while, you can request the autocompletion.
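
A minimal sketch of that debounce idea in TypeScript; the element id, the /suggest endpoint, and the 300 ms delay are assumptions for illustration, not part of the question:

let debounceTimer: number | undefined;
const input = document.querySelector('#city') as HTMLInputElement; // assumed element id

input.addEventListener('input', () => {
  const term = input.value.trim();
  if (term.length < 2) return; // only suggest after the 2nd letter
  window.clearTimeout(debounceTimer);
  debounceTimer = window.setTimeout(() => {
    // this runs only once the user has paused for 300 ms
    fetch('/suggest?q=' + encodeURIComponent(term)) // hypothetical endpoint
      .then(res => res.json())
      .then(cities => {
        // render the suggestions here
      });
  }, 300);
});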

Exception e
I already trigger it after the 2nd letter and can't go above that number. Also, the timeout idea adds extra latency to the script, which is unacceptable. Thanks, man.
assaqqaf
No, an index will only improve performance for exact match and range queries, NOT substring matches (e.g. with LIKE '%ka%')
MarkR
A: 

You should cache as much data as you can on the web server. Data that does not change often, such as lists of countries and cities, is a good candidate for this. Realistically, how often do you add a country? Even if the list does change, a simple refresh of the cache will handle it.
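
For instance, a minimal in-process cache in TypeScript (Node); loadCitiesFromDb is a hypothetical data-access call standing in for whatever query you use today:

declare function loadCitiesFromDb(): Promise<string[]>; // hypothetical DB call

let cityCache: string[] | null = null;

// Load the rarely-changing list once, then serve it from memory.
async function getCities(): Promise<string[]> {
  if (cityCache === null) {
    cityCache = await loadCitiesFromDb();
  }
  return cityCache;
}

// When the list does change, a simple refresh re-fills the cache.
function refreshCityCache(): void {
  cityCache = null;
}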

You should make sure that your queries are tuned properly to make the best use of indexes and join techniques.

You may have load on your database from other queries as well, so you may want to look into general MySQL performance-tuning techniques.

Raj More
+2  A: 

Cache the table, preferably in memory. 70,000 cities is not that much data. If each city takes up 50 bytes, that's only 70,000 * 50 / 1024² ≈ 3.3 MB. And after all, a list of cities doesn't change that fast.

If you are using AJAX calls exclusively, you could cache the data for every combination of the first two letters in JSON. Assuming a Latin-like alphabet, that would be around 680 combinations. Save each of those to a text file in JSON format, and have jQuery access the text files directly.
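
A sketch of that pre-generation step in TypeScript on Node; the cities array, the output directory, and a plain a-z alphabet are all assumptions for illustration:

import { mkdirSync, writeFileSync } from 'fs';

declare const cities: string[]; // the full city list, loaded elsewhere

mkdirSync('suggest', { recursive: true });

const letters = 'abcdefghijklmnopqrstuvwxyz';
for (const a of letters) {
  for (const b of letters) {
    const prefix = a + b;
    const matches = cities.filter(c => c.toLowerCase().startsWith(prefix));
    // one static file per two-letter prefix, e.g. suggest/ka.json
    writeFileSync(`suggest/${prefix}.json`, JSON.stringify(matches));
  }
}

jQuery can then load the matching file directly, e.g. $.getJSON('suggest/ka.json', render), so no database is touched per keystroke.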

John
Thanks, I like the idea of using JSON. Is it more efficient than other formats?
assaqqaf
I was assuming you are already using JSON-format messages, since that is the most common format used for AJAX data transfer. The JSON format is lean, can be parsed very fast in Javascript and has good support in jQuery. However any other format can be cached on the server in the same way. So if you prefer another format, go for it.
John
+2  A: 

Create an index on the city names to begin with. This speeds up prefix queries that look like:

CREATE INDEX idx_cities_name ON cities (name); -- illustrative index name
SELECT name FROM cities WHERE name LIKE 'ka%'

Also try making your autocomplete form a little 'lazy': the more letters a user enters, the fewer records your database has to deal with.

Salman A
The problem is that I use a query like '%ka%'.
assaqqaf
A: 

Just get your table to fit in memory, which should be trivial for 70k rows.

Then you can do a scan very easily. Maybe don't even use an SQL database for this (since the data doesn't change very often); just dump the cities into a text file and scan that. That would definitely be better if you have many web servers but only one database server, as each web server could keep its own copy of the file.
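
A sketch of that scan in TypeScript on Node, assuming one city name per line in a hypothetical cities.txt file; note this handles the substring ('%ka%') case, unlike an index:

import { readFileSync } from 'fs';

// Each web server loads its own copy of the file once, at startup.
const cities: string[] = readFileSync('cities.txt', 'utf8')
  .split('\n')
  .filter(line => line.length > 0);

// A linear substring scan over 70k short strings is cheap per request.
function suggest(term: string): string[] {
  const needle = term.toLowerCase();
  return cities.filter(c => c.toLowerCase().includes(needle));
}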

How many queries per second are you seeing at peak? I can't imagine there being that many people typing city names in, even if it is a very busy site.

Also, you could cache the individual responses (e.g. in memcached) if you get a good hit rate (e.g. because people tend to type the same things in).
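
As an in-process stand-in for the memcached idea (a real deployment would add expiry and share the cache across servers), reusing the suggest scan from the sketch above:

declare function suggest(term: string): string[]; // the file scan from above

const responseCache = new Map<string, string[]>();

function cachedSuggest(term: string): string[] {
  const key = term.toLowerCase();
  let result = responseCache.get(key);
  if (result === undefined) {
    result = suggest(key); // compute once per distinct term
    responseCache.set(key, result);
  }
  return result;
}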


Actually, you could also precalculate the responses for all one- to three-letter combinations; that's only about 26 * 26 * 26 (≈17k) entries. Since a four-or-more-letter input must logically match a subset of one of those, you could then scan just the appropriate one of the 17k precalculated lists.
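
A sketch of that precalculation, again in TypeScript and reusing the substring scan from the earlier sketch; all names here are illustrative:

declare function suggest(term: string): string[]; // substring scan over all cities

const letters = 'abcdefghijklmnopqrstuvwxyz';
const precomputed = new Map<string, string[]>();

// Precompute every 1-3 letter key: 26 + 26^2 + 26^3 entries.
for (const a of letters) {
  precomputed.set(a, suggest(a));
  for (const b of letters) {
    precomputed.set(a + b, suggest(a + b));
    for (const c of letters) {
      precomputed.set(a + b + c, suggest(a + b + c));
    }
  }
}

// Any city containing the full term also contains its first 3 letters,
// so longer inputs only need to scan one precomputed list.
function fastSuggest(term: string): string[] {
  const needle = term.toLowerCase();
  const candidates = precomputed.get(needle.slice(0, 3)) ?? [];
  return candidates.filter(c => c.toLowerCase().includes(needle));
}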

MarkR