views: 28
answers: 1
Hi,

I'm using Zend_Search_Lucene for full-text search of records in several different tables in my application. I have just implemented this functionality, and currently the index is built upon first use of the search functionality after application deployment. This is obviously not what I would like in production.

I'm looking for an easy way to perform the indexing as part of the deployment logic of the application, and then perform an incremental index of added/updated records once every half an hour or so.

What should I be looking at to implement this?

Thanks for any and all help that you may be able to provide.

+1  A: 

Hi. There's no need to index as part of the deployment. Here is a solution I would suggest.

Create a base directory in your application, say searchdata. Inside it, create separate paths for the different types of data (for cities, say, a folder named city_index). Specify these paths in your application's config file. Write PHP scripts that read the table data and build the indexes; these scripts use the config paths to write the index files. Then write a master script that runs all of these build scripts (a sketch of one is shown below).
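A minimal sketch of one such build script. The index path, database credentials, and the cities table with id/name/description columns are assumptions for illustration; in practice you would read the path from your config file.

<pre><code>&lt;?php
// build_city_index.php -- rebuild the city index from scratch.
require_once 'Zend/Search/Lucene.php';

// Normally read from the application config; hard-coded here for the sketch.
$indexPath = '/path/to/searchdata/city_index';

// Zend_Search_Lucene::create() wipes any existing index at that path.
$index = Zend_Search_Lucene::create($indexPath);

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query('SELECT id, name, description FROM cities') as $row) {
    $doc = new Zend_Search_Lucene_Document();
    // Store the primary key without indexing it, so hits can be mapped back to rows.
    $doc->addField(Zend_Search_Lucene_Field::UnIndexed('id', $row['id']));
    $doc->addField(Zend_Search_Lucene_Field::Text('name', $row['name']));
    $doc->addField(Zend_Search_Lucene_Field::UnStored('description', $row['description']));
    $index->addDocument($doc);
}

$index->commit();
echo "Indexed " . $index->count() . " documents\n";
</code></pre>

The master script can simply include or shell out to one of these per table/index directory.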

Run these scripts once on your server. From then on, searches run against the prebuilt indexes (open them via the config paths), so there is no indexing on first search.
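For completeness, a search against the prebuilt index might look like the sketch below, assuming the same index path and fields as the build script above.

<pre><code>&lt;?php
require_once 'Zend/Search/Lucene.php';

// Open the existing index read/write; no rebuilding happens here.
$index = Zend_Search_Lucene::open('/path/to/searchdata/city_index');

// Standard Lucene query syntax.
$hits = $index->find('name:paris');
foreach ($hits as $hit) {
    // 'id' and 'name' are the stored fields added at build time.
    echo $hit->id . ' - ' . $hit->name . ' (score ' . $hit->score . ")\n";
}
</code></pre>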

When a new city is added, update the index in the same place where you insert the row (the model function), alongside the database write, as in the sketch below. Also optimize the index, which decreases search time. This is not a frequent operation, so updating on the fly is not a problem.
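A sketch of that on-the-fly update, assuming the same index path and fields as above and a hypothetical helper called from your model right after the insert:

<pre><code>&lt;?php
require_once 'Zend/Search/Lucene.php';

// Hypothetical helper: called from the City model after the DB insert succeeds.
function addCityToIndex(array $city)
{
    $index = Zend_Search_Lucene::open('/path/to/searchdata/city_index');

    $doc = new Zend_Search_Lucene_Document();
    $doc->addField(Zend_Search_Lucene_Field::UnIndexed('id', $city['id']));
    $doc->addField(Zend_Search_Lucene_Field::Text('name', $city['name']));
    $doc->addField(Zend_Search_Lucene_Field::UnStored('description', $city['description']));

    $index->addDocument($doc);
    $index->commit();
    // Merges index segments into one; fine for rarely-added data like cities,
    // but you may want to skip or batch this for high-frequency record types.
    $index->optimize();
}
</code></pre>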

siva kiran
I like the idea of adding to the index upon each record write. The only problem is that I have some record types (such as comments, for example) that may be added much more frequently than others, such as users. Adding to the index and optimizing both hit the filesystem, so it's pretty slow. Would you still recommend this approach?
ubermensch