I need to provide some kind of global search over most of the data that my application has. The data is distributed across different MySQL tables, like users, comments, etc.

I do want to handle this in the application, not with something like Google Custom Search.

My idea is to create a table with columns like source id and data, with a fulltext index on data, and then somehow collect all the data into that table.
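
In Rails terms, I'm picturing something like the migration below (a rough sketch; the table and column names are just illustrative, and the FULLTEXT index requires a MyISAM table in MySQL):

    class CreateSearchEntries < ActiveRecord::Migration
      def self.up
        # FULLTEXT indexes only work on MyISAM tables in MySQL 5.0
        create_table :search_entries, :options => 'ENGINE=MyISAM' do |t|
          t.string  :source_type  # e.g. "User", "Comment"
          t.integer :source_id    # id of the row in the source table
          t.text    :data         # concatenated searchable text
        end
        execute 'ALTER TABLE search_entries ADD FULLTEXT INDEX ft_data (data)'
      end

      def self.down
        drop_table :search_entries
      end
    end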

Is there any better way to implement this? Preferably with a gem or a plugin?

+3  A: 

I'd recommend investigating the Thinking Sphinx gem, which is a Ruby interface to the Sphinx full-text search engine.
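
If I remember the 1.x API correctly, indexing a model is roughly this (a sketch, not a definitive recipe; the column names are illustrative):

    class User < ActiveRecord::Base
      # Declare which columns Sphinx should index
      define_index do
        indexes login
        indexes email
      end
    end

    # After building the index with `rake thinking_sphinx:index`:
    User.search 'some query'            # per-model search
    ThinkingSphinx.search 'some query'  # across all indexed models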

Greg Campbell
A: 

When faced with a problem like this, I leveraged MySQL's built-in full-text search capabilities. What you can do is jam all your "searchable" content into a special-purpose MyISAM table that is compiled from the others.

Since maintaining dependent search-specific records is a pain, especially with Rails, it was easier to add triggers to handle the back-end replication than to do it with after_save or after_destroy callbacks.

This isn't a gem or a plugin, and introducing triggers completely confuses the Rails schema dumper, forcing a switch from the .rb schema format to .sql, but it works surprisingly well. Mostly it just "works", so you don't have to worry too much about it, although substantially altering your schema may require rebuilding your triggers from time to time.

The trigger concept is pretty straightforward, not unlike ActiveRecord callbacks: http://dev.mysql.com/doc/refman/5.0/en/create-trigger.html
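
As a sketch only, assuming a search_entries table like the one in the question, the INSERT side might look like this (you'd want matching UPDATE and DELETE triggers as well):

    class AddCommentSearchTriggers < ActiveRecord::Migration
      def self.up
        # One statement per execute call; the MySQL driver won't run
        # several statements at once.
        execute <<-SQL
          CREATE TRIGGER comments_search_insert AFTER INSERT ON comments
          FOR EACH ROW
            INSERT INTO search_entries (source_type, source_id, data)
            VALUES ('Comment', NEW.id, NEW.body)
        SQL
      end

      def self.down
        execute 'DROP TRIGGER comments_search_insert'
      end
    end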

This is an improvement on the technique employed by the MediaWiki software to provide a full-text searchable index of wiki content that is usually stored in InnoDB.

tadman
+2  A: 

I've used acts_as_solr (http://acts-as-solr.rubyforge.org/) to add text search across various models in Rails projects before. You simply tag the models and properties you want indexed and the plugin handles the rest.
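
From memory, the declaration is something like this (a sketch; the field names are illustrative):

    class Comment < ActiveRecord::Base
      # Tell acts_as_solr which fields to send to the Solr index
      acts_as_solr :fields => [:title, :body]
    end

    # find_by_solr returns a results object; #docs holds the records
    results = Comment.find_by_solr('full text query')
    results.docs.each { |comment| puts comment.title }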

Nate