I have a structure of about 10 tables. This structure works great for data entry. However, I also need to perform complex and fast searches on that dataset. There are three approaches to this I can think of:
Join all those tables in the select. This is pretty slow and likely not a good approach. If it's relevant, the database is Informix. I've looked into creating views, hoping they'd be better optimized, but preliminary testing shows that selects on a view are even slower than the explicit joins; maybe Informix has a way to pre-join the tables and create indexes on the result, but from what I've seen that's not likely, unless I'm missing some option. Both the joins and the view are slower than approach #2:
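For context, this is the kind of thing I mean (table and column names here are made up for illustration); a view is just a stored query, so selecting from it still executes the underlying joins:

```sql
-- Hypothetical schema for illustration only.
CREATE VIEW search_view AS
SELECT o.order_id, o.order_date, c.customer_name, i.item_desc
  FROM orders o
  JOIN customers c ON c.customer_id = o.customer_id
  JOIN order_items i ON i.order_id = o.order_id;

-- This runs essentially the same join plan as writing the joins
-- out by hand, which matches the timing results I'm seeing.
SELECT * FROM search_view WHERE customer_name = 'Smith';
```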
Single synthetic table which is updated periodically. This seems like the right approach, especially since the searches don't need to run on real-time data - actually, I can probably get away with updating the synthetic table daily. The data size would be about 500k-1000k rows.
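Roughly what I have in mind (again with hypothetical names) is a flat table rebuilt by a nightly job, with indexes on the search columns created once up front:

```sql
-- Hypothetical denormalized table; names are illustrative.
CREATE TABLE search_flat (
    order_id      INTEGER,
    order_date    DATE,
    customer_name VARCHAR(100),
    item_desc     VARCHAR(255)
);

-- Created once, not on every refresh; index whatever the
-- searches actually filter and sort on.
CREATE INDEX ix_flat_name ON search_flat (customer_name);

-- Nightly refresh: wipe and repopulate from the source tables.
DELETE FROM search_flat;
INSERT INTO search_flat
SELECT o.order_id, o.order_date, c.customer_name, i.item_desc
  FROM orders o
  JOIN customers c ON c.customer_id = o.customer_id
  JOIN order_items i ON i.order_id = o.order_id;
```

At 500k-1000k rows, a daily rebuild like this should be cheap enough, and all the join cost is paid once per day instead of once per search.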
Memcached and similar in-memory solutions. At the moment there is no such infrastructure in place, and this probably doesn't warrant implementing it; however, it is something I'll look at once the synthetic table becomes too slow. Also, there are lots of search parameters and even the first query has to be fast, so this approach would have to eagerly cache all the data. Of course, I'll probably cache whatever I can even with approaches 1 and 2.
I'd like your thoughts on this. Is there a magic bullet I'm missing? What have you used in similar situations?