
When you have a slow database app, the first suggestions people make are to:

  1. Track the slow queries
  2. Add appropriate indexes
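For step 1, MySQL (which Drupal normally runs on) can log the slow queries for you. A minimal sketch, assuming MySQL 5.1+ and a user with the SUPER privilege (older versions need the equivalent `log-slow-queries` settings in my.cnf instead):

```sql
-- Enable the slow query log at runtime (MySQL 5.1+).
SET GLOBAL slow_query_log = 'ON';

-- Log any query that takes longer than 1 second.
SET GLOBAL long_query_time = 1;

-- Also log queries that do full table scans because no index was used.
SET GLOBAL log_queries_not_using_indexes = 'ON';
```

The log then gives you concrete candidates for step 2, rather than guessing at which tables might need indexes.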

If you are building your own application this is very logical, but when you use a CMS like Drupal, which many people have developed and tuned, is this approach still valid?

I mean, aren't Drupal's tables already fine-tuned for performance? Even if I track down the slow queries, what could I do about them? Rewrite Drupal core?!?

A: 

Since you will not typically be able to change the application itself, you move on to the next tier of optimization.

This typically includes an analysis of the server hardware, the network, and the client machines.

It may be useful to let the Drupal developers know you are having poor performance; they may suggest a modification to some setting somewhere, and at the very least they might elevate your performance issue into something that gets addressed in a future release.

Randy
A: 

Typically you will use quite a lot of contributed modules with Drupal, and these can vary significantly in quality. Most likely there is some optimization potential there. The interplay of different modules can in some cases result in extremely slow queries, so checking the queries (preferably with the Devel module) is a good first step.
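Once Devel's query log has pointed you at a slow query, MySQL's EXPLAIN tells you whether it is using an index. A sketch, where the query itself is just a placeholder (substitute the real one Devel reported):

```sql
-- Ask MySQL how it plans to execute the slow query.
EXPLAIN SELECT nid, title
FROM node
WHERE type = 'story' AND status = 1
ORDER BY created DESC
LIMIT 10;

-- In the output, type = ALL together with a large "rows" estimate
-- means a full table scan, i.e. a likely missing index.
```

This does not require changing any module code; it only tells you where an extra index might pay off.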

Drupal core should be well optimized, but Drupal is extremely flexible, and I could imagine that if you use it in very unusual ways, the assumptions of the Drupal core developers no longer hold.

Fabian
A: 

Heavy use of CCK and Views will almost certainly require you to tune MySQL indexes. But there are a lot of other areas in the whole food chain (web server, database, PHP, HTML, CSS, etc.) that need to be evaluated. I have found the fairly new dbtuner module invaluable in my recent optimization efforts. I also recently found this nice article about optimizing Drupal.
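As an illustration of the kind of index tuning CCK and Views tend to need: suppose a View filters on a CCK text field. CCK stores field data in `content_*` tables, but the exact table and column names below are hypothetical; they depend entirely on your own field definitions.

```sql
-- Hypothetical: a View filters nodes by a CCK field "field_city",
-- which CCK stores in the table content_field_city.
CREATE INDEX idx_field_city_value
ON content_field_city (field_city_value);

-- Re-run EXPLAIN on the slow query afterwards to confirm that
-- MySQL actually picks up the new index.
```

Adding such an index does not touch Drupal core or any module code, which answers the original worry about having to rewrite core.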

Tom