views:

216

answers:

7

Hi,

Recently I've been working on quite a big project with PHP + MySQL, and now I'm concerned about my MySQL performance. What should I do to make MySQL run as well as possible? Tell me everything you know; I'd be very grateful.

Second question: I run one MySQL query per page load to fetch the page's information. It's quite a big query, because it joins a few tables. Should I be doing something else?

Thank you.

+3  A: 

Learn to use the explain tool.
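
For instance, prefixing a query with EXPLAIN shows how MySQL plans to execute it. A minimal sketch (the table and column names here are made up for illustration):

```sql
-- EXPLAIN reports, per table, the join type, the indexes MySQL
-- considered and chose, and an estimated row count.
EXPLAIN
SELECT u.name, o.total
FROM users u
JOIN orders o ON o.user_id = u.id
WHERE u.created_at > '2010-01-01';
```

In the output, watch for "type: ALL" (a full table scan) and large values in the "rows" column; those are usually the queries that need an index.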

David Oneill
... and all about Database normalization
Gordon
Normalization is not necessarily the solution to all problems ;-) ;; neither is de-normalization, actually ^^ (If there were a magic solution that always worked, we would be out of jobs, wouldn't we? ^^)
Pascal MARTIN
@Pascal true, but learning when and when not to use it will help him optimize his table structure for his **100%** optimal queries.
Gordon
@Gordon : I totally agree with the "when and when not to use it" :-)
Pascal MARTIN
+15  A: 

Some top tips from the MySQL Performance tips forge:

Specific Query Performance:

  1. Use EXPLAIN to profile the query execution plan
  2. Use Slow Query Log (always have it on!)
  3. Don't use DISTINCT when you have or could use GROUP BY

Insert Performance:

  4. Batch INSERT and REPLACE
  5. Use LOAD DATA instead of INSERT
  6. LIMIT m,n may not be as fast as it sounds
  7. Don't use ORDER BY RAND() if you have > ~2K records
  8. Use SQL_NO_CACHE when you are SELECTing frequently updated data or large sets of data
  9. Avoid wildcards at the start of LIKE queries
  10. Avoid correlated subqueries in SELECT and WHERE clauses (and try to avoid IN)
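
A couple of these tips can be shown concretely; a sketch with hypothetical tables:

```sql
-- Tip 4: one multi-row INSERT instead of three single-row statements
INSERT INTO log_entries (user_id, message)
VALUES (1, 'login'), (2, 'logout'), (3, 'login');

-- Tip 9: a leading wildcard prevents MySQL from using an index on name...
SELECT id FROM products WHERE name LIKE '%widget';
-- ...while a trailing wildcard still can:
SELECT id FROM products WHERE name LIKE 'widget%';
```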

Scaling Performance Tips:

  1. Use benchmarking
  2. Isolate workloads: don't let administrative work (e.g. backups) interfere with customer performance.
  3. Debugging sucks, testing rocks!
  4. As your data grows, indexing may change (cardinality and selectivity change), and your structure may need to change with it. Make your schema as modular as your code, make your code able to scale, and plan for and embrace change; get developers to do the same.

Network Performance Tips:

  1. Minimize traffic by fetching only what you need:
     1. Use paging/chunked data retrieval to limit result sizes
     2. Don't use SELECT *
     3. Be wary of lots of small quick queries if a single longer query can be more efficient
  2. Use multi_query if appropriate to reduce round-trips
  3. Use stored procedures to avoid bandwidth wastage
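
As a sketch of the first tip (hypothetical table): fetch only the columns you display, one page at a time, instead of everything:

```sql
-- Instead of: SELECT * FROM articles;
SELECT id, title, published_at
FROM articles
ORDER BY published_at DESC
LIMIT 0, 20;   -- first page; the next page starts at LIMIT 20, 20
```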

OS Performance Tips:

  1. Use proper data partitions
     1. For Cluster: start thinking about Cluster before you need it
  2. Keep the database host as clean as possible. Do you really need a windowing system on that server?
  3. Utilize the strengths of the OS
  4. Pare down cron scripts
  5. Create a test environment
Yada
nice answer to an incomplete question:)
Quamis
+2  A: 

Three things:

  1. Joins are not necessarily suboptimal. Oftentimes a schema that uses joins will be faster than one that achieves the same result while avoiding them. The important thing is to know that your joins are optimal. EXPLAIN is very helpful, but you also need to know how indexes work.

  2. If you're grabbing data from the DB on every page hit, consider whether a caching system would work for you. If so, check out PHP's memcache extension and memcached. It's easy to use in PHP and very fast. It's popular for a reason.

  3. Back to MySQL: make sure your key buffer is sized correctly. You can also think about using dedicated key buffers for critical indexes that should remain in cache. Read about CACHE INDEX and LOAD INDEX INTO CACHE.
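
A sketch of the dedicated key buffer idea, assuming a MyISAM table named orders (CACHE INDEX and LOAD INDEX INTO CACHE apply to MyISAM indexes):

```sql
-- Create a separate 128 MB key cache, assign the table's indexes
-- to it, and preload them so reads don't start cold:
SET GLOBAL hot_cache.key_buffer_size = 134217728;
CACHE INDEX orders IN hot_cache;
LOAD INDEX INTO CACHE orders;
```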

fsb
+1  A: 

"...because I take information from a few tables with a join"

Joins, even "big" joins, aren't bad. Just be sure that you have good indexes.

Also note that performance with a couple of records is a lot different than performance with hundreds of thousands of records, so test accordingly.
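
For example, a join is only as fast as the indexes on its join columns; a sketch with made-up tables:

```sql
-- Without an index on invoices.customer_id, MySQL scans the whole
-- invoices table for every matching customer:
SELECT c.name, i.total
FROM customers c
JOIN invoices i ON i.customer_id = c.id;

-- Adding one lets each customer's invoices be looked up directly:
CREATE INDEX idx_invoices_customer_id ON invoices (customer_id);
```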

Giovanni Galbo
So what can I do when I get hundreds or even thousands of records?
hey
Look into the explain plan, begin investigating other options, etc. My point is just to be cautious... Just because your code performs well under the (very) light load of development does not mean it will perform well in the real world.
Giovanni Galbo
A: 

For performance, this book is good: High Performance MySQL. The associated blog is good too.

fsb
A: 

My 2 cents: enable the slow query log with long_query_time set below 2 seconds, and use mysqlsla (get it from hackmysql.com) to analyse the 'slow' queries... This way you can drill down into the slower queries as they come along...

(mysqlsla can also benefit from the log-queries-not-using-indexes option)

  • On hackmysql.com there's also a script called 'mysqlreport' that gives estimates on how your installation is running (once it has been running a while) and gives pointers as to where to tune your setup more precisely...
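
As a sketch, the corresponding my.cnf fragment (MySQL 5.0-era option names; the log path is hypothetical) might look like:

```ini
[mysqld]
log-slow-queries              = /var/log/mysql/slow.log
long_query_time               = 2
log-queries-not-using-indexes
```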
Gekkie
A: 

Being perfect is a bit of a challenge and not the first target to set yourself.

Enable mysql logging of all queries, and write some code which parses the log files and removes any literal values from the SQL statements.

e.g. changes

SELECT * FROM atable WHERE something=5 AND other='splodgy';

and

SELECT * FROM atable WHERE something=1 AND other='zippy';

to something like:

SELECT * FROM atable WHERE something=:1 AND other=:2;

(Sorry, I've not got my code which does this to hand - but it's not rocket science)

Then shove the re-written log into a table so you can prioritize your performance fixes based on length and frequency of execution.

C.

symcbean