Quite simple, really: in a PHP/MySQL setup, I have to run around 2M point updates on 25M rows (where one update covers multiple primary keys). Is there a way to pull this off with the same sort of gains as extended inserts?

I'm using unbuffered queries already. It's slow.

NB: Searching came up with nothing. This is a dev environment, so anything goes, as this issue won't exist in prod.. yay for real servers.

A: 

Have you tried prepared statements?

The query only needs to be parsed (or prepared) once, but can be executed multiple times with the same or different parameters. [...] By using a prepared statement the application avoids repeating the analyze/compile/optimize cycle. This means that prepared statements use fewer resources and thus run faster.
VolkerK
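
As a sketch of what this looks like server-side in MySQL (the `items` table and `score` column are illustrative names, not from the question):

```sql
-- Prepare once: the statement is parsed and optimized a single time.
PREPARE upd FROM 'UPDATE items SET score = ? WHERE id = ?';

-- Execute many times with different parameters, paying only execution cost.
SET @s = 10, @id = 1;
EXECUTE upd USING @s, @id;

SET @s = 20, @id = 2;
EXECUTE upd USING @s, @id;

DEALLOCATE PREPARE upd;
```

In PHP you'd typically get the same effect through PDO or mysqli prepared statements rather than issuing PREPARE/EXECUTE yourself.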
+1  A: 

One popular way to update a bunch of records is to create a new table, insert lots of records into that table (use extended inserts), and then swap out the old table with the new one.

Alex
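
A rough sketch of the swap approach, assuming a simple `items(id, score)` table (names are illustrative). Note the new table must end up holding the full data set, not just the changed rows:

```sql
-- Build a replacement table with the same structure.
CREATE TABLE items_new LIKE items;

-- Extended (multi-row) insert: many rows per statement.
INSERT INTO items_new (id, score) VALUES
  (1, 10),
  (2, 20),
  (3, 30);

-- Swap the tables in one atomic statement, then drop the old copy.
RENAME TABLE items TO items_old, items_new TO items;
DROP TABLE items_old;
```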
Similarly, you can insert into a new table, as you outlined, and then update the existing table using a join.
Frank Farmer
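
A minimal sketch of that join-based variant, again with illustrative `items(id, score)` names: stage only the changed values with extended inserts, then apply them in one multi-table UPDATE.

```sql
-- Staging table holding just the new values.
CREATE TEMPORARY TABLE staged (id INT PRIMARY KEY, score INT);

-- Bulk-load the changes (multi-row insert).
INSERT INTO staged (id, score) VALUES
  (1, 10),
  (2, 20),
  (3, 30);

-- Apply all staged changes to the existing table in a single statement.
UPDATE items
JOIN staged ON staged.id = items.id
SET items.score = staged.score;
```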