- Let's assume you have a table with 1M rows, and it keeps growing...
- Every five minutes, around the clock, you run a Python program that has to update some fields of 50K rows.
My question is: what is the fastest way to do this work?
- Run the updates in a loop and only fire a commit on the connection after the last one has executed?
- Or generate a file of SQL statements and run it through the command-line client?
- Create a temporary table with one huge, fast insert and then run a single UPDATE against the production table? (rough sketch at the end)
- Use prepared statements?
- Split it into batches of 1K updates per execute, to keep the log files smaller? (see the executemany sketch at the end)
- Turn off logging while running the update?
- Or use a CASE-based UPDATE, as in some MySQL examples (but that seems to work only up to 255 rows)?
I don't know... Has anyone done something like this? What is the best practice? I need to run it as fast as possible...
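
To make the batching option concrete, here is a minimal sketch of what I mean by "1K updates per execute". It assumes the pymysql driver and a hypothetical table `my_table(id, field_a, field_b)`; the connection details and column names are placeholders, not my real schema.

```python
# Sketch of the "batched updates" idea, assuming the pymysql driver and a
# hypothetical table my_table(id, field_a, field_b).
import pymysql

BATCH_SIZE = 1000  # 1K updates per execute, as suggested above

def batched_update(rows):
    """rows: list of (field_a, field_b, id) tuples, roughly 50K of them."""
    conn = pymysql.connect(host="localhost", user="user",
                           password="secret", database="mydb")
    try:
        with conn.cursor() as cur:
            sql = "UPDATE my_table SET field_a = %s, field_b = %s WHERE id = %s"
            for i in range(0, len(rows), BATCH_SIZE):
                # executemany runs the UPDATE once per tuple in the chunk;
                # committing per chunk keeps each transaction (and its log) small.
                cur.executemany(sql, rows[i:i + BATCH_SIZE])
                conn.commit()
    finally:
        conn.close()
```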
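
And this is roughly what I mean by the temp-table variant: bulk-insert the new values into a temporary table, then apply them with one set-based UPDATE joined to the production table. Same assumptions as above (pymysql, made-up table and column names, placeholder column types).

```python
# Sketch of the "temp table + single UPDATE" idea, same assumptions
# (pymysql driver, hypothetical table my_table(id, field_a, field_b)).
import pymysql

def temp_table_update(rows):
    """rows: list of (id, field_a, field_b) tuples, roughly 50K of them."""
    conn = pymysql.connect(host="localhost", user="user",
                           password="secret", database="mydb")
    try:
        with conn.cursor() as cur:
            # Temporary table exists only for this session; types are guesses.
            cur.execute("""
                CREATE TEMPORARY TABLE tmp_updates (
                    id INT PRIMARY KEY,
                    field_a VARCHAR(255),
                    field_b INT
                )
            """)
            # One big multi-row insert into the temp table.
            cur.executemany(
                "INSERT INTO tmp_updates (id, field_a, field_b) VALUES (%s, %s, %s)",
                rows)
            # Single set-based UPDATE against the production table.
            cur.execute("""
                UPDATE my_table m
                JOIN tmp_updates t ON t.id = m.id
                SET m.field_a = t.field_a,
                    m.field_b = t.field_b
            """)
            conn.commit()
    finally:
        conn.close()
```

Is the second approach likely to beat 50K individual UPDATEs, or does the insert into the temp table just move the cost around?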