+4  A: 

10,000 records is big, but not massive in MySQL terms, and the table is simple enough that I don't think you need any optimisation. If the data in the table is reliable and your .csv is always well formed, there's not a lot to go wrong.

The separate issue is whether your import process is throwing errors. If there is even the remotest chance that the .csv could contain incorrect column references, lost commas, etc., then your idea to test everything in a temp table is certainly a good one.
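
For what it's worth, the temp-table test doesn't have to be much code. Here's a rough sketch in PHP/PDO; the table names, the sanity check, the file path and the credentials are all made up, so swap in your own:

    <?php
    // Rough sketch: load the .csv into a staging copy of the table first,
    // sanity-check it, and only touch the live table if everything looks right.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_LOCAL_INFILE => true, // also needs local_infile enabled on the server
    ]);

    // Staging table with the same structure as the live one (names are made up).
    $pdo->exec("DROP TABLE IF EXISTS products_staging");
    $pdo->exec("CREATE TABLE products_staging LIKE products");

    // Pull the whole .csv into the staging table.
    $pdo->exec("LOAD DATA LOCAL INFILE '/path/to/import.csv'
                INTO TABLE products_staging
                FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
                LINES TERMINATED BY '\\n'
                IGNORE 1 LINES");

    // Whatever checks make sense for your data - this one is just an example.
    $bad = $pdo->query("SELECT COUNT(*) FROM products_staging
                        WHERE sku IS NULL OR sku = ''")->fetchColumn();

    if ($bad == 0) {
        // Staged data looks sane, so copy it into the live table.
        $pdo->exec("INSERT INTO products SELECT * FROM products_staging");
    }
    $pdo->exec("DROP TABLE products_staging");

That way a malformed .csv only ever ruins the staging table, never the real one.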

The only other things I can think of are (in order of neuroticism):

  • Perform this operation overnight or whenever your site is unused
  • Have the PHP script catch errors and email you the results of each run
  • Have the script back up the table, run the .csv import, check for errors, and if there are errors, email you and restore the backup (roughly sketched after this list)
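
The third option is also less code than it sounds. A very rough sketch, again PHP/PDO with made-up table names, path and email address, and with run_csv_import() standing in for whatever your actual import routine is:

    <?php
    // Rough sketch: snapshot the table, run the import, and on any error
    // email the details and put the snapshot back.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);

    // 1. Back up the live table.
    $pdo->exec("DROP TABLE IF EXISTS products_backup");
    $pdo->exec("CREATE TABLE products_backup LIKE products");
    $pdo->exec("INSERT INTO products_backup SELECT * FROM products");

    try {
        // 2. Run your normal .csv import here - placeholder for your own routine.
        run_csv_import($pdo, '/path/to/import.csv');

        // 3. Whatever post-import check you trust, e.g. a row count that makes sense.
        $count = $pdo->query("SELECT COUNT(*) FROM products")->fetchColumn();
        if ($count < 1) {
            throw new RuntimeException('Import ran but the table looks empty');
        }

        mail('you@example.com', 'CSV import OK', "Table now has $count rows");
        $pdo->exec("DROP TABLE products_backup"); // happy path: backup no longer needed
    } catch (Exception $e) {
        // 4. Something went wrong: restore the snapshot and report it.
        $pdo->exec("DROP TABLE IF EXISTS products");
        $pdo->exec("RENAME TABLE products_backup TO products");
        mail('you@example.com', 'CSV import FAILED', $e->getMessage());
    }

RENAME TABLE is just a metadata change, so the restore step is quick even on a reasonably large table.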

Hope any of that helps!

hfidgen
Ahhh yes, I need to add bullets 2 and 3 from your list. Thanks, that helped a bunch.
picxelplay