My PHP app has an import script that imports records.
At the moment it is importing from a CSV file: it reads each line with fgetcsv, one line at a time, does a lot of processing on that record (including database queries), and then moves on to the next line. It shouldn't need to keep accumulating memory as it goes.
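In simplified form, the loop looks like this (processRecord() is just a stand-in for the real per-record work):

    <?php

    // Placeholder for the real per-record work: string comparisons,
    // diffs, database queries, etc.
    function processRecord(array $row): void
    {
    }

    $handle = fopen('import.csv', 'r');
    if ($handle === false) {
        exit('Could not open CSV file');
    }

    // One row at a time - nothing from a row should outlive its iteration.
    while (($row = fgetcsv($handle)) !== false) {
        processRecord($row);
    }

    fclose($handle);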
After around 2,500 records have been imported, PHP dies with a fatal error saying it has exceeded its memory limit (around 132 MB).
The CSV file itself is only a couple of megabytes; the other processing does a lot of string comparisons, diffs, etc. There is a huge amount of code operating on each record, so it would be difficult to come up with a 'smallest reproducing sample'.
What are some good ways to go about finding and fixing such a problem?
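One diagnostic I can think of is sampling memory_get_usage() as the loop runs, to see whether usage climbs steadily or jumps at particular rows. A rough sketch (again, processRecord() is a placeholder for the real processing):

    <?php

    function processRecord(array $row): void
    {
        // stand-in for the real processing
    }

    $handle = fopen('import.csv', 'r');
    $count  = 0;

    while (($row = fgetcsv($handle)) !== false) {
        processRecord($row);

        // Sample memory every 100 rows; steady growth here suggests
        // something is retaining data across iterations.
        if (++$count % 100 === 0) {
            printf(
                "row %d: %.1f MB in use, %.1f MB peak\n",
                $count,
                memory_get_usage(true) / 1048576,
                memory_get_peak_usage(true) / 1048576
            );
        }
    }

    fclose($handle);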
Cause of problem found
I have a debug class that logs all my database queries during runtime, so those SQL strings, some of them 30 KB long, were staying in memory. I realise this isn't suitable for scripts designed to run for a long time.
There may be other sources of memory leaks, but I am fairly sure this is the cause of my problem.
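In essence, the leak was an unbounded in-memory log. A hypothetical sketch of the pattern and one possible fix (class and method names are illustrative, not my actual code):

    <?php

    // The leaking pattern: a debug logger that keeps every query
    // string for the lifetime of the script. At ~30 KB per query,
    // thousands of records add up to the memory limit fast.
    class QueryLogger
    {
        /** @var string[] grows without bound */
        private array $queries = [];

        public function log(string $sql): void
        {
            $this->queries[] = $sql;
        }
    }

    // One possible fix for long-running scripts: write each query
    // straight to a file instead of accumulating it in memory.
    class StreamingQueryLogger
    {
        /** @var resource */
        private $handle;

        public function __construct(string $path)
        {
            $this->handle = fopen($path, 'a');
        }

        public function log(string $sql): void
        {
            fwrite($this->handle, $sql . "\n");
        }
    }

Writing each query out as it happens keeps memory usage flat no matter how many records the script processes.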