Hello.
I am having problems with a Python script that analyses a CSV file line by line and inserts each row into a MySQL table using a for loop:
# imports at the top of the script
import csv
import sys

f = csv.reader(open(filePath, "r"))
i = 1
for line in f:
    if i > skipLines:
        vals = nullify(line)
        try:
            cursor.execute(query, vals)
        except TypeError:
            # Python 2: discard the exception state and move on to the next row
            sys.exc_clear()
    i += 1
return
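(nullify is just a small helper of mine that swaps empty CSV fields for None so they go in as NULL; a simplified version:)

def nullify(row):
    # simplified sketch of my helper: empty fields become None,
    # which MySQLdb inserts as NULL
    return [None if field == "" else field for field in row]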
The query itself is of the form:
query = ("insert into %s" % tableName) + (" values (%s)" % placeholders)
This works perfectly with every file it is used for, with one exception: the largest file. It stops at a different point each time, sometimes after 600,000 records, sometimes after 900,000, out of roughly 4,000,000 in total.
I can't figure out why it is doing this. The table type is MyISAM, there is plenty of disk space available, and the table has only reached about 35MB when it stops. max_allowed_packet is set to 16MB, but I don't think that is the problem, since the script inserts one row at a time?
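A quick way to double-check what the server is actually using (in case my config file is being overridden):

cursor.execute("SHOW VARIABLES LIKE 'max_allowed_packet'")
print(cursor.fetchone())   # e.g. ('max_allowed_packet', 16777216)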
Does anyone have any ideas what this could be? I'm not sure whether Python, MySQL, or the MySQLdb module is responsible.
Thanks in advance.