Hi folks,
I'm using Python to save the data row by row, but it's extremely slow!
The CSV contains 70 million lines, and my script only manages to store about 1,000 rows per second.
This is what my script looks like:
import csv

with open('test_results.csv', newline='') as f:
    for row in csv.reader(f):
        TestResult(type=row[0], name=row[1], result=row[2]).save()  # one .save() (one INSERT) per row
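One thing I was wondering about is whether batching the inserts would help instead of calling save() on every row. Here's a rough sketch of what I had in mind, assuming TestResult is a Django model (so objects.bulk_create is available); the import path and batch size are placeholders I made up:

import csv
from myapp.models import TestResult  # placeholder import path

BATCH_SIZE = 5000  # arbitrary; I'd have to tune this

def import_results(path):
    batch = []
    with open(path, newline='') as f:
        for row in csv.reader(f):
            # build unsaved instances instead of hitting the DB once per row
            batch.append(TestResult(type=row[0], name=row[1], result=row[2]))
            if len(batch) >= BATCH_SIZE:
                TestResult.objects.bulk_create(batch)
                batch = []
    if batch:
        TestResult.objects.bulk_create(batch)  # flush the final partial batch

Would something like this be the right direction, or is a database-level import the way to go?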
I reckon that for this kind of testing I might have to consider MySQL or PostgreSQL.
Any ideas or tips? This is the first time I've dealt with such a massive volume of data. :)