I'm using FasterCSV to import an uploaded file into a model, and it works great for small files. However, when I try to import a large dataset (21,000 lines) it takes ages, and I get browser timeouts on the live server.
This is my current working code:
logcount = 0
Attendee.transaction do
  FCSV.new(file, :headers => true).each do |row|
    row[1] = Date.strptime(row[1], '%m/%d/%Y')
    record = @event.attendees.new(:union_id => row[0], :dob => row[1], :gender => row[2])
    logcount += 1 if record.save
  end
end
I'd love to use a background process, but the user needs to see how many lines were imported before they can move to the next step of the system.
So, I was thinking I should chunk the work: read only a smaller number of lines per request, keep a counter, update the view with some kind of progress, then run the method again using the previous counter as the start point.
I can't see how to get FasterCSV to read only a set number of lines, or how to set an offset for the starting point.
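Something like the sketch below is what I have in mind. The `import_chunk` helper and its `offset`/`limit` parameters are my own placeholder names, and it uses the stdlib `CSV` (which FasterCSV became in Ruby 1.9) with a plain counter-based skip, since FasterCSV itself has no offset option that I know of:

```ruby
require 'csv'       # FasterCSV was merged into Ruby 1.9's stdlib as CSV
require 'stringio'

# Hypothetical helper: yields up to `limit` data rows starting at row `offset`
# (header row excluded) and returns the number of rows processed.
def import_chunk(io, offset, limit)
  processed = 0
  CSV.new(io, :headers => true).each_with_index do |row, index|
    next if index < offset        # skip rows handled by earlier chunks
    break if processed >= limit   # stop once this chunk is full
    yield row
    processed += 1
  end
  processed
end

# Example with a small in-memory CSV standing in for the uploaded file:
data = "union_id,dob,gender\n" +
       (1..10).map { |i| "#{i},01/15/1990,M" }.join("\n")
ids = []
count = import_chunk(StringIO.new(data), 5, 3) { |row| ids << row['union_id'] }
# count is 3; ids is ["6", "7", "8"]
```

The obvious downside is that each chunk still re-reads and discards all the earlier lines, so later chunks get progressively slower; a byte offset via `io.seek` would avoid that, but only if rows can be located reliably.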
Does anyone know how to do this? Or is there a better way to handle this?