I have a controller in Rails that generates CSV reports using FasterCSV. These reports will contain approximately 20,000 rows, maybe more.

It takes 30 seconds or more to create the csv_string in my implementation below. Is there a better/faster way to export the data? Is there any way to output the data without holding it all in memory in csv_string?

My current implementation is as follows:

@report_data = Person.find(:all, :conditions => ["age > ?", params[:age]])
csv_string = FasterCSV.generate do |csv|
  @report_data.each do |e|
    csv << [e.id, e.name, e.description, e.phone, e.address]
  end
end
send_data csv_string, :type => "text/csv",
  :filename => "report.csv", :disposition => 'attachment'
A: 

I would try using find_in_batches, to avoid instantiating that many ActiveRecord objects in memory at once.

http://ryandaigle.com/articles/2009/2/23/what-s-new-in-edge-rails-batched-find

I believe that should help quite a bit; creating and holding many ActiveRecord objects in memory is slow.
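
A minimal sketch of the idea, using Ruby's standard CSV library (the successor to FasterCSV) and `each_slice` over an in-memory array to stand in for `find_in_batches`; in the real controller the outer loop would be `Person.find_in_batches(:batch_size => 1000) { |batch| ... }`, and `Person` here is just a hypothetical stand-in struct:

```ruby
require 'csv'

# Stand-in for the ActiveRecord model, so the sketch runs on its own.
Person = Struct.new(:id, :name, :description, :phone, :address)
people = (1..5).map { |i| Person.new(i, "Name #{i}", "desc", "555-000#{i}", "addr") }

# Append rows to the CSV one batch at a time instead of materializing
# all 20,000 records first. With Rails, replace each_slice with
# Person.find_in_batches so only one batch of AR objects exists at once.
csv_string = CSV.generate do |csv|
  people.each_slice(1000) do |batch|
    batch.each do |e|
      csv << [e.id, e.name, e.description, e.phone, e.address]
    end
  end
end

puts csv_string
```

The batching only bounds how many ActiveRecord objects are alive at a time; the generated string still grows with the row count, so for very large exports you would also want to stream the output rather than build one big string.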

Ben