I'm loading data from my database and I'm doing a sum calculation with a group by.

ElectricityReading.sum(:electricity_value, :group => "electricity_timestamp", :having => ["electricity_timestamp = '2010-02-14 23:30:00'"])

My data sets are extremely large (100k rows and up), so I was wondering if it's possible to use find_each to batch this and reduce the memory overhead.

I could write the batching manually using limit and offset, I guess, but I'd like to avoid that if the code already exists.
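For what it's worth, the manual limit/offset loop you describe can be sketched in plain Ruby. Here `fetch_rows` is a hypothetical stand-in for a `limit`/`offset` database query (the rows below are made-up sample data), and the grouped totals are accumulated incrementally so only one batch is in memory at a time:

```ruby
# Made-up sample data standing in for the electricity_readings table.
ROWS = [
  { timestamp: "2010-02-14 23:30:00", value: 10 },
  { timestamp: "2010-02-14 23:30:00", value: 5 },
  { timestamp: "2010-02-15 00:00:00", value: 7 },
  { timestamp: "2010-02-14 23:30:00", value: 2 },
]

# Hypothetical stand-in for Model.limit(limit).offset(offset):
# returns up to `limit` rows starting at `offset`.
def fetch_rows(limit, offset)
  ROWS[offset, limit] || []
end

# Accumulate a grouped sum one batch at a time, so memory use is
# bounded by the batch size rather than by the full result set.
def batched_group_sum(batch_size: 2)
  totals = Hash.new(0)
  offset = 0
  loop do
    batch = fetch_rows(batch_size, offset)
    break if batch.empty?
    batch.each { |row| totals[row[:timestamp]] += row[:value] }
    offset += batch_size
  end
  totals
end

puts batched_group_sum.inspect
```

This is only a sketch of the memory pattern; against a real table you would also want a stable ORDER BY so the offsets don't skip or repeat rows.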

A: 

From http://railsforum.com/viewtopic.php?pid=88198#p88198

@categories = Categories.find(:all, :joins => :animals,
                          :select => "categories.*, SUM(animals.weight) as weight_sum",
                          :group => "categories.id")
# ATTENTION: weight_sum is now a temporary attribute of the categories returned!
# and the animals are NOT eager-loaded
<% @categories.each do |c| %>
  Category: <%= c.name %><br />
  Sum of Weight in this category: <%= c.weight_sum %><br />
<% end %>

It isn't ActiveRecord's sum, but it should do the trick.
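If it helps to see what that query hands back, the shape of the result (each category record carrying a computed weight_sum) can be imitated in plain Ruby. The category and animal data below are made up purely for illustration:

```ruby
# Illustrative in-memory data standing in for the categories and animals tables.
CATEGORIES = [
  { id: 1, name: "Birds" },
  { id: 2, name: "Mammals" },
]
ANIMALS = [
  { category_id: 1, weight: 0.5 },
  { category_id: 1, weight: 1.5 },
  { category_id: 2, weight: 70.0 },
]

# Rough equivalent of the join + SUM(animals.weight) GROUP BY categories.id:
# each result has the category's own fields plus a computed weight_sum,
# just like the temporary attribute the SELECT clause produces.
def categories_with_weight_sum
  CATEGORIES.map do |cat|
    weights = ANIMALS.select { |a| a[:category_id] == cat[:id] }
    cat.merge(weight_sum: weights.sum { |a| a[:weight] })
  end
end

puts categories_with_weight_sum.inspect
```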

kwerle