views: 79
answers: 2

Hello all,

I have a very large dataset (100,000 rows) to display, but every browser I tried it in, including Chrome 5 dev, chokes for dozens of seconds (Win7 64-bit, 4 GB RAM, 256 GB SSD, Core 2 Duo 2.4 GHz). I ran a little experiment:

some_controller.rb

def show
  @data = (1..100000).to_a
end

show.html.erb

<% @data.each do |d| %>
  <%= d.to_s %>
<% end %>

As simple as that, it chokes the browsers. I know browsers were never built for this, so I thought of letting the data come in chunk by chunk; I guess 2,000 per chunk is reasonable, but I wouldn't want to make 50 requests every time this view is loaded. Any ideas? It doesn't have to be chunk by chunk if it can be sent all at once.
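For reference, the chunk arithmetic described above is easy to check with `each_slice` (plain Ruby, no framework assumed):

```ruby
data = (1..100_000).to_a

# Split the dataset into fixed-size chunks of 2,000 items each.
chunks = data.each_slice(2_000).to_a

chunks.size        # => 50 requests per page load
chunks.first.size  # => 2000 items each
```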

Best,

A: 

One way to accomplish this would be to use render, passing it a proc. The snippet of code below is from the Rails documentation.

# Streams about 180 MB of generated data to the browser.
render :text => proc { |response, output|
  10_000_000.times do |i|
    output.write("This is line #{i}\n")
    output.flush
  end
}

http://api.rubyonrails.org/classes/ActionController/Base.html#M000658

Alex
Thanks for your answer. For some reason, this actually causes Firefox to crash (the "send report to Mozilla" dialog came up), whereas my original method makes it go white-screened, but it comes back eventually.
Nik
+1  A: 

I did see a plugin that allowed this sort of thing, but I can't remember what it was called ... will keep trawling my notes.

In the meantime, this is what pagination is for :p

If your use case doesn't support that, perhaps AJAX would streamline things ... you can use AJAX to load the data incrementally, either as the user scrolls or automatically.
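A sketch of what the slicing behind such an incremental-loading endpoint could look like (the helper name and chunk size here are assumptions, not from any particular plugin):

```ruby
CHUNK_SIZE = 2_000

# Return the nth zero-based chunk of the dataset; an empty array
# tells the client it has reached the end and can stop requesting.
def chunk_for(data, index)
  data[index * CHUNK_SIZE, CHUNK_SIZE] || []
end

data = (1..100_000).to_a
chunk_for(data, 0).first   # => 1
chunk_for(data, 49).last   # => 100000
chunk_for(data, 50)        # => []
```

In a controller, an action would call something like this with `params[:chunk].to_i` and render the slice as a partial; the AJAX side just increments the index until it receives an empty response.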

UPDATE: found the template streaming plugin: http://github.com/oggy/template_streaming

Toby Hede
Thanks for the suggestion. Yeah, I watched Ryan Bates's "Endless Page" episodes, too. -- I will have to test out how much my server can handle if the AJAX requests become too many...
Nik
With that many records, your server is probably already spending significant time rendering. Some judicious AJAX can actually reduce load as you can have smaller requests that complete in less time (and can be themselves cacheable). There will be a sweet spot between too many small requests and too few large ones ... best advice is to measure and see.
Toby Hede
Couldn't agree more; I am sort of counting on this after having used Google Reader for a couple of years now. I am guessing with 100,000 records, perhaps 1,000 ~ 2,000 per request wouldn't be so bad, i.e., 100 ~ 50 requests per load. Just out of curiosity, there isn't any compression mechanism on the server side that can compress this HTML output and let the browser decompress it, right?
Nik
Gzip - you can set up your server to gzip the output. It's handled at the server level, which means no impact on your Rails app.
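To get a rough feel for why gzip helps so much here: 100,000 numbered rows of markup are extremely repetitive, so deflate-style compression shrinks them dramatically. A quick sketch (the `<li>` row format is an assumption for illustration; in a real app the server, or Rack's `Rack::Deflater` middleware, would do this transparently):

```ruby
require 'zlib'

# Simulate the rendered list and see how small it deflates to.
html = (1..100_000).map { |i| "<li>#{i}</li>" }.join("\n")
deflated = Zlib::Deflate.deflate(html)

printf("raw: %d bytes, deflated: %d bytes (%.1f%% of original)\n",
       html.bytesize, deflated.bytesize,
       100.0 * deflated.bytesize / html.bytesize)
```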
Toby Hede
Thanks Toby, I'll look into Gzip then.
Nik