I was working on this carousel thing using iCarousel. It works pretty nicely with a few images, using MooTools. However, when I added 70 pictures (around 30 KB each) it stopped working so nicely. Having spent some time poking around in the iCarousel code, I found it looked pretty decent and stable. So this got me thinking: is the problem inherent to the script (which animates divs inside an overflow:hidden div), to MooTools, to Firefox on Ubuntu, or is it that JavaScript simply cannot handle too much? If so, how much is too much?

I guess it's hard to say, but it would be really useful to know when JavaScript becomes sluggish and unusable, preferably before starting to develop.

+12  A: 

Looking at the sample code, I noticed something like this being done:

 $("thumb0").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(0)});  
 $("thumb1").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(1)});  
 $("thumb2").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(2)});  
 $("thumb3").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(3)});  
 $("thumb4").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(4)});  
 $("thumb5").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(5)});  
 $("thumb6").addEvent("click", function(event){new Event(event).stop();ex6Carousel.goTo(6)});

If you have 70 images (and thus 70 thumbnails, I'm assuming), this wires 70 separate events to the DOM. This is really not necessary and is most likely the cause of the "sluggish" behavior you are observing. I suggest you read up on event delegation and why it is a good thing. :) If you are using jQuery, this is easily done with the live function; I would imagine MooTools has something similar. If it doesn't, though, it is fairly easy to rig up. I have discussed event delegation in the past here and here.
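For MooTools (which the question uses), a minimal delegation sketch might look something like this; the container id thumbs and the img.thumb selector are assumptions about your markup, not part of iCarousel:

// One click handler on the container instead of one per thumbnail.
$("thumbs").addEvent("click", function(event){
    var target = $(event.target);
    if (!target.match("img.thumb")) return;   // ignore clicks that aren't on a thumbnail
    event.stop();
    // the thumbnail's position among its siblings doubles as the carousel index
    var index = $("thumbs").getElements("img.thumb").indexOf(target);
    ex6Carousel.goTo(index);
});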

Paolo Bergantino
Great answer. Yes, this would be a great time to use jQuery's "live" events.
Nosredna
A question pops into my head: live is event delegation in jQuery, so it makes just one binding, right? What about $(".thumbs").click()? If there are 100 thumbs, will it bind 100 times?
xandy
@xandy: Doing $(selector).click() makes jQuery find every single element that matches the selector and create an event for it. So, yes, if there are 100 elements with a class of .thumbs, $('.thumbs').click() will bind 100 times, which is why live() is so very awesome (although it gets a lot more credit for simply being able to bind to not-yet-existing elements, I use it mostly for event delegation).
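Roughly, the difference looks like this (jQuery 1.3-era, with a hypothetical handler):

// .click() binds one handler per matched element;
// .live() installs a single delegated handler at the document level instead.
function onThumbClick(){ /* ... */ }
$('.thumbs').click(onThumbClick);         // 100 matches -> 100 bindings
$('.thumbs').live('click', onThumbClick); // one delegated binding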
Paolo Bergantino
You could also just bind to the parent element and use event bubbling. That way you only need one event handler. This is kind of like the `live` event, but with a bit more flexibility. I once reduced the running time of a script from ~20 secs to ~3 secs just by using event bubbling instead of per-element event bindings.
cdmckay
cdmckay: That's exactly what event delegation is :)
Paolo Bergantino
Removing the individual bindings and using $$ for general event delegation solved the sluggishness: $$("img.thumb").addEvent("click", function ( event ) {... I also discovered some other really unnecessary stuff, like duplication of the animating layer, i.e. 70 pictures becoming 140. So the moral of the story is that binding a lot of events to the DOM is not a good idea.
Reed Richards
A: 

It depends a lot on the browser you use. A 50-line form could kill Internet Explorer and at the same time be perfectly usable in Opera or Firefox. So scaling can be very browser-dependent.

anno
+3  A: 

JavaScript, as a language, scales fine. As well as any other Turing-complete language, in fact.

However, the environment in which JavaScript is most often run, the web browser, is an unwieldy and unpredictable place, and your code may have to run on anything from a Pentium III running IE6 over a 56K connection in Cambodia to a quad-core EE running Firefox 3.5 patched directly into an OC-12. The JavaScript interpreters embedded in browsers are drastically different in speed.

Writing performant Javascript is really more a process of attempting to exploit the predictable strengths of the browsers your users are likely to be on, or at least avoiding their known weaknesses. It's about creating the illusion of speed by rendering and loading pages progressively, and making intelligent use of AJAX to aid responsiveness.

Scalability is hopelessly tied to the specific system on which your JavaScript is running, more than to the language itself.

Besides, you don't really have a choice! JavaScript is one standard you should expect to be around for a long, long time, no matter what its shortcomings.

By the way, what didn't "scale" in your example was your bandwidth. Downloading 2.1 MB of images is going to slow the carousel down no matter how fast the JavaScript is.
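One common way to soften that hit is to defer the downloads until a slide is actually needed. A rough sketch, assuming a hypothetical data-src attribute on each thumbnail instead of src:

// Lazy-loading sketch: the real download only starts when the slide is requested.
function loadThumb(img) {
    var pending = img.getAttribute("data-src");
    if (!pending) return;                 // already loaded (or no deferred source)
    img.setAttribute("src", pending);     // browser starts fetching the image now
    img.removeAttribute("data-src");
}

Call it for the current slide and its neighbours whenever the carousel moves.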

Triptych
+4  A: 

In addition to the other answers, I will point out from hard experience that manipulating DOM elements in the browser to animate them is really, really slow compared to a direct drawing API like the one the canvas tag provides. There's a lot of overhead, not only in the bridge between the JavaScript and the DOM, but also in the browser running a content reflow routine every time an attribute is changed.

There are ways around some of this, such as ensuring that your animated elements have position:absolute set in their CSS (thus taking them out of the document flow and eliminating the need to reflow) and ensuring in your code that you only touch the DOM when something actually needs to change.
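For example, with MooTools and a hypothetical selector for the carousel images, something like:

// Take the animated images out of the document flow so moving them
// does not force a reflow of the surrounding content.
$$("#example_6_frame img").setStyle("position", "absolute");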

A DOM DocumentFragment can be manipulated off-screen and then inserted into the document in one go, and that's much faster than manipulating a part of the DOM that is displayed on the screen.
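A rough sketch with the plain DOM API (the file names are made up):

// Build all the thumbnails off-screen, then touch the live DOM once.
var fragment = document.createDocumentFragment();
for (var i = 1; i <= 70; i++) {
    var img = document.createElement("img");
    img.src = "images/thumb" + i + ".jpg";   // hypothetical file names
    img.className = "thumb";
    fragment.appendChild(img);
}
document.getElementById("example_6_frame").appendChild(fragment);   // one insertion, one reflow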

And then there's the event handling issue other people have mentioned.

Can JavaScript scale? Yes, it can. Can the browser DOM scale? Perhaps not. Maybe future browsers will be better at this.

Breton
A: 

Doing it line by line with explicit events and params is just silly; you should consider refactoring it a little. Remove the explicit IDs and set up a class-based handler, or at the very least target the parent id and find the images to attach to as children, something like this:

<div id="example_6_frame">
 <ul>
  <li><a href="#"><img src="images/ex6_1t.jpg" alt="thumbnail 1" /></a></li>
  <li><a href="#"><img src="images/ex6_2t.jpg" alt="thumbnail 2" /></a></li>
  <li><a href="#"><img src="images/ex6_3t.jpg" alt="thumbnail 3" /></a></li>
  <li><a href="#"><img src="images/ex6_4t.jpg" alt="thumbnail 4" /></a></li>
  <li><a href="#"><img src="images/ex6_5t.jpg" alt="thumbnail 5" /></a></li>
  <li><a href="#"><img src="images/ex6_6t.jpg" alt="thumbnail 6" /></a></li>
  <li><a href="#"><img src="images/ex6_7t.jpg" alt="thumbnail 7" /></a></li>
 </ul>
</div>

the js bits...

$("example_6_frame").getElements("img").each(function(el, i) {
    el.addEvents({
        click: function(e) {
            ex6Carousel.goTo(i);
        } 
    }); 
});

Here's also a MooTools liveEvent implementation, http://dev.k1der.net/dev/live-events-pour-moootools/ - you can adapt it easily to the above...

Element.implement({
    addLiveEvent: function(event, selector, fn){
        // a single handler on this element: fn only runs when the event
        // originated from a descendant that matches the selector
        this.addEvent(event, function(e){
            var t = $(e.target);
            if (!t.match(selector)) return false;
            fn.apply(t, [e]);
        }.bindWithEvent(this, selector, fn));
    }
});
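
Usage would then be along these lines (the index lookup is an assumption about your markup, not from the linked post):

$("example_6_frame").addLiveEvent("click", "img", function(e){
    e.stop();
    // inside the handler, `this` is the clicked image (see fn.apply(t, [e]) above)
    var index = $("example_6_frame").getElements("img").indexOf(this);
    ex6Carousel.goTo(index);
});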

Just keep in mind that if the script uses the MooTools Scroll.js to find and scroll to elements, you may have to leave the specific ids in after all. Having said that, this iCarousel is from the same bloke that wrote iMask, so it ought to be well written...

Dimitar Christoff
A: 

JavaScript scales horizontally quite well through XHR. What you have is a particular application that does not scale well. There are certain parts of JavaScript in the browser that do slow things down; in this case, it's the number of events being handled.

James M.