Hello there,

I'm by no means a JS developer (in fact, hardly a developer at all :) but I wanted to try to write a small Greasemonkey script to hide a few elements on a certain webpage. Once I started dabbling with it, I decided to use jQuery, just to make things easier. Everything went well and the script works, but now that it's ready, I've started to wonder about the details.

As part of the script, I need to find a specific element and hide it, together with its previous and next siblings. I've ended up with a not-very-readable but working line:

$('div#some-weird-box').prev().toggle(w).next().toggle(w).next().toggle(w);

My concern here is that I'm hiding three separate divs in three separate steps. This would cause the browser to "repaint" the page three times, right? It's not a problem with three divs, but it would probably start to matter with more elements. So, my question is: is there a way to tell the browser "stop refreshing/redrawing the page"? If there is, I could use that, hide an arbitrary number of elements, and then ask the browser to update the screen.
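
For reference, here is a slightly more readable version of the same line that I could use instead (assuming w is the boolean show/hide flag I use elsewhere in the script):

    var $box = $('div#some-weird-box');
    // build one jQuery set holding the previous sibling, the box itself,
    // and the next sibling, then toggle them all in a single call
    $box.prev().add($box).add($box.next()).toggle(w);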

+2  A: 

You can use cloneNode. After cloning, you can do all your manipulations on the clone and then replace the original node. This will prevent your content from blinking, as it might with the display:none approach J-P proposed.
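
A minimal sketch of that idea in plain DOM terms (the id here is just a placeholder for whatever element you are working with):

    var original = document.getElementById('some-weird-box');
    var clone = original.cloneNode(true);   // deep copy, not attached to the page

    // do all of your manipulation on the detached clone;
    // none of this touches the live document
    clone.style.display = 'none';

    // swap the clone in for the original in one DOM operation
    original.parentNode.replaceChild(clone, original);

Note that handlers attached to the original node will not survive the swap and would need to be reattached to the clone (the answer below makes the same point).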

Eldar Djafarov
+4  A: 

JavaScript execution blocks the browser, and you will not see a repaint until your code has finished executing. You really have nothing to worry about here.

I posted a good example of this on jsbin.com: http://jsbin.com/ecuku/edit
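
The gist of the demo is roughly this (a sketch from memory, not the exact jsbin code):

    var box = document.getElementById('box');

    box.style.background = 'red';          // you never actually see red...
    var end = new Date().getTime() + 2000;
    while (new Date().getTime() < end) {}  // ...busy-wait for two seconds...
    box.style.background = 'green';        // ...only this final state is painted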

Update: It is often suggested to modify nodes outside of the DOM because modifying the DOM causes reflows (not repaints). A reflow is when the browser has to recalculate the positions and sizes of elements on your page because something has changed. While your JavaScript execution can cause multiple reflows, it will only cause one repaint (when your code finishes). Those reflows can be a big performance hit, but for a small number of DOM changes (e.g. your code only makes 3) it probably isn't worth the work to make your changes outside of the page. Also, cloning a node and modifying it outside of the page before inserting it back can have unexpected consequences. For example, if you had event handlers attached, they would need to be reattached to the new nodes.
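
For completeness, the "modify outside of the page" pattern usually looks something like this (a sketch only; as said above, it is rarely worth it for a handful of changes):

    var node = document.getElementById('some-weird-box');
    var parent = node.parentNode;
    var next = node.nextSibling;       // remember where to re-insert it

    parent.removeChild(node);          // one reflow to detach

    // any number of changes are cheap now: the node is out of the document
    node.style.display = 'none';
    // ...more changes...

    parent.insertBefore(node, next);   // one reflow to re-attach

Unlike cloning, detaching and re-inserting the same node keeps its event handlers intact.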

Prestaul
Interesting. Is there any good read on this that you can point me to? If that's the case, why do I see all that advice to remove an element from the tree, apply changes, then bring it back? http://www.slideshare.net/nzakas/writing-efficient-javascript - slide 77 or 88...?
yacoob
That's a pretty bold blanket statement. You can't say that will be the case 100% of the time.
Allen
I didn't make a blanket statement. I made a statement pertaining to the code that he posted. He has nothing to worry about with this. He will only see one repaint in any browser, and when modifying 3 nodes he should not clone the node and modify it outside of the page.
Prestaul
@yacoob, I responded to your comment in my answer.
Prestaul
Thanks, much appreciated. I've found http://www.stubbornella.org/content/2009/03/27/reflows-repaints-css-performance-making-your-javascript-slow/ but I probably should look for even more information...
yacoob
@Allen: it's what every JavaScript engine I've ever used does, including V8 (Chrome), which some consider the latest and greatest in JS engine innovation. A repaint is the most expensive operation a browser performs, so optimizing by waiting until the execution context completes is the smartest thing to do. I would be surprised if a vendor shipped code that actually performed repaints in the middle of an execution context: pages would grind to a halt with every `style.foo = 'bar'` assignment inside a loop.
Crescent Fresh