views:

44

answers:

4

Hi!

I'm working on a high-traffic web application that relies on a growing number of JavaScript-based plugins. Keeping front-end performance high is a major concern for me, so I want to rearchitect the way plugins and page-specific plugin configurations are included in the page.

Currently, some plugins are merged into a monolithic plugins.js file while others are loaded as individual files. Configuration is done in inline script blocks inside the template files. I believe plugin management is getting out of hand, hurting page load time and performance.

I'm considering the following approach: split the plugins into individual files, and also write page-specific configurations (like assigning event handlers or attaching tags) in separate files. Then I intend to have a script, probably based on Sprockets, that runs at deployment time, merges each set of plugins with their configuration files, and outputs a single file.

My input would be:

  • plugin1.js
  • plugin1.conf.js
  • plugin2.js
  • plugin2.conf.js

The output would include all the required files for a specific page.

  • page1.js = plugin1.js + plugin1.conf.js + ...

I believe this is a good approach considering that more people are editing the configurations and constantly adding more plugins.

Of course, I intend to use a map to match required plugins with specific templates in order to avoid downloading more JavaScript than necessary.
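To make the idea concrete, here is a minimal sketch of such a page-to-plugin map and the deploy-time packing step. This is not Sprockets itself, just an illustration in Node.js; all file names and the `pack` helper are hypothetical, and the sources are inlined so the sketch runs standalone (a real build would read them from disk).

```javascript
// build-sketch.js — hypothetical deploy-time packer (names illustrative).
// A page-to-plugin map, plus a pack() step that concatenates each page's
// plugin and config sources into a single bundle string per page.

const pages = {
  'page1.js': ['plugin1.js', 'plugin1.conf.js'],
  'page2.js': ['plugin2.js', 'plugin2.conf.js'],
};

// In a real build these would be fs.readFileSync calls on source files.
const sources = {
  'plugin1.js': 'function plugin1() {}',
  'plugin1.conf.js': 'plugin1.defaults = { speed: 200 };',
  'plugin2.js': 'function plugin2() {}',
  'plugin2.conf.js': 'plugin2.defaults = {};',
};

function pack(map, files) {
  const out = {};
  for (const [bundle, parts] of Object.entries(map)) {
    // Join with a defensive semicolon so files that omit a trailing
    // semicolon don't break each other when concatenated.
    out[bundle] = parts.map((f) => files[f]).join('\n;\n');
  }
  return out;
}

const bundles = pack(pages, sources);
console.log(Object.keys(bundles)); // one output bundle per page
```

In a real Sprockets setup the map would instead live as `//= require` directives in a manifest file per page, but the shape of the problem is the same.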

Is this a sound approach? Do you see any issues with it?

Has anyone else tackled this kind of problem?

+1  A: 

Be careful with browser caching of page1.js: either name it dynamically at packing time, e.g. page1-1002231846.js, or make sure to invalidate caches with the correct headers.

Don't make too many variants; generalize instead. Browser caching will probably save more time than a few fewer KB to load.

Identify scripts and modules that are common to most pages and bake them together into a common.js.

Put as much JavaScript as possible at the bottom of the page; this helps rendering time a lot on JavaScript-heavy sites. It defers the scripts until everything else is set up, resulting in a snappier DOM and a better user experience during loading.

Use Chrome Speed Tracer to figure out rendering bottlenecks.

Xipe
+1  A: 

Short answer

Measure the current site, and identify the bottlenecks.

Long answer

Hopefully some people with experience will chime in with more useful answers, but:

As some yahoo called Jeff Atwood mentioned in his blog post The Computer Performance Shell Game, performance optimisation is figuring out which resource is the bottleneck, and eliminating that bottleneck. The resources are:

  • Processor
  • Memory
  • Disk
  • Network

By producing different JavaScript file configurations for each page, I guess you’re looking to save on:

  1. Processor (less JavaScript to run on each page)
  2. Network (less JavaScript to download for each page)

On the second issue, you might be better served by having all your JavaScript for the entire site in one big file.

  • The file can be downloaded once and then cached, rather than different files being downloaded for each page
  • One big file might compress better via minification and gzipping

However, if a lot of users only hit one page of your web application (which uses a specific subset of JavaScript) and don’t hit the other pages, your configuration idea might work well.

As far as processing speed goes, I’d imagine smaller files would process faster than larger ones. You’ll want to measure this, and see how much benefit you get. You might be able to get much the same results by making each plugin responsible for checking if it’s needed on a given page (e.g. by checking for the presence of specific HTML on the page), and only running if so.
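The self-check idea in the last sentence might look like the following. This is a sketch, not a prescription; the `.js-carousel` selector and the init logic are invented for illustration, and the function takes the document as a parameter only so the guard is easy to exercise outside a browser.

```javascript
// carousel-plugin-sketch.js — a plugin that guards its own start-up:
// it does real work only if the page contains the markup it targets,
// so shipping it in a shared bundle costs almost nothing elsewhere.
function maybeInitCarousel(doc) {
  // Cheap presence check: no matching markup means the plugin
  // is not needed on this page, so bail out immediately.
  var nodes = doc.querySelectorAll('.js-carousel');
  for (var i = 0; i < nodes.length; i++) {
    nodes[i].initialised = true; // stand-in for the real set-up work
  }
  return nodes.length; // 0 means the plugin did nothing on this page
}

// In the browser this would be wired up once the DOM is ready:
// document.addEventListener('DOMContentLoaded', function () {
//   maybeInitCarousel(document);
// });
```

The trade-off is that every page still downloads and parses the plugin; only the execution cost is skipped, which is why you'd want to measure both before choosing this over per-page bundles.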

Paul D. Waite
Based on the number of separate views and potential workflows a user might have, compressing all content into one file is not really helpful in my case. I risk making users wait too long for content they'll never use in their browsing session.
Razvan Caliman
Gotcha. Although, you say “might have” and “risk” — as you’re working on an existing site, you can figure these things out. How big would that file be? What connection speeds do your users have? Nothing like laying out the details in front of you to help make a decision.
Paul D. Waite
A: 

I basically combine it all at build time to

  • js3rdparty.js
  • js3rdpartyplugins.js
  • configurations.js
  • bootstrapper.js

js3rdparty.js contains, say, jQuery and ExtJS. This is very unlikely to change much over time and has considerable size. js3rdpartyplugins.js holds all the plugins and extensions I plucked off the internet or wrote myself; this file is considerably smaller and somewhat more prone to change. I load all my configurations for all my pages at once in configurations.js. In my application this comes down to loading grid and form field configurations, but once packed and minified this file is very small. bootstrapper.js is basically a combined script of all the site's behaviour (event listeners, creating controls, animations, etcetera).

The main reason for me to split it up like this is to let the browser cache the big files that are least likely to change. When I roll out new behaviour or change some configuration, those changes won't invalidate the cache for the bigger files.

Martijn Laarman
A: 

Can you try this? It works just fine with plugins, threads, etc.

Mike Dunlavey