views: 114

answers: 4

Hi,

I have a pretty big web site (ASP.NET) that uses JavaScript intensively (jQuery, custom JavaScript, etc.) as well as a theme and CSS. By now I have a huge number of JS and CSS files in the system, and I am thinking about minifying and combining them to improve performance. Please advise me on suitable tools and technologies, and suggest best practices for this scenario.

Thanks,
Thurein

+1  A: 

If you're using jQuery, you should already be using the minified version on your production site.
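
The only change is which file you reference; the version number below is just an example:

    <!-- development copy -->
    <script type="text/javascript" src="Scripts/jquery-1.4.2.js"></script>

    <!-- production: the same library with comments and whitespace stripped -->
    <script type="text/javascript" src="Scripts/jquery-1.4.2.min.js"></script>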

For your own JavaScript, there's Douglas Crockford's original minifier (JSMin), or you could use the Microsoft Ajax Minifier, which you can add as an MSBuild task - there's an overview here.
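
For illustration, a minimal MSBuild wiring might look something like this (the import path, item names and patterns are assumptions based on a default Ajax Minifier install - check the overview for your version):

    <Import Project="$(MSBuildExtensionsPath)\Microsoft\MicrosoftAjax\ajaxmin.tasks" />
    <Target Name="AfterBuild">
      <ItemGroup>
        <!-- every script and stylesheet that isn't already minified -->
        <JS Include="**\*.js" Exclude="**\*.min.js" />
        <CSS Include="**\*.css" Exclude="**\*.min.css" />
      </ItemGroup>
      <!-- writes foo.min.js next to foo.js and foo.min.css next to foo.css -->
      <AjaxMin JsSourceFiles="@(JS)" JsSourceExtensionPattern="\.js$" JsTargetExtension=".min.js"
               CssSourceFiles="@(CSS)" CssSourceExtensionPattern="\.css$" CssTargetExtension=".min.css" />
    </Target>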

There are a number of CSS minifiers referenced in the answers to this SO question.

PhilPursglove
+2  A: 

Hi Thurein

There are a couple of things you could try to improve performance.

I've listed the techniques we have used at our organisation to improve the performance of our web applications. With the exception of GZIP, the other two probably won't have a direct effect on the JS and CSS files, but they will save you valuable loading time in other areas, making your web site much faster.
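
For example, on IIS 7 GZIP can be switched on from web.config (a sketch; on IIS 6 compression is configured through the IIS manager/metabase instead):

    <system.webServer>
      <!-- compress static files (js, css) and dynamic responses -->
      <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    </system.webServer>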

We always see huge increases in speed and performance when the retail attribute is set to 'true': all forms of debugging are disabled in this mode, which makes the site much more responsive.

Obviously you only want this setting on your live server.
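
For reference, the retail switch lives in machine.config on the server (it is not valid in an individual site's web.config):

    <configuration>
      <system.web>
        <!-- machine-wide: forces debug off and disables tracing and detailed errors -->
        <deployment retail="true" />
      </system.web>
    </configuration>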

Dal
Thanks for all the inputs; at the same time I am looking at the Yahoo compressor.
Thurein
Regarding retail: if you are on a version earlier than 2.0 you can set debug=false instead, as that is likely the biggest performance impact of the retail setting.
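
(That's the debug attribute on the compilation element in web.config:)

    <system.web>
      <!-- never leave this as "true" on a production box -->
      <compilation debug="false" />
    </system.web>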
Flory
+2  A: 

Steve Souders at Google has some very informative talks on YouTube.

What's interesting is that file size isn't always the problem. Have a watch of his videos; it's well worth it.

v01d
+1  A: 

Best practices are great but, at least in the case of performance, the question is more about the right practice. The only way to know is to:

  • Measure
  • Try
  • Measure Again
  • Compare/Contrast

It is almost always a trade-off when optimizing software. You have to consider how a change affects readability, scalability, security, deployment complexity, and probably a lot of other -ities I am not thinking of.

Having said all that, the practices you are looking for are pretty much here.

To make my point, though, consider the rule Use a CDN. This seems like an obvious choice for something universal like jQuery; both Microsoft and Google host it. You get the potential advantage of parallel downloads (depending upon the client browser) and the added possibility of the file already being in the client's cache (from other sites they may have visited). Plus, they are likely serving it from somewhere closer to the user than you are.

But the trade-off to consider is that it is going to eat some bandwidth, so if you are building something that sits on an intranet, don't do this. You also introduce the possibility of failure from an external source, although you can work around that by adding a little more code.
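
A common form of that workaround (the local path and version number are examples) is to test for the library after the CDN script tag and fall back to a local copy if it never arrived:

    <script type="text/javascript"
            src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script type="text/javascript">
      // if the CDN request failed, window.jQuery is still undefined
      window.jQuery || document.write('<script src="/Scripts/jquery-1.4.2.min.js"><\/script>');
    </script>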

Perhaps most importantly, though, using a CDN is in direct tension with the first rule, reduce HTTP requests. If you can combine and minify all your JS and load it more efficiently locally, then you should do so (a minimal sketch of that step follows the list below). How do you know which wins?
You have to:

  • Measure
  • Try
  • Measure Again
  • Compare/Contrast
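
As a rough sketch of that combine-and-minify step, assuming the YUI Compressor jar that Thurein mentions elsewhere on this page (file names are illustrative):

    rem concatenate the scripts, then minify the result
    copy /b a.js + b.js + c.js combined.js
    java -jar yuicompressor-2.4.2.jar --type js -o combined.min.js combined.js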

Have fun!

Flory
Thanks for the great advice; I am taking the same approach, using the Yahoo Compressor and GZIP for optimization, and YSlow and other profiling tools to measure and compare the results.
Thurein