views: 82
answers: 2

I'm looking at the product Aptimize Website Accelerator, which is an ISAPI filter that will concatenate files, minify CSS/JavaScript, and more. Does anyone have experience with this product, or any other "all-in-one" solutions?

I'm interested in knowing whether something like this would be good long-term, or whether manually setting up all the components (integrating YUI Compressor into the build process, setting up gzip compression, tweaking expiration headers, etc.) would be more beneficial.

An all-in-one solution like this looks very tempting, as it could save a lot of time if our website is "less than optimal". But how efficient are these products? Would setting up the components manually generate better results? Or would the gap between the all-in-one solution and manually setting up the components be so small that it's negligible?
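(For reference, the "manual" route mentioned above is mostly configuration on IIS7+. A sketch of the relevant web.config entries; the seven-day max-age is just an example value:)

```xml
<!-- Sketch of the manual setup: gzip + expiration headers (IIS7+, example values) -->
<system.webServer>
  <!-- enable gzip for static and dynamic responses -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <!-- far-future expiration headers for static content -->
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```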

+2  A: 

To get started, I think you will find an all-in-one solution pretty good. But as time goes by, if your site has any degree of complexity to it, I think you'll find that you want more control.

For instance, I use certain JS files that can't be minified because of the format of the JavaScript within them (note these are 3rd-party files that I don't want to change). Hence having control over what gets 'handled' is extremely important.

Another case in point: we tend to combine our JS files using a script manager that produces .ashx pages. As far as I know, not all 'all-in-one solutions' cater for this. Also, the script manager already alters the headers in accordance with the cache settings I configure for it.

Another example is compression: if you have a lot of traffic, then depending on the type and size of the content being served, you may want to use different levels of compression. I think I remember a post by Jeff Atwood about what they went through for SO and the fact that they needed a fair amount of control over the compression.
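That kind of compression-level control lives in IIS7's applicationHost.config rather than web.config; a sketch with example values (higher levels cost more CPU, so busy sites often keep the dynamic level low):

```xml
<!-- applicationHost.config (IIS7): per-scheme compression levels, example values.
     Also backs off dynamic compression when CPU usage crosses a threshold. -->
<httpCompression dynamicCompressionDisableCpuUsage="90">
  <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll"
          staticCompressionLevel="9" dynamicCompressionLevel="4" />
</httpCompression>
```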

Another example is caching. Particularly if you want to get into donut or donut-hole caching, this has nothing to do with IIS and is all about how the app can cache different parts of the page: it could cache the master page but not the dynamic content of the page itself. AFAIK no 'all-in-one solutions' cater for this.
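In Web Forms, that donut pattern is usually done with output caching plus post-cache substitution; a minimal sketch (the method name is illustrative, and it must be a static method taking an HttpContext and returning a string):

```aspx
<%@ OutputCache Duration="60" VaryByParam="none" %>
<!-- the whole page is cached for 60 seconds, except this "hole",
     which is re-evaluated on every request: -->
<asp:Substitution ID="UserGreeting" runat="server" MethodName="GetGreeting" />
```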

In summary, I tend to find I need more control over what is happening, and I prefer any minification to occur during the build process (so you know exactly what you are getting whenever you run your app). Compression happens at the IIS level, but I like being able to control it at an app level. Any caching or compression I simply control via the web.config, which allows for a lot of control. So I guess: go all-in-one if you have something simple, but if you need more control you will need to do it in the app anyway.

Hope that helps.

UPDATE:

Just had a thought: setting all of these things up at an app level is really easy if you're using ASP.NET MVC, but it can be a lot harder to manipulate headers etc. in classic ASP.NET. Hence, in some cases, if you are dealing with an older app using classic ASP.NET, it may be easier to use an all-in-one solution.

anthonyv
A: 

Hello, please consider Helicon Ape (http://www.helicontech.com/ape). As an all-in-one solution it includes a bunch of swell features like URL rewriting, SEO optimization, proxying, GZIP compression, disk/memory caching, website/server protection, etc. You can easily give it a try, because Ape has a free, no-features-cut license for up to 3 websites.

Slava
Skimming over the page, it doesn't look like this is what I'm looking for. I didn't see anything about automatic minification of CSS/JS files or combining files to reduce requests.
nivlam
Yes, Ape doesn't have such features yet. However, it does have the content compression and caching features you mentioned.
Slava