views: 217

answers: 3
I have a product website. On one page I show thumbnails and a brief description of all the products. When you click on a photo, you get to a detailed product page.

Is there a way to get the browser to start loading and caching the JavaScript and CSS for the "detailed product" page while the user is still looking at the "all the products" page and trying to make a choice?

I want this preloading and caching to start only once the page has fully loaded, so as not to slow it down.

Any suggestions on how to implement this?

+3  A: 

If you're using a JavaScript framework (like jQuery, Prototype, etc.) then you can use a simple method to do an AJAX call. If not, you'll have to write one yourself, which might be a bit confusing for someone who isn't familiar with JavaScript. A basic example is here.
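
If you do end up hand-rolling it, a minimal sketch might look like this (the prefetch name and url parameter are just for illustration, not the code from the linked example):

    //quietly request a file so it ends up in the browser cache;
    //the response itself is ignored
    function prefetch(url) {
        var xhr;
        if (window.XMLHttpRequest) {
            xhr = new XMLHttpRequest();
        } else {
            //older versions of IE
            xhr = new ActiveXObject("Microsoft.XMLHTTP");
        }
        xhr.open("GET", url, true); //asynchronous, so the page stays responsive
        xhr.send(null);
    }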

You can use JavaScript to add script tags to your HTML page and the browser will fetch and include the JS. Remember that if the JS auto-executes any code, that code will run as soon as it loads. For CSS, your only option is probably using JavaScript to send a request to grab the file (see above). You could include the CSS directly in the page, but it would override styles from your original CSS file.

Websites that precache:
Websites as big as Google and Yahoo use precaching to help performance. Google, for instance, loads a CSS sprite http://www.google.com/images/nav%5Flogo7.png on their main page, along with other CSS and JS files that are not fully used on the main page alone. Most people already do something similar just by combining their CSS and JS files into one file in production. HTTP requests take more time than downloading the actual content. An example of Yahoo precaching is here.

Yahoo talks about this in YSlow's help here.

Taken from one part of the guidelines here:
80% of the end-user response time is spent on the front-end. Most of this time is tied up in downloading all the components in the page: images, stylesheets, scripts, Flash, etc. Reducing the number of components in turn reduces the number of HTTP requests required to render the page. This is the key to faster pages.

Organization in development, speed in production:
What I usually try to do is split up my JS files in development if needed (hardly ever my CSS, though). When it's time to push to the production servers, I run a "compiler" (a simple script that combines all the files and minifies them) and then put the result online.
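
As a rough sketch of what that "compiler" step could be (the file names and the Node.js approach here are assumptions for illustration, not my actual script); run the output through your minifier of choice afterwards:

    //build.js - naive concatenation step, run before deploying
    var fs = require("fs");

    var sources = ["jquery.plugins.js", "site.js", "product-page.js"]; //made-up names
    var combined = sources.map(function (f) {
        return fs.readFileSync(f, "utf8");
    }).join(";\n"); //the semicolon guards against files missing a trailing ;

    fs.writeFileSync("all.js", combined); //one bundle, one HTTP request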

Minifying/compressing:
Remember, HTTP requests are evil. A compressed JavaScript file and a compressed CSS file are so small that I'm almost 100% sure there is an image on your main page that is larger than both of them. Therefore it's pointless to worry about splitting them up per page; it's actually more of a performance hit to split them up across multiple pages.

CSS Sprites
The point of CSS sprites is that a website probably has 40+ images on a page referenced from CSS. That's 40+ HTTP requests on a user's page load, which is a LOT of requests. Not only is that bad for the user, it's also a lot of requests your web server has to handle. If you aren't using a static content server and are just using the Apache instance on your main host, your poor Apache server is getting loaded with requests it could otherwise spend serving your web application. You can reduce this by combining your images into one file, or at least into fewer files. Using CSS's background-position property, you can do wonders.
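
A typical sprite rule set looks something like this (the class names, file name, and pixel offsets are made up for illustration):

    /* one combined image holds all the small graphics, so the whole
       set costs a single HTTP request */
    .icon        { background-image: url(icons.png); width: 16px; height: 16px; }
    .icon-cart   { background-position: 0 0; }
    .icon-search { background-position: -16px 0; }
    .icon-star   { background-position: -32px 0; }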

I highly recommend reading the YSlow guidelines by Yahoo here: http://developer.yahoo.com/yslow/help/#guidelines

William
I don't see where Google is doing that. Could you point out the code fragment that starts loading CSS and images?
Eric J.
Here is one example; it's just a CSS sprite but it's still the same concept. http://www.google.com/images/nav_logo7.png
William
I just viewed the source of google.com and don't see this practice in action. Then again, I'm not a css/javascript expert. Could you point out what part of google.com is doing this?
Eric J.
Are you talking about the image that I posted? Or about where they're using JavaScript to cache CSS/JS? They don't; like most websites they usually combine most of their files into one. HTTP requests are worse than the overall size of the JS/CSS. I can't imagine why most websites would split up their CSS/JS the way he's doing.
William
http://ajaxian.com/archives/yahoo-search-contextual-precaching
William
William's link shows that Yahoo pre-caches the results for the next page *after* the user has shown likely intent to visit that next page, not just in case.
Eric J.
Yes, you're right, but it's the same thing in his case. People don't go to a store if they don't plan on looking at at least ONE product. I'm sure there is a VERY high percentage of users that click at least one item vs. none; it could be something like 80%. It's up to him to decide if he wants to use it. Plus, he only wants to cache CSS/JS, and he should probably just combine those files anyway.
William
While I understand the ethical concern some people are having, I would rather focus on the technical point of view. As William suggested, my statistics show that a very high percentage of visitors will in fact look at at least one product's details. The detailed pages are a little heavy, often with 24 photos, a map, sometimes a video, and lots of JS. I'm using sprites for the site graphics, and gzip and minification for the CSS and JS. I'm in the process of trying to optimize everything I can, and I feel like every little bit can help.
Enkay
The site uses jQuery. How would you make an AJAX call that runs after everything is done loading to preload a CSS file? I don't want the CSS file to be applied to the current page, as I'm pretty sure it would break the design right now.
Enkay
@Enkay it should be as simple as using jQuery's get(). The browser should cache the file based on the headers your server sends out. Remember that a browser will still make an HTTP request to see if the content has changed; if you want to avoid that, set the Expires header on the file so the browser doesn't re-check. (There's an example in the YSlow guidelines.)
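
Something along these lines should work (a sketch; the paths are placeholders, and dataType "text" keeps jQuery from executing the fetched script on the current page):

    //once the current page (including images) has finished loading,
    //quietly request the detail page's files so the browser caches them
    $(window).load(function () {
        $.ajax({ url: "/css/product-detail.css", dataType: "text", cache: true });
        $.ajax({ url: "/js/product-detail.js", dataType: "text", cache: true });
    });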
William
+1  A: 

Use setTimeout in the load event of the page, with a timeout of a few seconds; after that, insert a script tag and a stylesheet link tag into the page (the ones from the next page).

Something like this (where url is the URL of the thing you want to cache):

    //cache a script by appending a script tag to <head>
    var scriptTag = document.createElement("script");
    scriptTag.setAttribute("type", "text/javascript");
    scriptTag.setAttribute("src", url);
    document.getElementsByTagName("head")[0].appendChild(scriptTag);

    //cache an image
    var img = new Image();
    img.src = url;

    //cache a css file: a <style> element has no src attribute, so use a
    //<link> element instead (note the stylesheet will also be applied to
    //the current page once it loads)
    var cssLink = document.createElement("link");
    cssLink.setAttribute("rel", "stylesheet");
    cssLink.setAttribute("type", "text/css");
    cssLink.setAttribute("href", url);
    document.getElementsByTagName("head")[0].appendChild(cssLink);
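
One way to wire this up, as described above (the function name, the empty wrapper, and the 3-second delay are placeholders, not part of the original snippet):

    function preloadDetailPageAssets() {
        //the snippet above goes here, with url replaced by the real
        //URLs of the next page's JS and CSS files
    }

    //wait for the page's load event, then give other code a few seconds
    //before starting the preloading
    window.onload = function () {
        setTimeout(preloadDetailPageAssets, 3000);
    };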
mkoryak
Instead of using a setTimeout, you could just listen for the event that fires when the page has finished loading all of its resources, and call this code from there.
William
I am setting the timeout because I assume that some other code may want to run before this code is called. If he has control over what code runs when, then he can put this code at the end.
mkoryak
Thanks, I will be giving this a try.
Enkay
+2  A: 

Theoretically you can start accessing resources from subsequent pages so that they are later available in the cache.

However, this is not good practice, especially if you are loading resources for every detail page they might select. In doing so, you assume that you, not the user, should determine how their bandwidth is used. If they are browsing multiple things at the same time, or doing other things with their bandwidth besides viewing your website, you are using that bandwidth in a way they do not intend.

If their connection is slow enough that the load time for your detail pages needs to be optimized, chances are their connection is slow enough that they will feel the loss if they are doing other things at the same time.

Eric J.
Almost every major website has been using this technique for a while. For instance, on one of the high-traffic websites I run, we have every single main image on the site in one image file. This image file is only 9kb, but that's because of how carefully it was created. This is known as a CSS sprite, and almost every major website uses them (Yahoo, Google, etc.). This allows you to cache images and reduce HTTP requests. Google starts caching pages on the search page the second you go to their main website. Honestly, most people combine their CS file into one file so it's not much of a difference.
William
Constructing the website so that parts can be cached and reused is good practice. Downloading resources to the cache that may possibly be used if the user decides to go further into the website is bad practice. Whether or not some major sites pre-load content that I may or may not need doesn't make it a good (ethical) practice.
Eric J.
I'm not saying that he should cache the entire website all at once. Optimization is key when building a website, especially when building a site that gets traffic. Each HTTP request costs processing power, so limiting HTTP requests not only helps the website, it also improves performance for users. CSS/JS is so small! A single image is probably bigger than your compressed CSS/JS files.
William
How does creating an HTTP request to download the CSS/JS for a possible next page reduce the overall amount of HTTP requests? The way you outline this, you certainly create an HTTP request to load the CSS/JS resources asynchronously. If you don't pre-cache those resources, you *might* end up making the same HTTP request the first time you really need that resource.
Eric J.
@William: CSS Sprites are a good thing. They reduce the overall amount of HTTP requests and help everyone. That's not the same thing, though, as pre-fetching resources for the next page that I don't need on the current page. Your link to ajaxian.com spells out that Yahoo fetches sprites *after* the user starts to type in a search term.
Eric J.
I was pointing out CSS sprites and how most people combine their CSS and JS files into one. (I said CS, not CSS file, my mistake.) Obviously, for what he is asking, it doesn't reduce the HTTP requests.
William
I already use sprites as much as possible, plus gzip and compression, and I'm working on limiting the HTTP requests. As I mentioned in a comment on the other reply, the detailed product page is a little heavy, so I'm trying to see if I can use the 20 seconds visitors usually take to pick a product to start preloading content from the next page.
Enkay