views: 544

answers: 9

So... my idea is to load a full manga/comic at once, with a progress bar included, making a sort of stream, like:

  • My page loads the basics (HTML+CSS+JS) (of course)
  • Once that's done, I start loading the images (the URLs are stored in a JS variable) from my server, one at a time (or some faster way) so I can show a sort of progress bar (a rough sketch of this follows the list).
  • ALTERNATIVE: Is there a way to load a compressed file with all the images and decompress it in the browser?
  • ALTERNATIVE: I was also thinking of saving them as strings and then decoding them; they are mostly .jpg
  • The images don't have to show right away; I just need the callback when they are done.
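
Something like this rough sketch is what I have in mind for the second bullet; the paths and the progress handling are just placeholders:

function preloadSequentially(urls, onProgress, onComplete) {
  var index = 0;

  (function next() {
    if (index >= urls.length) { onComplete(); return; }
    var img = new Image();
    img.onload = img.onerror = function() {
      index++;
      onProgress(index, urls.length); // e.g. update a progress bar here
      next();                         // then start the following image
    };
    img.src = urls[index];
  })();
}

// hypothetical usage with placeholder page URLs
preloadSequentially(
  ["/pages/01.jpg", "/pages/02.jpg", "/pages/03.jpg"],
  function(done, total) { console.log(done + "/" + total + " loaded"); },
  function() { console.log("all pages cached, ready to read"); }
);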

XHTML and HTML5 are acceptable.

What is the fastest way to load a series of images for my website?

EDIT Since @Oded's comment: the question is really what the best technique is for loading images so the user doesn't have to wait every time they turn the 'page'. I'm targeting an experience closer to reading comics in real life.

EDIT2 As some people helped me realize, I'm looking for a pre-loader on steroids

EDIT3 No CSS techniques will do.

+4  A: 

ALTERNATIVE: Is there a way to load a compressed file with all the images and decompress it in the browser?

Image formats are already compressed. You would gain nothing by stitching and trying to further compress them.

You can just stick the images together and use background-position to display different parts of them: this is called ‘spriting’. But spriting's mostly useful for smaller images, to cut down the number of HTTP requests to the server and somewhat reduce latency; for larger images like manga pages the benefit is not so large, possibly outweighed by the need to fetch one giant image all at once even if the user is only going to read the first few pages.

ALTERNATIVE: I was also thinking of saving them as strings and then decoding them

What would that achieve? Transferring as a string would, in most cases, be considerably slower than raw binary. Then, to get them from JavaScript strings into images, you'd have to use data: URLs, which don't work in IE6/IE7 and are limited in how much data you can put in them. Again, this is meant primarily for small images.
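
For completeness, this is roughly what the string route would look like, assuming the server sent the JPEG bytes already base64-encoded (the variable contents here are a placeholder, not a real image):

var base64Jpeg = "/9j/4AAQSkZJRg...";             // placeholder base64 JPEG data
var img = new Image();
img.src = "data:image/jpeg;base64," + base64Jpeg; // browser decodes the string back into an image
document.body.appendChild(img);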

I think all you really want is a bog-standard image preloader.

bobince
Yep, agreed: for small images it is a good technique, but not in this case, since it would add a lot of overhead for no benefit, or even crash the browser, since the stitched image would be gigantic.
Fabiano PS
I was looking for a preloader on steroids, maybe an SWF? My only concern is that I don't use a CDN, but I also don't want to hit the image load limit cap.
Fabiano PS
+1  A: 

Once that's done, I start loading the images (the URLs are stored in a JS variable) from my server, one at a time (or some faster way) so I can show a sort of progress bar.

Your browser already downloads the HTML first; that's how it knows to load any JS/images you reference. You are trying to invent something that already exists.

Just make sure your manga is made up of lots of images of a known size, which you specify in your img tags. Most browsers have some sort of progress bar to show that it's loading resources for you. You're not going to make loading large images faster unless you improve either the speed at which your server serves them, or your user's internet connection, or you compress them to make your image files smaller (likely at the cost of image quality).
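
For instance, a tiny sketch (the dimensions, path, and container id are assumptions):

// Give the browser the page dimensions up front, the same effect as
// width/height attributes on an <img> tag, so the layout doesn't jump
// while the large files download.
var page = new Image(800, 1200);                     // assumed page size in pixels
page.src = "/manga/chapter-01/page-01.jpg";          // hypothetical path
document.getElementById("reader").appendChild(page);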

Dominic Rodger
OK, fair answer, thanks.
Fabiano PS
+2  A: 

You could preload the images in javascript using:

var x = new Image();
x.src = "someurl";

This achieves the same goal as what you described as "saving the images as strings": the image is fetched and cached, ready to be displayed instantly.
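
If you also need to know when each image has finished (the callback mentioned in the question), a small sketch using the same Image object:

var x = new Image();
x.onload = function() {
  // the image is now in the browser cache; any <img> using the same URL
  // will display instantly
  console.log("loaded: " + x.src);
};
x.src = "someurl"; // set src after attaching the handler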

Engwan
+1  A: 
z5h
+1  A: 

Here's something you can try, which by happenstance I just coded up:

(function() {
  // List every image you want preloaded, in the order they should be fetched.
  var imgs = [ "image1.png", "image2.png", ... /* all your image names */ ],
      index = 0,
      img;

  (function() {
    // Stop once every image has been requested.
    if (index >= imgs.length) return;
    // When this image finishes, run this same function again for the next one.
    (img = new Image()).onload = arguments.callee;
    // The setTimeout keeps the next request from starting inside an onload handler.
    setTimeout(function() { img.src = "/path/to/images/" + imgs[index++]; }, 1);
  })();
})();

Plop all your image names (or the ones you want to preload) into the array, and make sure this script starts up when your page(s) start loading. It'll work its way through the list of images, loading them, and then moving on to the next one when each image finishes. (The setTimeout call is to make sure that the "onload" handler doesn't get called while you're still inside a handler.)

You'd probably want to do this for lots of the "nuts and bolts" images for your whole site - in other words, each page would try to load images for everything. Once they're in the cache, of course, this won't take a significant amount of time. Alternatively, you could run this script only on a couple pages, like "login" screens and the main "home" page. Of course, if you've got a site like Flickr, then you probably wouldn't want to preload all your images :-)

Pointy
+1  A: 

Image preloaders have been around for ages. You really do not need to load them all at once; you can do it on demand (when the person loads the next page, you can fetch the image after it).

epascarello
This is what I'd go for, with a slight twist: load pages 1 and 2 to start with; when the person turns to page 2, start loading page 3 in the background. That way you don't have to load the entire comic up front, but each page turn will be quick, as the next page is already loaded by the time the reader reaches the end of the current page.
Paolo
+2  A: 

Spriting

Just have a look how facebook does it: http://b.static.ak.fbcdn.net/rsrc.php/z3JQK/hash/11cngjg0.png

One image loads faster than a series of small images. To display an icon, you simply create a div with fixed dimensions and move the background inside it. Your div works as a viewport onto the big image: you use background-position to shift to the appropriate part of the image, and everything else is hidden.
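
A minimal sketch of that viewport idea (the sprite path and offsets here are made up):

var icon = document.createElement("div");
icon.style.width = "16px";                             // viewport size = icon size
icon.style.height = "16px";
icon.style.backgroundImage = "url(/img/sprites.png)";  // one big sprite sheet
icon.style.backgroundPosition = "-32px -48px";         // shift the sheet so only the wanted icon shows
document.body.appendChild(icon);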

Different domains

Something you probably didn't know: Internet Explorer has a limit on connections per server. You can read about it here (with the exact numbers): http://support.microsoft.com/?scid=kb;en-us;183110&x=17&y=11

What it means: if a user is on IE7, they will be able to load only 4 (or 2) files at the same time from your server, regardless of their internet connection speed.

To speed things up, you could create a few subdomains: server1.mydomain.com, server2.mydomain.com, server3.mydomain.com, etc. Then the user can download many files a lot quicker, because you use different hosts to serve different files.
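
A small sketch of how that could look on the client side (the hostnames are placeholders, and each one must serve the same files):

function shardedUrl(path, index) {
  var hosts = ["server1.mydomain.com", "server2.mydomain.com", "server3.mydomain.com"];
  // Spread consecutive requests across the hosts so no single host
  // hits the per-server connection limit.
  return "http://" + hosts[index % hosts.length] + path;
}

// page 0 comes from server1, page 1 from server2, page 2 from server3, ...
var url = shardedUrl("/pages/03.jpg", 2);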

rochal
Spriting: CSS won't do for this. Yes, I am aware of that IE cap; my ALTERNATIVE about strings and compression had that in mind.
Fabiano PS
+1  A: 

If you split large images into smaller parts, they'll load faster on modern browsers due to pipelining.
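
Roughly, assuming each page has been cut server-side into horizontal strips (the part names and container id are made up):

var container = document.getElementById("page");  // hypothetical wrapper element
for (var i = 0; i < 4; i++) {
  var strip = new Image();
  strip.src = "/pages/01_part" + i + ".jpg";      // the four parts download in parallel
  strip.style.display = "block";                  // stack the strips with no gaps
  container.appendChild(strip);
}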


Chris Dennett
The example here supposes that 3 pipelined images will be faster than 3 sequentially loaded images. That is true. But to suppose that 3 pipelined images will be faster than 1 image is, at best, an assumption. In fact, multiple 3-way handshakes over a long-distance (high-latency) link might cause the pipelined approach to take longer.
PP
Very good point, now we're talking!
Fabiano PS
Correct me if I'm wrong, but splitting a single image for pipelining would only add benefit if he were downloading a single large image? As is, he's already transferring several files so pipelining will occur regardless.
meagar
I think he found a solution via raw throughput, getting the manga images themselves to the user faster :) The problem with this approach, however, is that if users want to save the files, they'll be in multiple pieces and a pain to download. Anyway, what I'd suggest is combining this with JavaScript (see http://elouai.com/javascript-preload-images.php) and loading in the images for the next/forward pages, getting the composite image URLs through your server-side comic script while generating the HTML.
Chris Dennett
A: 
  • My page loads the basics (HTML+CSS+JS) (of course)
  • Once that's done, I start loading the images (the URLs are stored in a JS variable) from my server, one at a time (or some faster way) so I can show a sort of progress bar.
  • The images don't have to show right away; I just need the callback when they are done.

If you want to load 10 images as fast as possible, place 10 <img> tags on the page, one for each image. Use JavaScript to hide all but the currently viewed image; add next/back links that use JS to hide the current image and show the next one. Many browsers already have some form of progress bar, and by doing things with regular old HTML, it will function correctly.

You're trying to re-invent all this functionality with Javascript for no good reason. You're not going to do it better than the browser.

All that said, this is probably a bad idea. You might dump 15MB of comic pages into the browser window only to have the user leave after reading the first page. Rather than trying to pre-load all images, you should use JS to always keep the next page (or two) pre-loaded, not the entire thing.
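
A minimal sketch of that "keep the next page preloaded" approach (URLs and element ids are placeholders):

var pages = ["/pages/01.jpg", "/pages/02.jpg", "/pages/03.jpg"];
var current = 0;
var viewer = document.getElementById("viewer");  // the single visible <img>

function showPage(n) {
  viewer.src = pages[n];
  // Warm the cache for the page the reader is about to turn to.
  if (n + 1 < pages.length) {
    new Image().src = pages[n + 1];
  }
}

function nextPage() { if (current + 1 < pages.length) showPage(++current); }
function prevPage() { if (current > 0) showPage(--current); }

showPage(current);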

meagar
Fabiano PS
I'm saying that your way will add complexity for no gain. You cannot use JavaScript to make already compressed data travel down the pipes any faster.
meagar
I want to add complexity, yes, but to add gain; otherwise it's pointless. See @Chris Dennett's answer, that is the kind of stuff I'm looking for; now I need to make it feasible.
Fabiano PS