views: 278
answers: 3

I'm a newbie creating a lightweight photo showcase site written on the CakePHP framework with RoR. I plan to use effects from the script.aculo.us library, as well as jQuery, for photo display and transition effects.

As the site will be very photo-rich, what programming steps can I take to ensure that all photos and other pages load quickly?

A: 

Reduce the number of libraries: are you sure you want to use both jQuery and script.aculo.us? Stick to simple things; don't reach for complex animations.

Fast loading => caching. Pages with photos are good candidates for caching.
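If you stay with CakePHP, it ships with view caching you can switch on with very little code. A minimal sketch, assuming CakePHP 1.2-era full-page view caching (check the docs for your version; the controller name is just an example):

    <?php
    // app/config/core.php - tell CakePHP to look for cached views
    Configure::write('Cache.check', true);

    // app/controllers/photos_controller.php (controller name is an example)
    class PhotosController extends AppController {
        var $helpers = array('Cache');

        // cache the rendered output of this controller's actions for an hour,
        // so repeat hits skip the controller and database entirely
        var $cacheAction = '1 hour';
    }
    ?>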

If what worries you is the user's perception of speed, you might want to preload images in the background in anticipation of user actions, but keep in mind that this will increase your server load. Do this only if you have enough bandwidth contracted.

If you are able to produce a thumbnails front page which is quite static, say it changes twice a day, you can use the sprite technique to reduce the latency of loading many thumbs; see:

http://websitetips.com/articles/css/sprites/

Miquel
Thank you Miquel, I really appreciate your expertise :-)
argruge e
+2  A: 

The question is pretty vague. Most of the time spent loading a page usually goes to fetching static content. Here are some rules of thumb for speeding up load times, independent of language or framework:

  1. Install the YSlow plug-in for Firefox
  2. Use CSS sprites
  3. Use a lightweight HTTP server such as nginx or lighttpd for static content
  4. Serve static content from a different domain or subdomain; this allows more simultaneous HTTP requests
  5. Minify JavaScript and CSS (see the sketch below)
  6. Cache pages as much as you can
  7. Keep the number of HTTP requests low
  8. Run pngcrush or jpegtran on your images

Naturally, this is just the tip of the iceberg. These are good first steps.
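For points 5 and 7, the simplest version is to merge your stylesheets into one file ahead of time so the browser makes one request instead of many. A rough plain-PHP sketch (file names are placeholders, and the minification is deliberately naive; a real minifier does this better):

    <?php
    // Combine several CSS files into one and strip comments/whitespace.
    // The file names below are just placeholders.
    $files  = array('css/layout.css', 'css/gallery.css', 'css/lightbox.css');
    $output = '';

    foreach ($files as $file) {
        $output .= file_get_contents($file) . "\n";
    }

    // very naive minification: drop /* comments */ and collapse whitespace
    $output = preg_replace('!/\*.*?\*/!s', '', $output);
    $output = preg_replace('/\s+/', ' ', $output);

    file_put_contents('css/all.min.css', trim($output));
    // then reference only css/all.min.css in your layout
    ?>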

Buddy
Thank you very much
argruge e
+5  A: 

i think you're mixing things up a bit. ror mixed with php/cake?

so, about performance. it mostly depends on how many users you think you'll have, who those users are and what they do. 10 per hour or 100 per second? do they look at an image for a long time or are they rapidly hopping from page to page?

here are some tips that are not too technical. no server configuration optimizing, no memcached and so on. start thinking about performance with common sense - it's not the holy grail!

  • is your site/application too slow? most often, that's not the case. it never hurts to speed it up, but often, people care about performance too much. always remember: it's not about being fast, it's about being fast enough. nobody notices a few extra milliseconds. a speedup of 50% is noticeable if your page needs a second to load, but mostly irrelevant if it takes only 100ms.

  • to find out if your site is slow, benchmark it. there are a lot of ways to do this. one is automated, like ab (apache benchmark): it simulates lots of users connecting to your site and gives you a nice summary of how long it took to respond. the other is: use it yourself, and not over the local network! if it feels too slow, then do something.

  • a photo showcase heavily depends on the images. images are big. so make sure your server has enough bandwidth to deliver them fast.

  • if you scale the images (that's very probable), don't resize the image on every page request - cache the scaled image, and cache the thumbnails too. cache everything. delivering a static file is a lot cheaper than redoing the processing every time. (a resize-and-cache sketch follows this list.)

  • think about the quality of the image. is fast delivery more important than high image quality? play around with the image size - better compression means lower file size, lower quality and faster delivery.

  • think about usability. if there is no thumbnail page, people have to navigate sequentially through your library, looking at a lot of photos they don't want to see. if they can already see the thumbnails, they can jump straight to the images that matter (lowering bandwidth usage and requests per second). think about flickr: the size of the images shown ... they're like stamps - 500 pixels wide - and people are still happy. if they need a bigger version, they click the "all sizes" link anyway.

  • tricks, tricks, tricks: earlier, when users surfed with modems, a low resolution/high compression image was sometimes transferred first, so the user had something to look at after a short amount of time. only after that first image had loaded did the bigger version start. it's not common anymore, because today most users have broadband, so sending an additional image is just additional workload.

  • think about the audience. are they gonna visit your site with 14.4k modems or broadband? are they used to slow loading sites (photographers probably are)? check your statistics to find out about them.

  • your backend scripting language is most probably not your problem. php is not really fast, ruby is not really fast - compared to, say, c or java or ocaml. frameworks are slower than hand-crafted, optimized code. debug your code to see where the slow parts are. my guess? image resizing and database access. that won't change when switching to another language or optimizing your code.
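a minimal sketch of the resize-and-cache idea from above, using php's GD extension. the paths, the 500px width and the 75% jpeg quality are assumptions to tune (and it assumes cache/thumbs/ exists and is writable):

    <?php
    // resize once, then serve the cached copy on every later request
    function thumbnail($source, $width = 500, $quality = 75) {
        $cacheFile = 'cache/thumbs/' . md5($source . $width . $quality) . '.jpg';

        // already scaled once? just hand back the static file
        if (file_exists($cacheFile)) {
            return $cacheFile;
        }

        list($origWidth, $origHeight) = getimagesize($source);
        $height = (int) round($origHeight * ($width / $origWidth));

        $src = imagecreatefromjpeg($source);
        $dst = imagecreatetruecolor($width, $height);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $width, $height, $origWidth, $origHeight);

        // lower quality = smaller file = faster delivery (see the point on image quality)
        imagejpeg($dst, $cacheFile, $quality);
        imagedestroy($src);
        imagedestroy($dst);

        return $cacheFile;
    }
    ?>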

regarding the speed of websites

there are a lot of factors involved. some of them are:

  1. serverside processing: is your application fast, is your hardware fast?

  2. delivery: how fast are requests and files transferred between client and server? (depends on bandwidth)

  3. client side rendering: how fast is their browser, how much work has to be done?

  4. user habits: does the client even need speed? sometimes, slow pages are no problem, e.g. if users spend a long time there without clicking around. think about flash game sites: if you spend an hour playing a flash game, you probably won't even notice whether the page loads in 3 or 5 seconds.

the perceived speed - a mixture of all four - is the important metric.

if you've confirmed you are really too slow, be sure to optimize the right part. optimizing the server side scripts is useless if the server is fast enough, but the page takes ages to render on the browser. no need to optimize rendering time if your bandwidth is clogged.

regarding optimization

performance is an integral part when building an application. if you really need something fast, you have to plan for speed from the very beginning. if it's not designed for speed, effective optimization is often not possible.

that's not really true for web apps all the time, because they easily scale horizontally, meaning: throw hardware at it.
all things cost money, and if money is important to you (or your boss), don't forget about it. how much does two weeks of optimising an application cost? say, optimising costs you (or your boss) X € (i'm european) in salary. now, think about buying another server: that costs Y € (including setup). if Y < X, just buy the server and you'll be fine.

random buzzwords

last but not least i'll throw some random (unordered) buzzwords at you, maybe there is something that might help. just google, that should help ...

content delivery networks, (intel) SSDs, sprites (combining images to save requests), page compression (gzip, deflate), memcached, APC (bytecode cache for PHP), minifying and merging of multiple CSS and JS files, conscious handling of HTTP status codes (304 not modified), separation of static and dynamic content (different servers & domains), step-by-step loading via AJAX (important content first), ...
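two of those buzzwords - output compression and "304 not modified" handling - fit in a few lines of php. a rough sketch (the file used for the timestamp is just a placeholder; in a real app it would come from your data):

    <?php
    // gzip/deflate the output if the client accepts it
    ob_start('ob_gzhandler');

    // tell the browser when this resource last changed
    $lastModified = filemtime('photos/example.jpg');   // placeholder source
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

    // if the browser already has this version, send no body at all
    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
        strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }

    // ... render the page as usual ...
    ?>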

now i'm out of ideas.

edit/update

things/techniques i forgot:

  • implement a progress bar or something comparable, so users at least feel that something's going on. you can't build a real progress bar with javascript alone, but you can at least show some kind of animated hourglass or clock. if you use flash, you can show a real progress bar.

  • you can skip complete page reloads by working with AJAX or flash - just load the data you need. you often see this implemented in flash image galleries. just load the image and the description.

  • preloading: if users look at one image for an extended period of time, you can already start loading the next image, so it's browser-cached if the user continues.

disclaimer

i never implemented performance critical apps (with 2 exceptions), so most of what i've written above is speculation and the experience of others. today you read stories about successful startups and how they coped (performance-wise) with going from 100 to a bazillion users a day, and how they used nifty tricks to solve all those problems.
but is that going to happen to you? probably not. everyone talks about it, almost nobody really needs it (but i admit, it's still good to know).

my real world experience (yes, i like writing long answers):

once i did parts of a website with several thousand unique visitors a day, powered by a cms (typo3) and running on a single dedicated samp server (think used, decade-old solaris hardware, not ghz!). you could search for flats, and the form told you how many results you'd get (e.g. 20-40m²: 400 hits, 30-60m²: 600 hits) by reloading an iframe ON-CLICK. it was very, very slow (but users still used it). constantly 100% load. it was my job to solve that problem.
what did i do? first, find out why it was so slow. my first guess was right: the on-click request also went through typo3 (w/o caching, of course). by replacing this single action with a custom script that just queried the database directly, bypassing typo3, the problem was solved. load went down to almost nothing. took me about 2 hours.
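roughly the shape of such a bypass script (not the original; table and column names are made up for illustration): a tiny standalone php file that answers the on-click count request with one direct query instead of a full cms request.

    <?php
    // standalone count endpoint, no cms involved
    $db = new PDO('mysql:host=localhost;dbname=flats', 'user', 'password');

    $min = (int) $_GET['min_sqm'];
    $max = (int) $_GET['max_sqm'];

    $stmt = $db->prepare('SELECT COUNT(*) FROM flats WHERE sqm BETWEEN ? AND ?');
    $stmt->execute(array($min, $max));

    echo $stmt->fetchColumn() . ' hits';   // e.g. "400 hits"
    ?>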

the other project had about 1500 unique visitors a day, displaying data served by an oracle database with millions of rows and complicated joins that took forever (= several seconds) to run. i didn't have much experience in optimizing oracle, but i knew the database was updated only once or twice a week. my solution: i just cached the contents by writing the html to the filesystem. after each update (in the middle of the night) i cleared the cache and began rebuilding it. so, instead of expensive queries i had just cheap filesystem reads. problem solved.
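a sketch of that kind of filesystem cache (the cache directory is an assumption, and the commented-out part stands in for whatever expensive work builds the page):

    <?php
    // serve a pre-rendered html file if it exists, otherwise build it once
    $cacheFile = 'cache/pages/' . md5($_SERVER['REQUEST_URI']) . '.html';

    if (file_exists($cacheFile)) {
        readfile($cacheFile);   // cheap filesystem read instead of slow joins
        exit;
    }

    ob_start();
    // ... run the expensive queries and echo the page here ...
    $html = ob_get_clean();

    file_put_contents($cacheFile, $html);
    echo $html;
    // after the nightly import: delete cache/pages/* and let it rebuild
    ?>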

both examples taught me that performance in web development is not rocket science. most of the time the solution is simple. and: there are other things that are way more important 99% of the time: developer cost and security.

Schnalle
+1 great essay on optimization
Miquel
I'm not sure if this will actually relate to the question when we find out what it actually is, but it's good info.
Chuck
Schnalle