views:

445

answers:

14

What are some important optimizations that can be made to a website to reduce the loading time?

+2  A: 

Before attempting any optimizations, you first need to be able to profile: get Firebug for Firefox. Then you can run an analysis with YSlow that will tell you exactly what to do. Fundamental things that you should do are listed here.

fuzzy lollipop
+1  A: 

Here are a few "best practice" things:

  • Caching CSS, JavaScript, images, etc.
  • Minifying JavaScript files.
  • Gzipping content.
  • Placing links to CSS files at the top of your page, and JavaScript files and inline code at the bottom, when possible.
  • Loading only what is necessary.
  • For an existing website, before you do any of this, determine where your bottlenecks are with tools like Firebug and, as someone else mentioned, YSlow (I highly recommend this tool).
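As a quick illustration of the gzip point, here is roughly what compression saves on repetitive text like CSS (the file name and contents below are made up for the demo):

```shell
# Build a repetitive "stylesheet" and compare raw vs. gzipped size.
for i in $(seq 1 200); do echo 'body { margin: 0; padding: 0; }'; done > sample.css
orig=$(wc -c < sample.css)
gzip -9 -c sample.css > sample.css.gz   # -9 = maximum compression
comp=$(wc -c < sample.css.gz)
echo "original: ${orig} bytes, gzipped: ${comp} bytes"
```

On a real site you would enable this at the web-server level (e.g. Apache's mod_deflate) rather than compressing files by hand.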
bn
+1  A: 

There are two sides you can care about when optimizing:

  • The server side: what matters is generating the output faster.
  • The client side: what matters is getting everything that has to be displayed to the browser faster.

Note: we, as developers, often think about optimizing the server side first... which in most cases represents less than 10% of the loading time of the page!


On the server side, you'll generally want to:

  • profile, to determine what is slow
  • optimize your SQL queries, and reduce their number
  • use caching

For more information, you can take a look at the answer I gave some time ago to this question: Optimizing Kohana-based Websites for Speed and Scalability


On the client side, the biggest gains are generally achieved by :

  • Reducing the number of HTTP requests -- the easiest way being to reduce the number of JS/CSS/images files, by combining several files into one
  • Compressing CSS/JS/HTML, using for instance Apache's mod_deflate.

On that subject, there is a lot of great stuff on Yahoo's Exceptional Performance pages: they've released lots of good practices and tools, such as YSlow.
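A minimal Apache sketch covering both client-side points above, assuming mod_deflate and mod_expires are enabled (the MIME types and cache durations are only examples):

```apache
# Compress text responses on the fly (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static assets (mod_expires); durations are examples
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css               "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresByType image/png              "access plus 1 month"
</IfModule>
```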

Pascal MARTIN
+12  A: 

Remove/minimize any bottlenecks on the server side. For this purpose, use a profiler like Xdebug or Zend Debugger to find out where your application is doing expensive and slow operations. Implement caching where possible, and use an opcode cache. If this still isn't fast enough, consider investing in more CPU, RAM, or SSDs (depending on whether you are CPU-, IO-, or memory-bound).

For general server/client side optimizations, see the Yahoo YSlow! User Guide.

It basically boils down to:

  1. Minimize HTTP Requests
  2. Use a Content Delivery Network
  3. Add an Expires or a Cache-Control Header
  4. Gzip Components
  5. Put StyleSheets at the Top
  6. Put Scripts at the Bottom
  7. Avoid CSS Expressions
  8. Make JavaScript and CSS External
  9. Reduce DNS Lookups
  10. Minify JavaScript and CSS
  11. Avoid Redirects
  12. Remove Duplicate Scripts
  13. Configure ETags
  14. Make AJAX Cacheable
  15. Use GET for AJAX Requests
  16. Reduce the Number of DOM Elements
  17. No 404s
  18. Reduce Cookie Size
  19. Use Cookie-Free Domains for Components
  20. Avoid Filters
  21. Do Not Scale Images in HTML
  22. Make favicon.ico Small and Cacheable

Also see the comments contributed below, as they contain some additional useful information for other users.

Gordon
Also check out Google's PageSpeed (http://code.google.com/speed/page-speed/); it's a good alternative/complement to YSlow and is also a Firebug addon. It catches some things that YSlow doesn't, and vice versa.
Alex
One more thing: Use in-memory sessions if you control enough of the server to install additional plugins and have enough RAM. The memcache plugin includes a session handler. http://php.net/memcache
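For reference, a hypothetical php.ini fragment for the memcache session handler mentioned above (the host and port are examples):

```ini
; Store PHP sessions in memcached instead of on disk
; (requires the pecl memcache extension; host/port are examples)
session.save_handler = memcache
session.save_path    = "tcp://127.0.0.1:11211"
```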
R. Bemrose
Good answer. All these things are explained in more detail on this (excellent) Yahoo article: http://developer.yahoo.com/performance/rules.html
Tom Castle
Google Speed Tracer is another good tool to use, it's a Chrome extension that measures a lot of different aspects of site performance: http://code.google.com/webtoolkit/speedtracer/
Jason Hall
+1  A: 

The simple options I can think of are:

  1. Gzip your (X)HTML, so the compressed file arrives at the user more quickly
  2. minify the CSS
  3. minify the JS
  4. use caching where possible
  5. use a content delivery network
  6. use a tool such as YSlow to identify bottlenecks and further suggestions
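To give an idea of what minification buys, here is a deliberately crude sketch that only strips comments, blank lines, and indentation; a real project should use a proper minifier, and the file contents here are invented:

```shell
# Make a small sample script, then strip comments/blank lines/indentation.
cat > app.js <<'EOF'
// add two numbers
function add(a, b) {
    return a + b;
}
EOF
sed -e 's|//.*$||' -e '/^[[:space:]]*$/d' -e 's/^[[:space:]]*//' app.js > app.min.js
wc -c app.js app.min.js   # the "minified" copy is noticeably smaller
```

Note that a naive sed pass like this would mangle `//` inside string literals; real minifiers parse the code properly.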
David Thomas
@bn, indeed, edited to correct. Thanks =]
David Thomas
+1  A: 

You definitely want to look at caching, as round trips to the DB are expensive. Also, minify your JS.

pjacko
A: 

To reduce network traffic, you can minify static files, such as CSS and Javascript, and use gzip compression on generated content. You can also try using tools such as optipng to reduce the size of images.

However, the first step to take is to actually analyse what's taking all of the time: sending the bits over the network, or actually generating the content to send. There's no point making your CSS files 10% smaller if it takes a minute to generate each HTML page.
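One way to see where the time goes is curl's `-w` timing report. This sketch fetches a local file:// URL so it runs anywhere; point it at your real page instead to see the DNS/connect/transfer split:

```shell
# Create a tiny local page, then report curl's timing phases for fetching it.
echo '<html>hello</html>' > /tmp/perf-demo.html
curl -s -o /dev/null \
     -w 'ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
     file:///tmp/perf-demo.html
```

Against a real URL, a large `time_starttransfer` points at slow content generation, while a large gap between it and `time_total` points at the network.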

Michael Williamson
A: 

Load balancing would help to reduce the loading time immensely.

streetparade
No, it wouldn't, in almost all cases.
MarkR
+1  A: 

The first optimisation is: Decide if it is slow, and if not, don't bother.

This is trickier than it sounds, because it's not like testing a desktop app or game. A game is slow if when you play it on the target hardware, the frame rate is too low. This is very easy to measure.

A web site is trickier, because you, as the developer, are probably using a local test system with a very fast network. Even when you use your staging / system test servers, you're probably still on the local network. Even your production servers are, in all likelihood, on the same continent.

The same is possibly not true for quite a lot of your users.

Therefore the options which exist are:

  • Find out by asking your users, whether they find it to be slow
  • Simulate a high latency environment and test it yourself (or your QA team)
  • Guesswork

The latter is not recommended.
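For the latency-simulation option, on Linux the tc/netem queueing discipline can inject artificial delay. This is only a sketch: it needs root, and the interface name and delay value are illustrative:

```shell
# Add (and later remove) 300 ms of artificial latency on the loopback
# interface using tc/netem. Linux only; must be run as root.
add_latency()    { tc qdisc add dev lo root netem delay 300ms; }
remove_latency() { tc qdisc del dev lo root netem; }
# As root:  add_latency; curl -s -o /dev/null -w '%{time_total}\n' http://localhost/; remove_latency
```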

An option which the holier-than-thou Yahoo web-site performance book (which, yes, is a book you can buy) doesn't mention much is HTTPS. Most web applications which handle important data run mostly or entirely over HTTPS, which changes the rules of the game rather a lot. Remember to do all testing with it enabled.

MarkR
+1  A: 

Install Firebug and the PageSpeed plugin, follow all the PageSpeed directives (as far as possible), and be happy: http://code.google.com/intl/it/speed/page-speed/

Anyway, the most important optimization in my experience is to reduce the number of HTTP requests to a minimum...
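As a sketch of the request-reduction point, several scripts can be concatenated into a single bundle so the page makes one request instead of three (all file names and contents here are invented):

```shell
# Three separate scripts -> one bundle, served with a single <script> tag.
echo 'var a = 1;' > jquery-plugins.js
echo 'var b = 2;' > menu.js
echo 'var c = 3;' > analytics.js
cat jquery-plugins.js menu.js analytics.js > bundle.js
wc -l bundle.js
```

Concatenation order matters when one script depends on another, so list dependencies first.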

luca
A: 

Don't ship unnecessary whitespace in your code; minify it.

scopus
A: 

We recently did this on our web site. We outlined the nine techniques that seemed to have the highest impact with the least difficulty: http://mentormate.com/blog/easy-ways-speed-website-load-time/

Andy
A: 

I wrote some things about this; see:

Google page speed test optimization

cybernetica
A: 

As already mentioned, you can use the YSlow or PageSpeed Firefox extensions. But you can also use GTmetrix, an online service that scans your page with both tools.

Features I like / use:

  • a soft, clean, and usable presentation
  • comparison with another page. It's really interesting to see where your friends / competitors stand.

(By the way, I'm not affiliated with GTmetrix!)

Matthieu FAURE