views: 182

answers: 4
I'm currently working on a website in PHP, and I'm wondering what the best practices/methods are to reduce the time requests take. I've built the site in a modular way, so a page consists of a number of modules, and each of these needs to request information.

For example, I have a cart module, that (if a cart is set) will fetch the cart with the id (stored in a session variable) from the database and return its contents. I have another module that lists categories and this needs to fetch the categories from the database.

My system is built with models, and each model might also make a request; for example, a category model will make a request to get the products in that category.

For those interested, I'm running the application on Windows Server 2003 with IIS at the moment, but I am hoping to change to Linux in the near future. I know this is a broad subject; I'm just curious about what to look out for and which tools to use to help with the load. The answers so far have been very helpful.

+2  A: 

Your high-level approach sounds reasonable, although it would be helpful to weigh such an approach against your actual code, data, and environment. That said:

A quick and easy way to make your code run faster is to use a tool such as eAccelerator, which caches compiled PHP scripts. That way, when a second request comes in, the script does not have to be compiled a second time, which improves performance.

If you are developing a site with a large number of users, you might consider caching data from the database, using a tool such as memcached.
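As a rough sketch of what that looks like (this assumes the PECL Memcached extension and a PDO connection; the function name, table, and connection details are made up for illustration):

    <?php
    // Sketch only: read-through cache for a category list using the PECL
    // Memcached extension. On a cache miss we hit the database once and
    // store the result so later requests skip the query.
    function get_categories(PDO $pdo, Memcached $cache) {
        $key = 'categories:all';

        // Try the cache first.
        $categories = $cache->get($key);
        if ($categories !== false) {
            return $categories;
        }

        // Cache miss: query the database, then cache the rows for 10 minutes.
        $stmt = $pdo->query('SELECT id, name FROM categories ORDER BY name');
        $categories = $stmt->fetchAll(PDO::FETCH_ASSOC);
        $cache->set($key, $categories, 600);

        return $categories;
    }

    // Placeholder connection details - replace with your own.
    $pdo   = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $categories = get_categories($pdo, $cache);

The same pattern works for your cart module: key the cached entry on the cart id from the session and invalidate it whenever the cart changes.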

Justin Ethier
Thanks for the info, I'll take a look at eAccelerator; have you had any experience with it? As for users, yes, this website would (eventually) be accommodating a large number of users, so I'll check out memcached. I've also heard about storing the database in RAM.
cast01
Yes, I am using eAccelerator in my app and it has worked out well for us. Unfortunately I do not have any solid metrics for you, but it is essentially transparent to your app, so you can easily test it out if you wish.
Justin Ethier
Did you mean "running your code in your environment and data" instead of "seeing the code"?
Col. Shrapnel
+3  A: 

The only practice/method to reduce the time requests take is called profiling.
First you determine the "bottleneck" - the slowest part of your application.
Then speed up that very part of the code.

Only that way.

Doing things "just in case" can make things worse.

The "Net" page in the Firebug console is good place to start.

Col. Shrapnel
A: 

Maybe you are looking for tips like these; you can Google for this. An example here.

  • require_once() is expensive
  • Use echo’s multiple parameters instead of string concatenation.
  • See if you can use strncasecmp, strpbrk and stripos instead of regex
  • Error suppression with @ is very slow.
  • $row['id'] is 7 times faster than $row[id]

These are just general good practices, but if your code is running slowly you need to profile your app, find out which areas are slow, and address them accordingly.
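For what it's worth, the quoted-key and echo tips above look roughly like this in practice (the differences are tiny, so measure before you care):

    <?php
    $row = array('id' => 42, 'name' => 'Widget');

    // Quote array keys: $row['id'] is a plain string lookup, whereas $row[id]
    // relies on an undefined constant and raises a notice on top of being slower.
    $id = $row['id'];

    // echo accepts several arguments, which avoids building a temporary string:
    echo 'Item ', $row['name'], ' has id ', $id, "\n";

    // versus concatenation, which creates the combined string first:
    echo 'Item ' . $row['name'] . ' has id ' . $id . "\n";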

Abs
"Use echo’s multiple parameters instead of string concatenation." Maybe I'm misunderstanding the tip, but it's always been my understanding that multiple calls to `echo` is more expensive than `.=`
dclowd9901
Congrats! You win "most stupid optimization recommendations" badge :)
Col. Shrapnel
Oh no, not this link! Am I allowed to use the word "bullshit" here on SO?
Col. Shrapnel
@Col - hilarious. Now why don't you give reasons why...bullshit isn't one of them.
Abs
All these "advises" not from the real world. Most of them just stupid. In a real application you can't even measure the difference between echo and print for example. Try to learn profiling and and real world tests, such as apache benchmark. Try to learn yourself instead of copy-pasting bunch of nonsense.
Col. Shrapnel
@dclowd9901 No, use echo like this: echo 'hello', 'me', $var; rather than echo 'hello'.'me'.$var;
Abs
@Col - Again, your statements are mostly insults. Is your only problem with my answer that the improvements are marginal? You will notice I started my answer with "maybe you are looking for tips like these" and ended it with the advice to profile. Read my answer fully before you scoff at silly things.
Abs
Not marginal but imaginary. If you think it's marginal, why did you post this ridiculous link here?
Col. Shrapnel
Take your own app. Use the Apache Benchmark utility to measure its response time. Do it many times with a high concurrency level. Then change all echos to prints and params to concats. Then run the tests again. Then see if you can distinguish anything.
Col. Shrapnel
You'll notice that I haven't compared echo or print in my answer. If you have a problem with that article, talk to the guy who wrote the post, not me. You can question me about the points I have highlighted, though... What do you have to say about using require_once or include, especially without an absolute path?
Abs
Everything is the same. You guys have no clue. You never think of PHP itself, of the web server, of the OS. All these programs do millions of file lookups while processing a single HTTP request. Does it make anything slow? The whole directory tree is surely cached, and a file lookup takes no time. You're talking of profiling. Did you ever profile your code for real? So, did you notice any significance in include - the include operation itself, not the code it executes? Of course not. Take my advice, mate. Profile your code once or twice. You'll see the real world.
Col. Shrapnel
+1  A: 

require_once() is expensive

Use echo’s multiple parameters instead of string concatenation.

See if you can use strncasecmp, strpbrk and stripos instead of regex

Error suppression with @ is very slow.

$row['id'] is 7 times faster than $row[id]

All these things are micro-optimizations. It's not even worth looking at them until you've already done the things that actually matter, like caching and reducing the number of unnecessary database queries. I would also recommend downloading Firebug and using the Page Speed tool; it can minify your CSS and optimize your images for you. PHP syntax overhead is rarely large enough to worry about.

Edit: Using @ is generally bad practice anyway. I've also never found the need for require_once; __autoload seems much easier.
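A minimal sketch of what I mean (spl_autoload_register is the more flexible way to register an autoloader than defining __autoload directly; the models/ directory and the CartModel class name are just assumptions about your layout):

    <?php
    // Sketch: register an autoloader so classes are loaded on first use,
    // instead of scattering require_once calls through every script.
    // Assumes classes live in models/ and map like CartModel -> models/CartModel.php.
    spl_autoload_register(function ($class) {
        $file = __DIR__ . '/models/' . $class . '.php';
        if (is_file($file)) {
            require $file;
        }
    });

    // Hypothetical class; its file is pulled in automatically on first use.
    $cart = new CartModel();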

Lotus Notes
Same for $row['id']. It's just a syntax issue, not an "optimization".
Col. Shrapnel
@Bryonh - you will notice the question asked is ambiguous; we don't even know why the app/site is slow. I gave general suggestions. Btw, I think your response is better suited as a comment on my answer rather than as an answer to the question.
Abs