views:

315

answers:

11

I am trying to figure out just how "efficient" my server-side code is.

Using start and end microtime(true) values, I am able to calculate the time it took my script to run.

I am getting times from .3 - .5 seconds. These scripts do a number of database queries to return different values to the user.
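For reference, the timing approach described in the question can be sketched like this (the loop is just a stand-in for the real database work):

```php
<?php
// Record a high-resolution start timestamp.
$start = microtime(true);

// Stand-in for the real work (database queries, etc.).
$sum = 0;
for ($i = 0; $i < 100000; $i++) {
    $sum += $i;
}

// Elapsed wall-clock time in seconds.
$elapsed = microtime(true) - $start;
printf("Script ran in %.4f seconds\n", $elapsed);
```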

What is considered an efficient execution time for PHP scripts that will be run online for a website?

Note: I know it depends on exactly what is being done, but just consider this a standard script that reads from a database and returns values to the user. Also, I look at Google and see them search the internet in .15 seconds and I feel like my script is crap.

Thanks.

+1  A: 

This is of course very subjective, depends on the site, etc. etc.

However, I'd say that when a page starts taking longer than around 100 milliseconds, it's a noticeable delay for the user, and that might be "too long". That's if this is a page that can reasonably be expected to load instantaneously. If the page is a search page, doing a fulltext search in a large database, the situation is of course different.

calmh
A: 

With PHP, most pages are generated so fast that the biggest delay is page rendering, including all sub-requests such as images. The visitor's connection speed and quality, computer, and software are also important factors.

To be generous, let's say PHP accounts for 20% of the total load and rendering time of a web page. That percentage is a rough approximation, but it serves as an illustrative example.

An average page load takes around 3 seconds (which is too much). A good-quality website should be fully loaded in about 1 second, which gives PHP 200ms (20% of 1 sec) to generate the output. By the same reasoning, PHP could take up to 600ms on an "average" website.

Note: The PHP execution time can be improved by changing your host, or by improving your source code.

Daan
+1  A: 

I'd say a tenth of that would be OK. The number of queries doesn't matter, though: there can be 20 of them, each running in 0.005 sec. It's the quality that matters, not the quantity. Profile your code by adding some more microtime() statements, find the slowest part, and then optimize it.

If you have your own wrapper function for MySQL queries, that's a very handy place to put the microtime() calls.
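A minimal sketch of that idea: a generic wrapper that times any callable and logs the elapsed time per label. The names here are hypothetical; in a real script the callable would be your actual query call.

```php
<?php
// Hypothetical profiling wrapper: times any callable and logs the result.
$queryLog = [];

function timed_call(callable $fn, string $label, array &$log)
{
    $start = microtime(true);
    $result = $fn();
    $log[$label] = microtime(true) - $start;   // seconds elapsed
    return $result;
}

// Usage: wrap each query (here usleep() pretends to be a slow query).
$rows = timed_call(function () {
    usleep(20000);              // pretend this is the database round-trip
    return ['row1', 'row2'];
}, 'fetch_users', $queryLog);

arsort($queryLog);              // slowest queries first
print_r($queryLog);
```

Sorting the log descending makes the slowest part jump out immediately.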

Col. Shrapnel
+2  A: 

YouTube's target page rendering time is < 100ms (Video here @7:00).

Your bottleneck is probably DB queries - try using

EXPLAIN select * from x...

to see if you can add indexes that will speed up your queries.
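One way to act on the output: a full table scan shows up in EXPLAIN as type `ALL` with no chosen `key`. A small helper (hypothetical, and assuming the associative-array row shape that mysqli/PDO return when fetching EXPLAIN results) can flag those rows:

```php
<?php
// Heuristic check on one row of EXPLAIN output: a "type" of ALL with no
// chosen "key" means MySQL will scan the whole table - a candidate for
// an index. (Row shape assumed to match mysqli/PDO associative fetches.)
function needs_index(array $explainRow): bool
{
    return ($explainRow['type'] ?? '') === 'ALL'
        && empty($explainRow['key']);
}

// Example rows, shaped like EXPLAIN might return them:
$scan    = ['table' => 'users', 'type' => 'ALL',   'key' => null];
$indexed = ['table' => 'users', 'type' => 'const', 'key' => 'PRIMARY'];

var_dump(needs_index($scan));    // true  - consider adding an index
var_dump(needs_index($indexed)); // false - already using PRIMARY
```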

Andy
+2  A: 

Depends, as stated, but in addition consider this: with a one-second execution time, you will be able to serve (under ideal conditions) only one request per second on a machine with one CPU and nothing else running on it. If more than one request per second comes in, a long queue builds up and the server runs flat out, causing incoming requests to take even longer to process. Even with fewer requests you still need to watch CPU utilization: if the server is already heavily loaded, you may have a problem that needs attention.
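The arithmetic behind that: with a service time of S seconds per request and N CPUs, the server saturates at roughly N/S requests per second. A quick sanity-check sketch (a back-of-the-envelope estimate, not a full queueing model):

```php
<?php
// Rough capacity estimate: N CPUs, each request holding a CPU for S seconds.
function max_throughput(int $cpus, float $serviceTimeSec): float
{
    return $cpus / $serviceTimeSec;   // requests/sec at 100% utilization
}

function utilization(float $arrivalRate, int $cpus, float $serviceTimeSec): float
{
    return $arrivalRate * $serviceTimeSec / $cpus;  // fraction of capacity used
}

// One CPU, 1s per request: saturates at 1 request per second.
echo max_throughput(1, 1.0), "\n";

// One CPU, 0.4s per request, 2 requests/sec arriving: 80% utilized -
// a queue will build as soon as traffic gets bursty.
echo utilization(2.0, 1, 0.4), "\n";
```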

There are mathematical methods (queueing theory) that can be used to analyze capacity requirements, see for example PDQ (http://www.perfdynamics.com/Tools/PDQ.html) for more.

Comparing to Google may therefore not be fair, since they handle massive numbers of incoming requests, and with a 3x longer execution time they would need several times more servers than they already have...

aaspnas
A: 

Hmm... I'm not sure an absolute value is quite fair here. It really depends on the hardware. When I develop locally, my developer machine runs something like 5-10 times slower than the actual server. So the "acceptable" absolute range would vary depending on the hardware.

Well, generally I try to keep things below 100 ms. If the server load time is higher, I'll trace the execution and try to figure out what's wrong. I have to say that most of the time, the database (hence the queries) is the bottleneck, so real work there really pays off.

Savageman
A: 

Aim for < 200ms.

People increasingly lose patience with things that take > 200ms.

leeeroy
A: 

It's all relative, really. Don't expect to get times on par with other sites using PHP. Remember, PHP needs to load everything from scratch on each page load.

Really you want to see how well your site does under load, like using Apache ab to test it. If your site can handle the highest traffic level you can expect, then you don't need to optimize it anymore. A user isn't going to be able to tell if your page loads in .75 seconds or .25 seconds.

Remember, calling microtime itself adds time to your page load, since it involves a call into the operating system. It may be more valuable to optimize the page itself, making it smaller so it travels over the network faster and renders faster once on the client.
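The overhead of microtime() is easy to measure directly; a quick sketch:

```php
<?php
// Measure the average cost of one microtime(true) call by making many.
$iterations = 100000;
$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    microtime(true);
}
$perCall = (microtime(true) - $start) / $iterations;

// Typically a tiny fraction of a millisecond per call - negligible for a
// handful of calls, but worth knowing if you instrument every query.
printf("~%.1f ns per microtime() call\n", $perCall * 1e9);
```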

Brent Baisley
+1  A: 

I notice that, as an editor on Wikipedia, we don't see complaints until page loads take somewhere around 5 to 10 seconds. Of course, the mechanism for reporting such slowness is obscure to most users.

For myself—as a user of travel websites—I am adequately placated by an intermediate screen which says "Got your request. It's now processing. It might take up to X seconds."

wallyk
A: 
  • I wouldn't compare your script with Google, unless you're maintaining a similar page rank.. etc.

  • If the search merely retrieves values from a database, speed may be improved by profiling the application and eliminating bottlenecks (to mention a few: scripts on the page, large images, large tables, missing database indexes)

Everyone
A: 

Does it need to be faster and why?

If the answer is "Yes, because it's on the requirements list" or "Because it takes valuable server resources", then try to optimize your SQL queries. Maybe you need to add index(es)...

Otherwise, I think you should move on to the next task. First, it works, and second, you are talking about .3 to .5 sec., which is fast enough for humans and machines. Let it go, man :)

Cheers

lunohodov