
By "common script startup sequence", what I mean is that in the majority of pages on my site, the first order of business is to consult 3 specific files (via include()), which centrally define constants, certain functions used in many scripts, and a class or two, as well as providing the database credentials. I don't know if there's a more standard term for such a setup.

What I want to know is whether it's possible to have too many of these and make things slower as a result. I know that include() carries a certain amount of overhead, because it's another file to look for in the filesystem, parse, and execute. If there is such a thing as too many includes, I want to know whether I am anywhere near that point. N.B. Some of my pages include() still more scripts that they specifically, individually need (for example, a script that defines a function used by only a few pages); I don't count these occasional extra includes, which are used reasonably sparingly anyway. I'm only worried about the 3 includes that occur on the majority of pages and set everything up.

What are the 3 includes?

Two of them are outside of webroot. common.php defines a bunch of functions, classes and other things that do not vary between the development and production sites. config.php defines various constants and paths that are different in the development and production sites (which database to connect to, among other things). Of course, it's desirable for this file in particular to be outside of webroot. config.php include()s common.php at the bottom.

The other one is inside webroot and contains a single line:

<?php include '[path to appropriate directory]/config.php';

The directory differs between the development and production sites.
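
For concreteness, a sketch of what this chain might look like — the paths, database constants, and values here are hypothetical, not my actual ones:

    <?php
    // bootstrap file inside webroot -- the single line shown above
    include '/home/example/private/config.php'; // hypothetical path; differs between dev and prod

    <?php
    // config.php (outside webroot) -- environment-specific settings
    define('DB_HOST', 'localhost');  // hypothetical constants;
    define('DB_NAME', 'myapp_dev');  // different values on the production site
    // shared functions/classes, identical in dev and prod, included at the bottom:
    include dirname(__FILE__) . '/common.php';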

(Feel free to question the rationale behind setting up the includes this way, but I feel that this does provide a good, reliable system for preparing to execute each page, and my question is about whether it is bad to have that many includes as a baseline on each page.)

+2  A: 

Use APC and your worries go away. The opcodes of your files will be cached in RAM and everything will go super fast. :) Facebook does this, so it'll definitely help you scale.

You may not notice any difference between 1 include and 50 in terms of speed, but for an application with high concurrency, I/O can be a huge bottleneck. So the key is not speed, but scaling.
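
For instance, a minimal php.ini sketch for enabling APC might look like this (the size is illustrative — tune it to fit your codebase; older APC versions take a bare number of megabytes instead of "64M"):

    ; load the APC extension (php_apc.dll on Windows)
    extension=apc.so
    ; enable the opcode cache
    apc.enabled=1
    ; shared memory for cached opcodes
    apc.shm_size=64M
    ; stat files for changes on each request; some production setups set this to 0
    apc.stat=1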

galambalazs
A: 

I don't believe performance has much to do with the number of includes. Think of a case where one included file contains 500 lines of code, versus another where you have 50 included files with just one line of code each.

Starx
I/O can be a huge bottleneck in an application. Accessing 50 files for each request can mean a large overhead.
galambalazs
Try writing a test to see it for yourself. In my case the bottleneck was the hard drive. Anything that reduces I/O is appreciated: use fewer files, less I/O, and an accelerator of some sort.
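
A minimal sketch of such a test — it assumes you generate 50 one-line part files and one combined file beforehand (the file names are hypothetical):

    <?php
    // Compare many small include()s against one combined include.
    // Assumes parts/part_0.php .. parts/part_49.php and combined.php exist.
    $start = microtime(true);
    for ($i = 0; $i < 50; $i++) {
        include "parts/part_{$i}.php";
    }
    $many = microtime(true) - $start;

    $start = microtime(true);
    include 'combined.php';
    $single = microtime(true) - $start;

    printf("50 includes: %.5fs / 1 include: %.5fs\n", $many, $single);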
dwich
A: 

Or, if by any chance you are using Windows as your OS, you can use WinCache:
http://php.net/manual/en/book.wincache.php
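
A minimal php.ini sketch for enabling it (directive names per the linked manual; values illustrative):

    ; load the WinCache extension
    extension=php_wincache.dll
    ; enable the opcode cache and the file cache
    wincache.ocenabled=1
    wincache.fcenabled=1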

kanenas.net
+1  A: 

The best thing to do is to use an accelerator of some kind, such as APC or eAccelerator, to keep your scripts cached in RAM. There are quite a few reasons for this, and on a busy site it means a lot.

For example, a friend ran an experiment on his website, which gets about 15k users a day with an average page load time of 0.03s. He removed most of the includes he used as templates, and the average load time dropped to 0.01s. Then he added an accelerator: 0.002s per page. I hope those numbers convince you that on busy sites includes must be kept to a minimum if you don't use an accelerator of some kind.

This is because of the I/O needed to scan directories, find the files, open them, read them, and so on.

So keep the includes to a minimum. Study the most important parts of your site and optimize there, for example by consolidating commonly required parts into a few general includes.

bisko