The question might prompt some people to say a definitive YES or NO almost immediately, but please read on...

I have a simple website with 30 PHP pages (each has some PHP server-side code plus HTML/CSS, etc.). No complicated hierarchy, nothing. Just 30 pages.

I also have a set of purely back-end PHP files - the ones that have code for saving things to the database, doing authentication, sending emails, processing orders, and the like. These will be reused by those 30 content pages.

I have a master PHP file to which I send a parameter specifying which of those 30 files is needed, and it includes the appropriate content page. But each of those pages may require a variable number of back-end files to be included. For example, one content page may need nothing from the back-end, another might need the database code, while something else might need the emailer, database and authentication code, etc.
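Roughly, the master file works like this (the names here are simplified for illustration, not my real ones):

<?php
// master.php - picks one of the 30 content pages based on a parameter
$page = isset($_GET['page']) ? $_GET['page'] : 'home';

// whitelist of the 30 content pages so nothing else can be included
$pages = array('home', 'about', 'order'); // ...and so on

if (in_array($page, $pages, true)) {
    include 'content/' . $page . '.php';
} else {
    include 'content/notfound.php';
}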

I guess whatever back-end file is required could be included in the appropriate content page, but one small change in a path and I have to edit tens of files. It would also be too cumbersome to check which content page is requested (a switch-case type of thing) in the master PHP file and include the appropriate back-end files there; again, I would have to make many changes if a single path changed.

Being lazy, I included ALL back-end files in the master file so that no content page can request something that is not included.

First question: is this a good practice? Is it done by anyone at all?

Second, will there be a performance problem, or any other kind of problem, caused by including all the back-end files regardless of whether they are needed?

EDIT

The website gets anywhere between 3000 - 4000 visits a day.

+1  A: 

It will slow down your site, though probably not by a noticeable amount. It doesn't seem like a healthy way to organize your application, though; I'd rethink it. Try to separate the application logic (e.g. most of the server-side code) from the presentation layer (e.g. the HTML/CSS).

Johannes Gorset
Even if I do that, the problem of including several shared files remains. I hope I am making sense: assume I remove the presentation stuff. The business logic will still have its own set of PHP files, and all of those will require a variable number of common PHP files, so the same include problem comes up again. What if I rearrange/add/delete/merge some common files? I would have to go to each business-logic PHP file and change the include paths. If I have a good idea of the performance hit, I can make a choice.
thedeveloper
+4  A: 

You should benchmark. Time the execution of the same page with different includes. But I guess it won't make much difference with 30 files.
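For a quick and dirty measurement, something like this gives a rough idea (the file names are just placeholders for whatever includes you want to time; it is not a proper benchmark):

<?php
$start = microtime(true);
require 'emailer.php';    // swap in whichever includes you want to measure
require 'database.php';
echo 'includes took ' . round((microtime(true) - $start) * 1000, 2) . ' ms';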

But you can save yourself the time and just enable APC in php.ini (it is a PECL extension, so you need to install it). It will cache the compiled opcodes of your files, which will speed things up significantly.
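Once the extension is installed, enabling it in php.ini looks roughly like this (the values are just examples):

; php.ini
extension=apc.so
apc.enabled=1
apc.shm_size=64   ; shared memory for the opcode cache, in MB
apc.stat=1        ; check file modification times so edited files are re-cached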

BTW: There is nothing wrong with laziness, it's even a virtue ;)

soulmerge
Hi, thanks for suggesting that APC thing :) Looks like it'll come in handy **if** I stick to this way of including files.
thedeveloper
Just did some research on APC and looks like it is what I need. I installed APC. Have to profile with and without it. But from what people say, it is a good thing to have regardless of noticeable performance differences.
thedeveloper
A: 

It's not a bad practice if the files are small and contain just definitions and settings. If they actually run code, or are extremely large, it will cause a performance issue. Now, if your site has 3 visitors an hour, who cares; if you have 30000, that's another issue, and you need to work harder to minimize that overhead.

Dani
A: 

I live by "include as little as possible, as much as necessary", so I usually just include my config and session handling for everything, and then each page includes only what it needs, using an include path defined in the config include. That way, for path changes you still only need to change one file.

If you include everything, the slowdown won't be noticeable until you get a lot of page hits (several hits per second), so in your case just including everything might be OK.
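A rough sketch of what I mean (the file and directory names are made up):

<?php
// config.php - included by every page; the include path is defined once here
set_include_path(dirname(__FILE__) . '/lib' . PATH_SEPARATOR . get_include_path());
require_once 'session.php';   // session handling, needed everywhere

A content page then pulls in only what it actually needs:

<?php
// orders.php
require_once 'config.php';
require_once 'database.php';  // resolved via the include path set in config.php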

dbemerlin
A: 

You can mitigate some of the disadvantages of PHP compiling code on every request by using XCache. This PHP module will cache the PHP opcode, which reduces compile time and improves performance.

Wimmer
+1  A: 

If your site is object-oriented, I'd recommend using auto-loading (http://php.net/manual/en/language.oop5.autoload.php).

This uses a magic method (__autoload) to look for a class when needed (it's lazy, just like you!), so if a particular page doesn't need all the classes, it doesn't have to get them!

Again, though, this depends on if it is object-oriented or not...
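A bare-bones sketch of the idea (the classes/ directory and the class-to-file naming convention are just examples; registering the callback with spl_autoload_register avoids defining __autoload directly):

<?php
// hypothetical autoloader - maps a class name to a file under classes/
function my_autoloader($class) {
    $file = dirname(__FILE__) . '/classes/' . $class . '.php';
    if (is_file($file)) {
        require $file;   // loaded only the first time the class is used
    }
}
spl_autoload_register('my_autoloader');

$mailer = new Mailer();  // triggers my_autoloader('Mailer') on demand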

Dan Beam
Hey, thanks! :) A large portion of my code is OO. I'll check out the link..
thedeveloper
Although I use autoloading all the time (it's great), it's bad advice when speed is the issue. Autoloading is way slower than including everything, and slower still than selectively including things manually. Lastly, it's bad practice to use __autoload; use spl_autoload instead.
Evert
A: 

Considering the size of your website: if you haven't noticed a slowdown, why try to fix it?

When it comes to larger sites, the first thing you should do is install APC. Even though your current method of including files might not benefit as much from APC as it could, APC will still do an amazing job speeding stuff up.

If response speed is still problematic, you should consider including all your files. APC will keep a cached version of your source files in memory, but it can only do this well if there are no conditional includes.

Only when your PHP application is at a size where memory exhaustion is a real risk (note that for most large-scale websites, memory is not the bottleneck) might you want to conditionally include parts of your application.

Rasmus Lerdorf (the man behind PHP) agrees: http://pooteeweet.org/blog/538

Evert
To answer the question in your first line - if I am neck deep in this method of including, and **marketing** pulls off a miracle and the website gets 50000 visits a day, I'll be in trouble.
thedeveloper
500,000 visits is still not a cause for concern. I suggest spending time on things like webserver configuration and SQL queries in that case.
Evert
A: 

As others have said, it shouldn't slow things down much, but it's not 'ideal'.

If the main issue is that you're too lazy to go changing the paths for all the included files (if the path ever needs to be updated in the future), then you can use a constant to define the path in your main file, and use the constant any time you need to include/require a file.

define('PATH_TO_FILES', '/var/www/html/mysite/includes/go/in/here/');

require_once PATH_TO_FILES.'database.php';
require_once PATH_TO_FILES.'sessions.php';
require_once PATH_TO_FILES.'otherstuff.php';

That way if the path changes, you only need to modify one line of code.

Vex
A: 

It will indeed slow down your website, mostly because of the relatively slow loading and parsing of PHP. The more code you include, the slower the application will get.

koko