I have recently taken on a PHP web developer position at a large firm. They have multiple sites that share much of the same content (mainly news articles). Although the sites are written in PHP, the content is static and is manually edited on every site whenever a news article is added or other information changes. The "senior programmer" there is against using a CMS for some reason, against upgrading to PHP 5, against installing mod_rewrite; basically, I have to work within very narrow parameters.

I have taken the first 3 weeks to write a whole lot of classes to make some sort of sanity out of the mess, but I have to find a way to replicate this data in an easy manner. I am thinking of something without the need for a database (the head guy doesn't want to decentralise data, so databases anywhere other than the intranet are a no-no). I just want a centralised XML file or something, even if I need to hand-edit it... any ideas?

A: 

Use one folder for all documents which your websites consume.

Koistya Navin
All the sites are on different hosts..
About 6 different sites, 2 different languages, which = 12 different changes every time I need to add something..
A: 

You could use cURL to access what you need over HTTP. Setting the CURLOPT_RETURNTRANSFER option to 1 will allow you to get whatever content you would like: XML, HTML, etc.
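
A minimal sketch of the idea (the master URL is hypothetical; in production you would want a cached fallback):

<?php
// Fetch shared content from the central site over HTTP.
$ch = curl_init('http://master.example.com/news/latest.xml');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // don't hang the page if the master is down
$content = curl_exec($ch);
curl_close($ch);

if ($content !== false) {
    echo $content;
} else {
    // Handle the failure here, e.g. serve a locally cached copy.
}
?>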

Jesse Dearing
+1  A: 

Treat PHP as a templating engine, and have all the main pages pull the HTML-marked-up (or XML) articles from an RSS feed served by one site.

To add a new article, edit the feed and add the marked-up article to the main site; it's all very, very simple, easy to understand, and it scales.

No need to involve a 'real' CMS, a database, or anything else (see the sketch below).
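
A minimal sketch of a consuming page, assuming the MagpieRSS library (which runs on PHP 4); the feed URL is hypothetical:

<?php
require_once 'magpierss/rss_fetch.inc'; // MagpieRSS, PHP 4 compatible

// Pull the shared feed from the master site.
$rss = fetch_rss('http://master.example.com/news.xml');

foreach ($rss->items as $item) {
    echo '<h2>' . $item['title'] . '</h2>';
    echo $item['description']; // the marked-up article body
}
?>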

Adam Davis
Then you could easily have the RSS feed available to those with feed readers. Nice solution! +1
Jesse Dearing
This is what I initially had in mind, although my experience with RSS as a developer is nil.
RSS is just a very simple XML file of links. You can edit it by hand a few times to get used to it before you start using a tool; a minimal feed looks like the example below.
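
For reference (all URLs and titles are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Company News</title>
    <link>http://master.example.com/</link>
    <description>Shared news articles</description>
    <item>
      <title>First article</title>
      <link>http://master.example.com/news/first-article.html</link>
      <description>Article body, HTML-escaped or wrapped in CDATA.</description>
    </item>
  </channel>
</rss>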
Adam Davis
Thanks dudes, I got it sorted: using Magpie to read a static XML file which I update irregularly..
A: 

Have one master site, on which the original content is created & edited.

On all the others, use the Apache ErrorDocument handler in .htaccess to route 404 errors to a new PHP script.

ErrorDocument 404 /syndicate.php
In syndicate.php, take the URI (from $_SERVER['REQUEST_URI']), fetch the contents from the original domain, and output it with a 200 OK header (so it's not seen as a 404).
<?php
$master_domain = 'http://master.com';

// file_get_contents() returns false on failure, so test with !==.
$master_html = file_get_contents($master_domain . $_SERVER['REQUEST_URI']);

if ($master_html !== false && $master_html != '') {
    header('HTTP/1.0 200 OK'); // override the 404 status Apache would otherwise send
    echo $master_html;
} else {
    header('HTTP/1.0 404 Not Found');
}
?>

The duplicate content will still be served under the requested URL.

adam
Not all the sites have the basic necessities, such as AllowOverride being enabled..
A: 

If the senior programmer is against industry-standard tools like CMSes, PHP 5 (at least as compared with PHP 3/PHP 4; no need for a language holy war), mod_rewrite, etc., it's time to involve management. Such an attitude is unacceptable.

ceejayoz
I'm still on my trial period; I am trying to keep a low profile as far as conflict is concerned..
In my first week of my trial period I pushed hard to abandon Joomla in favour of Drupal, and it worked out well. It's not necessarily a bad thing to have a little conflict, especially when you're clearly correct.
ceejayoz