I am working on a PHP application. Everything works perfectly; the only problem is the following.

I have enabled SEO-friendly URLs, which rewrite the actual URLs into virtual URLs (I know you guys know how this works).

Example: hxxp://www.website.com/index.php?page=about-us to hxxp://www.website.com/page/about-us/

What I want to achieve: if the SEO URLs / mod_rewrite are disabled, the user should still be able to access the direct/actual URLs.

In brief, if mod_rewrite is enabled, the web application should automatically use the SEO-friendly URLs; otherwise it should fall back to the default URLs.

+1  A: 

You would have to replace all occurrences of links with a function that checks whether mod_rewrite is available (or, more likely, checks a config value) and returns the appropriate link.

getLink("?page=about-us")
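
A minimal sketch of such a helper (the getLink() implementation and the $config['use_mod_rewrite'] flag are made-up illustrations, not an existing API):

function getLink($query) {
    global $config; // hypothetical config array holding the rewrite flag
    if (!empty($config['use_mod_rewrite'])) {
        // "?page=about-us" becomes "/page/about-us/"
        parse_str(ltrim($query, '?'), $params);
        return '/page/' . urlencode($params['page']) . '/';
    }
    // mod_rewrite unavailable: fall back to the real URL
    return '/index.php' . $query;
}

// Usage:
echo '<a href="' . getLink('?page=about-us') . '">About us</a>';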

Byron Whitlock
A: 

Use an <IfModule> block to avoid breaking other .htaccess directives and/or triggering 500 Internal Server Errors when Apache doesn't understand your rules. Also add a single non-rewriting RewriteRule (before all others):

<IfModule mod_rewrite.c>
    RewriteEngine On
    # The next rule does no rewriting, but sets an environment variable.
    RewriteRule .* - [E=RewriteCapable:On]
</IfModule>

In your PHP code (store it as a setting, or check it in the places that generate/output URLs):

// Note: after an internal rewrite Apache may expose the variable
// as REDIRECT_RewriteCapable instead, so check both.
if (isset($_SERVER['RewriteCapable']) || isset($_SERVER['REDIRECT_RewriteCapable'])) {
    // make fancy URLs
} else {
    // kludgy old-style URLs
}
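
You could capture this check once at bootstrap time and test the result wherever URLs are generated (the SEO_URLS constant is a made-up name):

// Set once, e.g. in a config file included by index.php. mod_rewrite
// only sets the variable when it is active, so its presence tells you
// that rewriting works.
define('SEO_URLS',
    isset($_SERVER['RewriteCapable']) || isset($_SERVER['REDIRECT_RewriteCapable']));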
Wrikken
A: 

Hmm... this may need a bit of thought to get to the correct solution. Follow me here if you will :)

SEO URLs were primarily introduced (1) to include human-readable text in the URL and (2) to get rid of GET parameters.

To look at point (2) for a moment: this was the primary driver initially, because people used about.php?id=1, id=2 ... id=3457348 to get the same page listed in the search engines multiple times, which of course got detected and stopped. Then sometimes people would pass a session id=24234234, which would also get stopped as a duplicate page (rightfully so, as it treats HTTP as a stateful protocol when it's not).

With a URL, everything from the first character up to the # of a #fragment defines a resource (from an HTTP perspective). So when several different URLs all resolve to the same 'page', they are, rightly, duplicates.

So by dropping the GET parameters you solve this problem. Mind you, it isn't actually a problem any more and hasn't been for a long time; there's no reason not to use GET params properly, other than vanity.

So really you solve no problem, but instead introduce a new one: you want '/page/about-us' and '?page=about-us' to both resolve to the same 'page', which means you have duplicate resources again; this could be detected, and you could get penalised.

Thus, by introducing 'SEO URLs' you've actually created the problem SEO URLs were 'invented' to counteract.

This only leaves the point about human-readable words in the URL. URLs are supposed to be opaque, so they don't count for anything in reality, but some people still like them. So I'd have to ask: what's wrong with using '/?/page/about-us'? And if you don't like that, what's wrong with creating a fixed file with the filesystem path '/page/about-us' which simply includes your index.php with the right variables set?
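
A sketch of that last suggestion (it assumes index.php reads $_GET['page'] and lives two directory levels above the stored file):

<?php
// Physical file stored at page/about-us/index.php, so the URL
// /page/about-us/ works with no mod_rewrite at all.
$_GET['page'] = 'about-us'; // pretend ?page=about-us was passed
require dirname(__FILE__) . '/../../index.php';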

Of course you can create duplicate pages and have both the SEO-friendly URLs and the GET-param URLs, but as you can see, that won't be very SEO-friendly now, will it?

Something to chew on :)

nathan
Well, I thought of that, and I'm still thinking about it. But I want to give users the ability to have the URLs the way they want them (sort of like the permalinks feature of WordPress).
Roccos