I'm writing an application that gets data from URLs, but I want to make it optional whether the user uses "clean" URLs (e.g. http://example.com/hello/world) or "dirty" URLs (e.g. http://example.com/?action=hello&sub=world).

What would be the best way to get variables from both URL schemes?

A: 

Regular Expressions. Here's a link.

madcolor
A: 

Sounds like a pain, but I guess you could create a function that analyzes the URL on every request. First determine whether it's a "dirty" or "clean" URL. For that, I would start by looking for the presence of the question mark character and proceed from there (additional checking will obviously be required). For a "dirty" URL, use PHP's normal GET retrieval ($_GET['variable_name']). For a "clean" one, I would use regular expressions. That would be the most flexible (and efficient) way to parse the URL and extract potential variables.
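
The approach above can be sketched as a small PHP helper. This is a minimal sketch, not a definitive implementation: the function name parse_request_vars is made up for illustration, and it assumes the clean path carries exactly two lowercase segments mapping to action and sub.

```php
<?php
// Minimal sketch: extract variables from either URL style.
// Assumes clean URLs look like /hello/world and map to
// action=hello, sub=world -- adjust the pattern to your scheme.
function parse_request_vars($url)
{
    $parts = parse_url($url);

    // "Dirty" URL: a query string is present, so use it directly.
    if (!empty($parts['query'])) {
        parse_str($parts['query'], $vars);
        return $vars;
    }

    // "Clean" URL: pull the segments out of the path instead.
    if (preg_match('#^/([a-z]+)/([a-z]+)/?$#', $parts['path'], $m)) {
        return array('action' => $m[1], 'sub' => $m[2]);
    }

    return array();
}

// Both styles yield the same variables:
var_dump(parse_request_vars('http://example.com/hello/world'));
var_dump(parse_request_vars('http://example.com/?action=hello&sub=world'));
```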

alex
+3  A: 

If your mod_rewrite has a rule like the following:

RewriteRule ^hello/world /?action=hello&sub=world [NC,L]

or, the more generalised:

# Assuming only lowercase letters in action & sub..
RewriteRule ^([a-z]+)/([a-z]+) /?action=$1&sub=$2 [NC,L]

then the same PHP script is being called, with the $_REQUEST variables available whichever way the user accesses the page (dirty or clean url).

We recently moved a large part of our site to clean urls (still supporting the older, "dirty" urls) and rules like the above meant we didn't have to rewrite any code that relied on $_REQUEST params, only the mod_rewrite rules.

Update

Mod_rewrite is an Apache module, but there are a number of options available for IIS also.

Whichever web server you decide to support, the mod_rewrite approach will likely result in the least amount of work for you. Without it, you'd likely have to create a load of files to mimic the structure of your clean urls, e.g. in your webserver root you'd create a directory hello, placing a file world into it, containing something like the following:

<?php
// Set the $_REQUEST params to mimic the dirty url
$_REQUEST['action'] = 'hello';
$_REQUEST['sub'] = 'world';
// Include the existing file so we don't need to re-do our logic
// (assuming index.php at the web root)
include('../index.php');

As the number of parameters you wish to handle 'cleanly' increases, so will the number of directories and stub files you require, which will greatly increase your maintenance burden.

mod_rewrite is designed for exactly this sort of problem, and is now supported on IIS as well as Apache, so I'd strongly recommend going in that direction!

ConroyP
The only problem here is that I want to support more than just Apache, so using a .htaccess wouldn't work.
Vestonian
+1  A: 

If your application is running on an Apache server, I would recommend using mod_rewrite.

Basically, you code your application to use "dirty" URLs internally. What I mean by this is that you can still use the "clean" URLs in the templates and such, but you use the "dirty" version when parsing the URL. Say your real, "dirty" URL is www.domain.com/index.php?a=1&b=2; inside your code you are still going to use $_GET['a'] and $_GET['b']. Then, with the power of mod_rewrite, you just make URLs like www.domain.com/1/2/ point to the "dirty" URL. (This is just an example of how things can be done.)
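
For the www.domain.com/1/2/ example, a mod_rewrite rule along these lines would map the clean form onto the dirty one. This is a sketch only: it assumes a and b are numeric and that index.php sits at the web root.

```apache
# Map www.domain.com/1/2/ to index.php?a=1&b=2
RewriteEngine On
RewriteRule ^([0-9]+)/([0-9]+)/?$ index.php?a=$1&b=$2 [NC,L]
```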

Nazgulled
A: 

A quick and dirty way might be to simply check for GET variables. If there are any, it's dirty, if not, it's clean. Of course this depends on what exactly you mean by dirty URLs.
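
That check can be sketched in a few lines of PHP, assuming "dirty" simply means "has a query string" and that the clean path segments map to hypothetical action and sub variables:

```php
<?php
// If any GET variables were parsed, the request used a "dirty" URL;
// otherwise fall back to reading variables out of the path.
if (!empty($_GET)) {
    $vars = $_GET; // dirty: variables already available
} else {
    // clean: derive variables from the path, e.g. /hello/world
    $segments = array_values(array_filter(explode('/', $_SERVER['REQUEST_URI'])));
    $vars = array(
        'action' => isset($segments[0]) ? $segments[0] : null,
        'sub'    => isset($segments[1]) ? $segments[1] : null,
    );
}
```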

codelogic