views: 64

answers: 2

Now here is my situation: I'm making a CMS. When links are clicked, I would like the pages to load dynamically using Ajax. The problem is the links!

The only way to change the address in the address bar in real time is to use anchors (the # fragment of the URL). But the fragment is never sent to the server, so PHP can't see it and I can't load the right page content on initial load using PHP. And if I were to load the pages using a query string, the query string couldn't be updated in the address bar at the click of a link, since that would reload the page.

I suppose JavaScript could check the address, save the anchor in a cookie and reload the page, but I'd rather not have to go to such lengths.
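For what it's worth, a minimal sketch of reading the hash client-side once the page has loaded, with no cookie or reload involved (the /load-content.php endpoint here is made up):

// on load, read the fragment (e.g. "#about") and fetch the matching
// content over Ajax -- no cookie or reload needed
window.onload = function () {
  var page = window.location.hash.substring(1); // strip the "#"
  if (!page) return;

  var xhr = new XMLHttpRequest();
  // "/load-content.php" is a hypothetical endpoint for this sketch
  xhr.open("GET", "/load-content.php?page=" + encodeURIComponent(page), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("content").innerHTML = xhr.responseText;
    }
  };
  xhr.send();
};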

Does anyone know a solution to this problem?

+3  A: 

There was a similar question not so long ago, and I came up with the following solution.

Your URLs should point to real pages so that the site still works for JS-disabled users. Click handlers take care of the Ajax requests. The hash should contain the URL, plus a marker like &ajax to indicate the type of the request.

If the request comes from Ajax, simply send the content. If it doesn't, wrap the content in a header and footer to respond with a full page.

The Ajax-generated hash URLs remain linkable themselves, so they can be bookmarked and shared. The whole idea basically mimics the kind of behavior you can see on Facebook.

Javascript

// click handler for ajax links; accepts either an anchor element
// or a plain url string
function goToWithAjax(hash) {
  // getAttribute("href", 2) is an old-IE trick that returns the literal
  // attribute value rather than the resolved absolute url
  hash = hash.href ? hash.getAttribute("href", 2) : hash;
  ajax(hash, function (response) {
    document.getElementById("content").innerHTML = response;
  });
  // show the loaded url in the address bar as a "#!/..." hash
  hash = ("#!/" + hash).replace("//", "/");
  window.location.hash = hash;
  // returning false keeps the browser from following the link
  return false;
}
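The ajax() helper isn't defined in the answer; a minimal sketch of what it might look like, assuming a plain XMLHttpRequest GET that appends the &ajax marker the server-side code below checks for:

// assumed ajax() helper, not part of the original answer:
// GET the url with the "&ajax" marker appended and pass the
// response body to the callback
function ajax(url, callback) {
  var xhr = new XMLHttpRequest();
  // the server only does strstr($url, "&ajax"), so a simple
  // substring append is enough to flag the request
  xhr.open("GET", url + "&ajax", true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseText);
    }
  };
  xhr.send();
}

A link would then look something like <a href="/about" onclick="return goToWithAjax(this)">About</a>, so JS-disabled visitors simply follow the real URL.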

.htaccess

# in .htaccess (mod_php) the ini settings need the php_value prefix
php_value auto_prepend_file "prepend.php"
php_value auto_append_file  "append.php"

prepend

$url   = $_SERVER['REQUEST_URI'];
$parts = explode('#!', $url);
$hash  = isset($parts[1]) ? $parts[1] : false;

// redirect if there is a hash part in the request uri
// (note: browsers strip the #fragment before sending a request, so this
// only fires when "#!" has ended up in the path itself)
if ($hash) {
  header("Location: $hash");
  exit; // stop processing once the redirect header is sent
}

// find out if it's an ajax request (the client appends "&ajax")
$ajax = strstr($url, "&ajax");

// we need the header if it's not ajax
if (!$ajax) {
  get_header();
}

append

// $ajax was set in prepend.php; we need the footer if it's not ajax
if (!$ajax) {
  get_footer();
}

get_header()

// opens the page: everything up to and including the #content div
function get_header() {
  echo <<<END
<html>
<head></head>
<body>
<div id="page">
  <div id="header">
    <div id="logo"></div>
    <ul id="nav">menu...</ul>
  </div>
  <div id="content">
END;
}

get_footer()

// closes the #content div and the rest of the page
function get_footer() {
  echo <<<END
  </div> <!-- end of #content -->
  <div id="footer">(c) me</div>
</div> <!-- end of #page -->
</body>
</html>
END;
}
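One thing the snippets above leave out is history navigation: pressing back/forward only changes the hash, so no new request fires on its own. A rough sketch of handling that, assuming the ajax() helper and the #!/ hash format used above:

// re-fetch content when only the hash changes (back/forward buttons),
// and once on load for visitors arriving via a "#!/..." link
function loadFromHash() {
  var hash = window.location.hash;
  if (hash.indexOf("#!") === 0) {
    ajax(hash.substring(2), function (response) {
      document.getElementById("content").innerHTML = response;
    });
  }
}
window.onhashchange = loadFromHash;
window.onload = loadFromHash;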
galambalazs
I second this solution; it's a pretty common implementation, the kind you see at Facebook, etc. Note: if you want to expose the content to spiders, make sure you've got a sitemap available somewhere.
infamouse
The links are all links to full pages, so spiders get the content just as they would on a Web 1.0 page. But yeah, sitemaps are needed anyway. :)
galambalazs
The prepend/append approach is not very powerful; it would be difficult to build complex sites with it. It's better to have a full version of the page and several callable partials for its different parts, using an MVC approach.
Sebastián Grignoli
A: 

I can see why you might want to load parts of the page with Ajax, but loading a whole page that way is rather pointless.

A jQuery solution might be something like:

// handle clicks on links marked with the "ajax_link" class
$('a.ajax_link').click(function () {
  var url = $(this).attr('href');
  $.ajax({
    url: url,
    success: function (data) {
      $('body').html(data);
    }
  });
  // cancel the normal navigation; without js the links still work
  return false;
});

I have in no way tested that, but since the hrefs are real URLs it should still work without JavaScript enabled.

Keyo
That had absolutely nothing to do with my problem whatsoever. Sorry.
Codemonkey