Is there any reason NOT to have a webpage retrieve its main content on the fly?

For example, I have a page that has a header and a footer, and in the middle of this page is an empty div. When you click one of the buttons in the header, an HTTP GET is done behind the scenes and the empty div's .innerHTML is replaced with the result.

I can't think of any reason why this might be a bad idea, but I can't seem to find any pages out there that do it. Please advise!
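The pattern described above could be sketched like this. The URL scheme, the `data-section` attribute, and the `content` div id are all assumptions for illustration; the DOM wiring is shown in comments since it needs a browser:

```javascript
// Maps a header button's section name to the URL fetched behind the scenes.
// The "/content/<name>.html" convention is an assumption.
function contentUrlFor(section) {
  return '/content/' + encodeURIComponent(section) + '.html';
}

// Browser wiring (requires a DOM):
// document.querySelectorAll('header button').forEach(function (btn) {
//   btn.addEventListener('click', function () {
//     fetch(contentUrlFor(btn.dataset.section))
//       .then(function (res) { return res.text(); })
//       .then(function (html) {
//         document.getElementById('content').innerHTML = html;
//       });
//   });
// });
```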

A: 

Where did you read that it is a bad idea? Whether content is populated on the fly depends purely on your requirements. In most cases the content is loaded along with the page rather than on the fly, but if you need on-the-fly content, it isn't a bad idea.

If your content is loaded via JavaScript and JavaScript is disabled in the user's browser, then it is definitely a bad idea.

Sarfraz
+1  A: 

Without extra work on your part it kills the back and forward history buttons, and it makes it difficult to link to the pages each button loads. You'd have to implement some sort of URL changing mechanism, for example by encoding the last clicked page in the URL's hash (e.g. when you click a button you redirect to #page-2 or whatever).
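One possible shape for that hash mechanism, as a sketch. The default page name and the hypothetical `loadContent` function are assumptions; the hash parsing is a plain helper, with the browser wiring in comments:

```javascript
// Extracts the page name from a location hash like "#page-2".
// "page-1" as the default for an empty hash is an assumption.
function sectionFromHash(hash) {
  return hash && hash.length > 1 ? hash.slice(1) : 'page-1';
}

// Browser wiring: clicking a button sets window.location.hash, and the
// hashchange event fires on back/forward, so history keeps working.
// window.addEventListener('hashchange', function () {
//   loadContent(sectionFromHash(window.location.hash)); // loadContent is hypothetical
// });
```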

It also makes your site inaccessible to users with JavaScript disabled. One of the principles of good web design is "graceful degradation"--enhancing your site with advanced features like JavaScript, Flash, or CSS, while still working if they are disabled.

John Kugelman
+1  A: 

Two considerations: Search engine optimization (SEO) and bookmarks.

Is there a direct URL to access your header links? If so, you're (almost) fine. For example, the following code is both SEO friendly and populates your page as you desire:

<a href="seoFriendlyLink.html" onclick="populateOnTheFly(); return false;">Header Link</a>
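A hypothetical body for the populateOnTheFly() in that snippet might look like the following. The fragment URL and the target div id are assumptions, not part of the answer:

```javascript
// Assumed AJAX endpoint serving just the content fragment for this link.
function contentUrl() {
  return 'fragments/seoFriendlyLink.html';
}

// Hypothetical implementation of populateOnTheFly() from the snippet above.
function populateOnTheFly() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', contentUrl());
  xhr.onload = function () {
    document.getElementById('content').innerHTML = xhr.responseText;
  };
  xhr.send();
  // The "return false" in the onclick then cancels normal navigation,
  // so browsers without JavaScript still follow the real href.
}
```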

The catch occurs when people attempt to bookmark the page they've loaded via JavaScript... it won't happen. You can throw most of those potential tweets, email referrals, and front page Digg/Reddit articles out the window. The average user won't know how to link to your content.

Dolph
A: 

I can't think of a reason this would be bad either (other than possibly SEO). One thing that would probably be a good idea is to load the data only once, i.e.

<a href="javascript:showDiv1()">Show Div1</a> - do the AJAX call only if the div's innerHTML is blank

<a href="javascript:showDiv2()">Show Div2</a> - do the AJAX call only if the div's innerHTML is blank

<div id="div1"></div>
<div id="div2"></div>

This should keep the server load down, since each div's content is only loaded once.
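The load-once check described above could be sketched like this. The ids, URL convention, and fetch-based wiring are assumptions; the emptiness check is a plain helper, with the browser wiring in comments:

```javascript
// Returns true if the div hasn't been populated yet, so we only
// hit the server the first time it is shown.
function needsLoad(div) {
  return div.innerHTML.trim() === '';
}

// Browser wiring (requires a DOM):
// function showDiv(id) {
//   var div = document.getElementById(id);
//   if (needsLoad(div)) {
//     fetch('/fragments/' + id + '.html')
//       .then(function (r) { return r.text(); })
//       .then(function (html) { div.innerHTML = html; });
//   }
//   div.style.display = 'block';
// }
```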

Cheers

Chief17
+4  A: 

It's not unheard of, but there are issues.

  • The obvious one is that some users have JavaScript turned off for security reasons, and they will not be able to use your site at all.

  • It can also negatively impact users who rely on assistive technology such as a screen reader.

  • It can make it harder for the browser to effectively cache your static content, slowing down the browsing experience.

  • It can make it harder for search engines to index your content.

  • It can cause the back and forward buttons to stop working unless you take special steps to make them work.

  • It's also fairly annoying to debug problems, although certainly not impossible if you use a tool such as Firebug.

I wouldn't use it for static content (a plain web page) but it's certainly a reasonable approach for content that is dynamically updated anyway.

JacobM
Great - actually this is for a page that will be used exclusively in-house at a tech company, so most of those issues aren't too important to me (javascript on/off, assistive tech, search engine indexing). Thank you for the quick response - I'm gonna give it a try and see how it goes.
jcovert
A: 

This is pretty standard behavior in AJAX-enabled sites.

Keep in mind however that extra effort will be needed to:

  • ensure the back button works
  • link to (and bookmark) specific content
  • support browsers with JavaScript disabled.
Josh Sterling