views:

94

answers:

2

Hi, I'm wondering about one thing. I'm starting to code a webpage where I basically want the same design throughout, with only the content changing. So I don't want to load entire new pages where basically the same markup gets loaded over and over again; I only want to load the content.

Now, I'm presented with two possible solutions. One is some sort of jQuery content slider, where the entire content is huge but I show only the parts I want, and clicking links changes the position of the content within the div.

The other solution is a separate file with a huge number of divs, where clicking a link empties the target div and loads the content of a chosen div from that other file.

Which solution would be best, both in general and programming-wise? I'm expecting quite a lot of PHP programming here, and I want to reduce the load as well, especially since the initial page load contains some seriously heavy images.

+1  A: 

Keep in mind that using partial requests you will lose browser navigation; your app will look just like Flash in terms of usability, and whatever gimmick you use to fix that will still be a hack.

You mention a content slider; if you want the slide effect itself, it will be hard to move away from partial requests or preloaded content.

If you don't actually need to 'slide' the content, have your pages cache all external resources in the browser (all CSS, all JavaScript, and all images loaded externally), so that each page itself contains only the (hopefully as minimal as possible in terms of tags) HTML content. With everything in the cache, no extra bandwidth is used beyond the content itself, very similar to what you would get with an AJAX request.
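As an illustration of that approach (the file names and paths here are hypothetical, not from the answer), each full page carries only a small amount of HTML, while every heavy asset is an external, cacheable resource fetched once on the first visit:

```html
<!-- subpage.html: a hypothetical content page kept as small as possible -->
<!DOCTYPE html>
<html>
<head>
  <title>Some subpage</title>
  <!-- external, cacheable resources: downloaded once, then served from cache -->
  <link rel="stylesheet" href="/css/site.css">
  <script src="/js/site.js"></script>
</head>
<body>
  <div id="content">
    <!-- only this part differs from page to page -->
    <p>The actual content for this subpage.</p>
  </div>
</body>
</html>
```

With proper cache headers on `/css/site.css`, `/js/site.js`, and the images, a full page navigation transfers little more than the content div itself.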

Sometimes technologies are misused for the sole reason that we are not using the current (and very simple) ones properly, and we come up with complicated solutions when everything we need is already at hand.

F.Aquino
"Keep in mind that using partial requests you will lose browser navigation; your app will look just like Flash in terms of usability, and whatever gimmick you use to fix that will still be a hack."... Wow, this is completely untrue and off the mark. `#tab1`, `#tab2` for navigation comes to mind. If he has 20 pages of content and the user wants 3, your solution to **save bandwidth** is to load it all? If you don't know how to use the technologies, that's perfectly okay, but saying something can't be done when it can, with thousands of examples out there, is terrible advice.
Nick Craver
Maybe he means that the browser back/forward buttons won't work.
Pointy
@Pointy - They will if you're using hash navigation, just `<a href="#tab1">Tab1</a>`, `<a href="#tab2">Tab2</a>`, go back/forward.
Nick Craver
I'm thinking of a solution like Facebook, actually :p Where the content is loaded dynamically but the back/forward buttons still work, probably because of the URL. It should also be possible to link to specific parts of the page using URLs. I think the second solution is the best, but it's really the overall framework I'm thinking about the most. What would the file structure of the site look like? Say I've got a control panel for users. Would that be a separate PHP file?
Kenny Bones
@Nick Craver hash navigation for partial requests is a hack; if you can't simplify your markup so that a full page's content is as small as a partial request (this scenario), the problem is deeper than the buzzword technology you are trying to apply.
F.Aquino
@F.Aquino - Really? Hashes are a hack? They're supposed to **go to a portion of a page**. Maybe you should read the spec again; they're in there: http://www.w3.org/TR/1999/REC-html401-19991224/intro/intro.html#fragment-uri It's not a buzzword; it's 2010 and practically standard now. But we could go your route and use it for computers too: we'll install every piece of software the user might ever need. Sure, it'll be terabytes of data, but then they won't have to get only what they need... see how much sense that makes?
Nick Craver
I agree with Nick. Just because things have been done a certain way until now does not make them right, or everything else a hack for that matter. The actual hack is when all the user needs from the server is a single number, and we send them a full page because that's the way we are used to doing things. Users took it upon themselves to fix this issue (by continuously polling the window hash), because there was no other way to detect URL state changes within the same page. Spec authors followed suit - http://www.whatwg.org/specs/web-apps/current-work/#event-hashchange
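A minimal sketch of the hash navigation being discussed (the names `pageFromHash` and `showPage` are hypothetical, not from the thread): the mapping from hash fragment to page name is a pure function, and the `hashchange` listener only runs where a `window` exists.

```javascript
// Map a location hash like "#tab2" to a page name, with a fallback.
// Pure function, so it is easy to test outside a browser.
function pageFromHash(hash, defaultPage) {
  var name = (hash || '').replace(/^#/, '');
  return name !== '' ? name : defaultPage;
}

// Hypothetical loader stub: fetch and display the content for `page`,
// e.g. via $('#content').load('/content/' + page + '.html') in jQuery.
function showPage(page) {
  console.log('showing page: ' + page);
}

// In a browser, back/forward fire hashchange (standardized in HTML5;
// before that, scripts polled location.hash on a timer instead).
if (typeof window !== 'undefined' && 'addEventListener' in window) {
  window.addEventListener('hashchange', function () {
    showPage(pageFromHash(window.location.hash, 'home'));
  });
}
```

Because the URL changes on every navigation, back/forward and bookmarking work without any full page reloads.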
Anurag
Hashes were called a hack when you wire them up with JavaScript to produce partial requests, just like Flash does (or tried to do) to maintain navigation in the browser. I take it you are upset because you misunderstood that; linking the specification only proves it: an anchor is for accessing a portion of the page. Once again, my solution is not to load all the content for all the 'subpages' at once; it's to have each piece of content on its own standalone page, and to make use of the browser cache for the external resources, if that was not clear enough. TL;DR: one page per piece of content, as simple as that.
F.Aquino
What is a page? What constitutes content? Is it atomic? Is the number that showed live donations on Wikipedia a piece of content? Does it need to be on its own page? Do users always have to go to that page to see that number? Can't it be part of some other page? If so, should we periodically do a full refresh of the page to update that number? I think you're still stuck on AJAX being a buzzword, and are unable to think beyond the page mentality we started with in the '90s.
Anurag
@Anurag: sigh, go insult someone else. I never said AJAX is a buzzword, and never said I don't use it (I'm using it right now); just because you CAN use it doesn't mean it fits every scenario. Have you even read the OP? Have you read the OP? Did you read? See, I can spam questions too.
F.Aquino
@F.Aquino I've read the question, and I think the OP made his intentions clear in the very first sentence. It's unwise to assume that we know what's best for other people and they don't. I wasn't trying to spam you with questions; they all logically lead up to the point I wanted to make - partial requests (and hash-based navigation) are not a hack. Think from the end user's perspective, not the pureness of the backend code: Google Maps vs. the old MapQuest. Now, wouldn't it be nice if things like bookmarking also worked with that fancy Ajaxy interface? (A by-product of handling hashes well.)
Anurag
I'm the original poster :) And I understand how this might be a difficult topic, especially a debate-worthy one! The best solution is obviously based on what you really need. I agree with both of you: just because you CAN do something fancy doesn't mean it's the best solution. But in this case, I'm doing a site where users log in and see their own content, as well as others'. And I'm looking for the smartest framework to program it in. Should I use several PHP files that are themselves dynamic? Stuff like that. I'm thinking '90s now, I guess; I'm not up to date on 2010 AJAX and server-side content.
Kenny Bones
+2  A: 

If there is too much content, pre-loading it all will be really slow for end users. I'd suggest you go with the second approach of dynamically requesting content when a link is clicked.

You can easily simplify the design by hijacking all links and making an ajax call to fetch the content and inject it into the relevant container. Assuming each link fetches content from a different URL, and that content gets inserted into a different container depending on which link was clicked, you'd just have to assign each such AJAX'd link a few attributes:

Define an attribute data-remote on each link that could load content with AJAX.

Define another attribute data-container that specifies the container's id where the result is to be inserted in.

<!-- this is the ajaxy link -->
<a id="test" data-remote="true" data-container="container-id" href="..">Load</a>

Finally, apply a click handler to all links that have the data-remote attribute set:

$('a[data-remote]').click(function() {
    // `this` is the raw DOM element; wrap it in $() to use jQuery methods
    var containerId = $(this).attr('data-container');
    var url = $(this).attr('href');

    // fetch the page at `url` and inject its HTML into the container
    $('#' + containerId).load(url);

    return false; // stop the browser from navigating to the clicked link
});
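To get back/forward and bookmarking on top of this (the Facebook-style behavior mentioned in the comments), one option is to have clicks only change the hash, and do the actual AJAX load from a `hashchange` handler. A hedged sketch, assuming a hypothetical `data-page` attribute on each link and a `/content/<name>.html` URL scheme, neither of which is in the original answer:

```javascript
// Translate a hash fragment into the URL of the content to fetch.
// The "/content/<name>.html" scheme is an assumption for illustration.
function contentUrlFromHash(hash, defaultName) {
  var name = (hash || '').replace(/^#/, '') || defaultName;
  return '/content/' + name + '.html';
}

// Browser-only wiring: a click just updates the hash; the hashchange
// handler performs the load, so back/forward and bookmarks both work.
if (typeof window !== 'undefined' && typeof jQuery !== 'undefined') {
  jQuery(function ($) {
    $('a[data-remote]').click(function () {
      window.location.hash = $(this).attr('data-page');
      return false;
    });

    $(window).bind('hashchange', function () {
      $('#content').load(contentUrlFromHash(window.location.hash, 'home'));
    }).trigger('hashchange'); // also honor a hash present on first load
  });
}
```

Triggering the handler once on page load means a bookmarked URL like `/#about` restores the right content immediately.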
Anurag
This is really good advice, because we need a system based on URLs, and that should also dictate how the content is loaded. The entire framework actually needs to be based on this solution. Again, much like Facebook :) Which, by the way, is awesomely coded!
Kenny Bones
Facebook has some amazing engineers under their belt. If you're building something like a single-page rich internet application, you may also want to check out some of the practices Google used in Google Web Toolkit (GWT). Wave was written in GWT.
Anurag