tags:

views: 31

answers: 2

I've read some related articles (like http://stackoverflow.com/questions/2907109/making-javascript-generated-content-possible-for-search-engines-to-index), but what I'd like to know is: is there a simpler option to embed content from another site, without the use of iframes?

What I'd like to achieve in the end is to create some sort of repository for content and serve that to different sites/clients.

For instance (and this is pseudo-coded):

<dl><dt>Date of birth</dt><dd><span src="http://myserver.com/get.aspx?value=dob&amp;userid=102" /></dd></dl>

where the span src is of course not valid or working, but I'd like something similar. First and foremost, it should be "codable" for non-technical users, and second, it should be indexable by search spiders.

Now the question: is there something for this?

EDIT: The sites that need to "receive" this data I keep aren't mine. Like I've said in a comment, Facebook is the worst example I could choose, but the principle remains: I'd like to create one source of information, kept on my server, and let other parties feed from this content, so generic information only needs to be updated once.

A: 

Now the question: is there something for this?

Only using a server-side language like PHP, or using Server Side Includes.
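
For example, a minimal Server Side Includes sketch (assuming Apache with mod_include enabled; the path here is made up, and it would have to point at a local file, script, or proxied resource on the including server, since plain SSI can't fetch from a remote host directly):

<dl>
  <dt>Date of birth</dt>
  <dd><!--#include virtual="/userdata/dob-102.html" --></dd>
</dl>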

The downside to these methods is that the rendering of your page becomes dependent on the remote page's availability and rendering speed. If the remote page goes down, so does yours.

Therefore, some kind of caching should be used when including 3rd-party content on the server side... and that gets complicated as well, so it doesn't match the simpler solution you are looking for.
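
To make that concrete, here is a rough PHP sketch of the kind of server-side include plus cache I mean; the function name, cache location and lifetime are all made up for illustration, and it relies on allow_url_fopen being enabled:

<?php
// Fetch remote content, with a simple file cache as a fallback
// (illustrative sketch; the cache path and lifetime are hypothetical).
function fetch_cached($url, $cacheFile, $maxAge = 300) {
    // Serve from the cache if it is still fresh.
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return file_get_contents($cacheFile);
    }
    // Otherwise ask the remote server.
    $content = @file_get_contents($url);
    if ($content !== false) {
        file_put_contents($cacheFile, $content); // refresh the cache
        return $content;
    }
    // Remote server is down: fall back to a stale copy if we have one.
    return file_exists($cacheFile) ? file_get_contents($cacheFile) : '';
}
?>
<dl>
  <dt>Date of birth</dt>
  <dd><?php echo fetch_cached(
      'http://myserver.com/get.aspx?value=dob&userid=102',
      '/tmp/dob-102.cache'); ?></dd>
</dl>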

I know iframes have their disadvantages, but if you can live with them, they are still the simplest way of doing this.

Pekka
Definitely the easy route, w.r.t. getting the content indexed. Although if the documents are in XML, a bit of XSLT (one line at the top of the XML source) and a custom element processed by the XSLT to do an XInclude might work as well, when combined with a robots.txt file.
It's just that... the sites aren't mine. I'd like to create one source for others to feed on. For instance: a personal profile on Facebook... read the data from my site, just to get one repository for all "same-like" data. (Facebook being the worst example I can think of, btw.)
riffnl
A: 

um... How simple are you looking for? I mean... In your example, if you change "SPAN" to "IFRAME" you'll have working code.
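
In other words, something along these lines, using the URL from the question:

<dl>
  <dt>Date of birth</dt>
  <dd><iframe src="http://myserver.com/get.aspx?value=dob&amp;userid=102"></iframe></dd>
</dl>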

I think the real problem is how to get it indexed by search spiders, but that request basically translates to "How can my site get credit for other people's work --- with absolutely no effort on my part...."

James Curran
iframes have one big problem: they don't resize dynamically. Also, there *are* good arguments for embedding remote content physically into your page, e.g. the ability to use CSS for styling, or JavaScript for interaction.
Pekka
Indeed, the same problem as Pekka said: no dynamic resizing and all. And besides that, "How can my site get credit for other people's work --- with absolutely no effort on my part...." is just the other way around. I want to create one site to support multiple other sites (like Facebook, for instance), where you only need to edit one site (mine) and spread the content to the others.
riffnl