On our site www.foo.com we want to download and use http://feeds.foo.com/feed.xml with JavaScript. We'll obviously use Access-Control where available, but for browsers that don't support it we are considering the following as a fallback:

On www.foo.com, we set document.domain, provide a callback function, and load the feed into a (hidden) iframe:
document.domain = 'foo.com';

function receive_data(data) {
    // process data
}

var proxy = document.createElement('iframe');
proxy.style.display = 'none'; // keep the proxy frame hidden
proxy.src = 'http://feeds.foo.com/feed.xml';
document.body.appendChild(proxy);
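Whether the fallback is needed at all can be feature-detected at runtime before creating the iframe. A minimal sketch, assuming Access-Control support is indicated by `withCredentials` on an XMLHttpRequest instance (the `supportsCors` helper name is ours, not part of the code above):

```javascript
// Returns true when the given XHR-like object exposes `withCredentials`,
// which signals native cross-origin request (Access-Control) support.
// IE8/9 instead expose a separate XDomainRequest constructor.
function supportsCors(xhr) {
  return !!xhr && 'withCredentials' in xhr;
}
```

In the browser this would be called as `supportsCors(new XMLHttpRequest())`, falling back to the iframe proxy only when it returns false.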
On feeds.foo.com, add an XSL stylesheet to feed.xml (e.g. via an <?xml-stylesheet type="text/xsl" href="feed.xsl"?> processing instruction) and use it to transform the feed into an HTML document that also sets document.domain and calls the callback function in its parent with the feed data as JSON:
<?xml version="1.0"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:template match="ROOT">
    <html><body>
      <script type="text/javascript">
        document.domain = 'foo.com';
        parent.receive_data([<xsl:apply-templates/>]);
      </script>
    </body></html>
  </xsl:template>
  <!-- templates that transform data into json objects go here -->
</xsl:stylesheet>
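One detail the JSON-producing templates have to get right is string escaping: a quote, backslash, or newline in the feed text would otherwise break the generated script. As a reference for the output the XSL must produce, here is the equivalent escaping in JavaScript (the `jsonEscape` helper is hypothetical, not part of the stylesheet above):

```javascript
// Hypothetical reference for the escaping the XSL templates must perform
// when turning feed text into JSON string literals.
function jsonEscape(s) {
  return s.replace(/\\/g, '\\\\')  // backslashes first, to avoid double-escaping
          .replace(/"/g, '\\"')
          .replace(/\n/g, '\\n')
          .replace(/\r/g, '\\r')
          .replace(/\t/g, '\\t');
}
```

In the browser, JSON.stringify would handle this, but here the escaping has to happen inside the XSL transform, where no such helper exists.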
Is there a better way to load XML from feeds.foo.com, and what are the ramifications of this iframe-proxy/XSLT/JSONP trick? (And in what cases will it fail?)
Remarks
- This does not work in Safari & Chrome, but since both support Access-Control that's fine.
- We want little or no change to feeds.foo.com.
- We are aware of (but not interested in) server-side proxy solutions.
- Update: wrote about it