I have the URL of a remote page on a different domain which I have to download, parse, and then use to update the DOM of the current page. I've found examples of doing this using new ActiveXObject("Msxml2.XMLHTTP"), but that's limited to IE, I guess, and using new java.net.URL, but I don't want to use Java. Are there any alternatives?
The XMLHttpRequest object is common to most modern browsers and is what powers AJAX web applications.
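A minimal sketch of creating one in a cross-browser way (the createXHR helper name is just for illustration):

function createXHR() {
    // use the native object where available,
    // fall back to ActiveX on older versions of IE
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    }
    return new ActiveXObject("Msxml2.XMLHTTP");
}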
The same-origin policy is going to get you.
1) Proxy through your server: browser -> your server -> their server -> your server -> browser.
2) Use Flash or Silverlight. The third party has to give you access. The bridge between JavaScript and Flash isn't great for large amounts of data, and there are bugs. Silverlight isn't ubiquitous like Flash...
3) Use a <script> tag. This really isn't safe... it only works if the third-party content is valid JavaScript (see the sketch after this list).
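A rough sketch of option 3; the data.js endpoint, the callback query parameter, and the target element id are made up, and it only works if the remote server cooperates by returning JavaScript that calls your function:

// hypothetical callback the remote script is expected to call
function handleRemoteData(data) {
    document.getElementById("target").innerHTML = data.html;
}

// inject a <script> tag pointing at the remote domain;
// script tags are not restricted by the same-origin policy
var s = document.createElement("script");
s.src = "http://www.domaintoretrieve.com/data.js?callback=handleRemoteData";
document.getElementsByTagName("head")[0].appendChild(s);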
What about loading a PHP script via AJAX which does file_get_contents()? That should work for a different domain, if I understand correctly.
Writing a server-side script that will retrieve the page's content for you is the way to go. You can use the XMLHttpRequest object to make an AJAX call to that script, which will simply pass all the HTML through to you.
Still, I advise against it. I don't know exactly how much you trust the other site, but the same-origin policy exists for a reason. What is it exactly you are trying to do? Usually, there is a workaround.
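For example, a sketch of the client side, assuming a hypothetical proxy.php on your own server that fetches the remote page and echoes its HTML back (the endpoint and the target element id are placeholders):

var xhr = window.XMLHttpRequest
    ? new XMLHttpRequest()                 // modern browsers
    : new ActiveXObject("Msxml2.XMLHTTP"); // older IE
// the request goes to your own domain, so the same-origin policy is satisfied
xhr.open("GET", "/proxy.php?url=" + encodeURIComponent("http://www.domaintoretrieve.com"), true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // responseText is the remote page's HTML, ready to parse or inject
        document.getElementById("target").innerHTML = xhr.responseText;
    }
};
xhr.send(null);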
I don't think you can do this, given the constraints of the same-origin policy. You can communicate between two domains using iframes and JS code, but both domains need to have cooperating code in them: a child frame can contact its parent or grandparent frame (window), but that doesn't apply here, since you are referring to some other URL altogether.
The only way is to use your server-side code to access the content on the other domain.
Just use PHP:
<?php
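// Including a remote URL and buffering its output captures the page's HTML.
// Note: include of a remote URL only works when allow_url_include is enabled in php.ini.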
$url = "http://www.domaintoretrieve.com";
ob_start();
include_once( $url );
$html = ob_get_contents();
ob_end_clean();
?>
$html contains the entire page to manipulate as needed.