I want to get individual posts from a Blogger blog, turn them into individual classes, and add their content to my website. I need to do this because the hardware I'm hosting my website on has very little processing power (a Pentium 3) and very little RAM (512 MB); if I were to just put a WordPress blog on it, response times would be extremely slow, even behind a reverse proxy such as lighttpd or nginx.

So far I know that I need to call jQuery.ajax() and point it at the Atom feed of the Blogger blog, but I'm pretty lost after that. Once I have the XML data, how do I split it into individual blog posts/classes, and possibly load images that are included in those posts?

A: 

Here is an example of how to process an Atom feed. In this example I am fetching a local XML feed file; in the real world you will need a simple proxy script to fetch it for you, as you cannot make cross-domain XML requests. In a nutshell, to process any XML with jQuery you just loop through a collection of nodes by their "tag" names and grab their content, which you can then repurpose as you see fit.

In this case I am processing a feed that contains title and content tags; for summary feeds you might also need to process the summary tag.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
    <head>
        <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js" type="text/javascript">
        </script>
        <script>
            //This example shows getting a local ATOM file. I am assuming that you will be using a proxy to fetch the feed as you 
            //are getting it from a remote source

            //get the feed
            $.get("feed.xml", function(data){

                //if the XML loaded successfully, find all blog entries
                var html = "";
                $(data).find("entry").each(function(){

                    //get the text of the title and the content
                    var title = $(this).find("title").text();
                    var content = $(this).find("content").text();

                    //build your own html
                    html += "<h1>" + title + "</h1>";
                    html += "<div class='blogEntry'>" + content + "</div>";

                });
                //append the html to the container of your choice
                $(".blogClone").append(html);
            });

        </script>
        <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
        <title>Untitled Document</title>
    </head>
    <body>
        <div class="blogClone">
        </div>
    </body>
</html>
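On the second part of the question (images): Blogger delivers each post's body as escaped HTML inside the content element, so after $(this).find("content").text() you have an HTML string. A minimal sketch for pulling image URLs out of that string (the helper name is mine, not part of any API):

```javascript
// Extract image URLs from a blog entry's HTML content string.
// contentHtml is what $(this).find("content").text() returns.
function extractImageUrls(contentHtml) {
    var urls = [];
    var re = /<img[^>]*\ssrc=["']([^"']+)["']/gi;
    var match;
    while ((match = re.exec(contentHtml)) !== null) {
        urls.push(match[1]);
    }
    return urls;
}
```

Note that since the content is appended to the page as HTML anyway, the browser will load the images on its own; a helper like this is only useful if you want to lazy-load, resize, or cache them separately.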

If you are using PHP on your server, here is a simple proxy script that will do the job:

<?php
// PHP Proxy
// Responds to both HTTP GET and POST requests
//
// Author: Abdul Qabiz
// March 31st, 2006
//

// Get the URL to be proxied.
// Is it a POST or a GET?
$url = isset($_POST['url']) ? $_POST['url'] : $_GET['url'];
$headers = isset($_POST['headers']) ? $_POST['headers'] : (isset($_GET['headers']) ? $_GET['headers'] : "");
$mimeType = isset($_POST['mimeType']) ? $_POST['mimeType'] : (isset($_GET['mimeType']) ? $_GET['mimeType'] : "");

//Start the Curl session
$session = curl_init($url);

// If it's a POST, put the POST data in the body
if (isset($_POST['url'])) {
    $postvars = '';
    while ($element = current($_POST)) {
        $postvars .= key($_POST).'='.$element.'&';
        next($_POST);
    }

    curl_setopt($session, CURLOPT_POST, true);
    curl_setopt($session, CURLOPT_POSTFIELDS, $postvars);
}

// Don't return HTTP headers. Do return the contents of the call
curl_setopt($session, CURLOPT_HEADER, ($headers == "true") ? true : false);

curl_setopt($session, CURLOPT_FOLLOWLOCATION, true);
//curl_setopt($session, CURLOPT_TIMEOUT, 4);
curl_setopt($session, CURLOPT_RETURNTRANSFER, true);

// Make the call
$response = curl_exec($session);

if ($mimeType != "") {
    // The web service returns XML. Set the Content-Type appropriately
    header("Content-Type: ".$mimeType);
}

echo $response;

curl_close($session);

?>
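To wire the two pieces together, the front-end request just has to go through the proxy instead of straight to Blogger. A sketch of building that request URL (proxy.php and the blogspot address are placeholders for your own names):

```javascript
// Build the URL for fetching a remote feed through the PHP proxy above.
// The url and mimeType query parameters are the ones the proxy reads.
function proxyUrl(feedUrl) {
    return "proxy.php?mimeType=text/xml&url=" + encodeURIComponent(feedUrl);
}

// Then, in the page above, replace $.get("feed.xml", ...) with:
// $.get(proxyUrl("http://YOURBLOG.blogspot.com/feeds/posts/default"), ...)
```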
Michal

Comments:

idealmachine: If I were to use the proxy you describe, I would add a URL check to prevent attackers from using your server to attack other websites, avoiding trouble with your ISP.

Michal: That would be wise.

ertemplin: It seems like running a PHP proxy script is the only way to get the Atom feed from a different domain. Thanks, and I'll look into this too then.
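The URL check idealmachine suggests can be as simple as an allow-list of hostnames. A sketch of the check itself (shown in JavaScript for illustration; in the PHP proxy you would do the equivalent with parse_url() before calling curl_init(), and the host list below is an example, not a real site):

```javascript
// Only let the proxy fetch from hosts you trust, so it can't be
// used as an open relay against other sites.
var ALLOWED_HOSTS = ["myblog.blogspot.com"]; // put your own feed's host here

function isAllowedUrl(url) {
    // Pull the hostname out of an absolute http(s) URL.
    var m = /^https?:\/\/([^\/:?#]+)/i.exec(url);
    return m !== null && ALLOWED_HOSTS.indexOf(m[1].toLowerCase()) !== -1;
}
```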