I haven't seen a library for that, but it looks like a fairly simple thing. I've jotted down a quick function that can help you out. I've kept it simple; you might want to use cURL to fetch the content, add some error handling, etc.
Anyway, here are my two cents:
<?php
function getLinkInfo($url)
{
    // Get target link html
    $html = file_get_contents($url);
    // Prepare the DOM document; preserveWhiteSpace must be set
    // before loadHTML to have any effect, and @ suppresses the
    // warnings real-world (malformed) HTML tends to trigger
    $dom = new DOMDocument();
    $dom->preserveWhiteSpace = false;
    @$dom->loadHTML($html);
    // Get page title (first <title> element, if any)
    $linkTitle = '';
    $titles = $dom->getElementsByTagName('title');
    if ($titles->length > 0) {
        $linkTitle = $titles->item(0)->nodeValue;
    }
    // Get META tags; we only need the description
    $linkDesc = '';
    $metas = $dom->getElementsByTagName('meta');
    foreach ($metas as $meta) {
        if ($meta->getAttribute("name") == "description") {
            $linkDesc = $meta->getAttribute("content");
            break;
        }
    }
    // Get the first image, if any
    $firstImage = '';
    $imgs = $dom->getElementsByTagName('img');
    foreach ($imgs as $img) {
        $firstImage = $img->getAttribute("src");
        // Prefix relative src values with the page URL
        // (note: strpos takes the haystack first, then the needle)
        if (strpos($firstImage, "http://") !== 0) {
            $firstImage = $url . $firstImage;
        }
        break;
    }
    $output = <<<HTML
    <div class="info">
        <div class="image"><img src="{$firstImage}" alt="{$linkTitle}" /></div>
        <div class="desc">
            <div class="title">{$linkTitle}</div>
            <div class="subtitle">{$url}</div>
            <div class="summary">{$linkDesc}</div>
        </div>
    </div>
HTML;
    return $output;
}
echo getLinkInfo("http://www.phpfour.com/");
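As mentioned at the top, `file_get_contents` can be swapped for cURL to get timeouts, redirect handling, and proper error reporting. A minimal sketch (the `fetchHtml` name is just an example, not part of the function above):

```php
<?php
// Hypothetical helper: fetch a page with cURL and basic error handling.
function fetchHtml($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever on a slow host
    $html = curl_exec($ch);
    if ($html === false) {
        $error = curl_error($ch);
        curl_close($ch);
        throw new Exception("Could not fetch {$url}: {$error}");
    }
    curl_close($ch);
    return $html;
}
```

You could then replace the `file_get_contents($url)` line in `getLinkInfo()` with `fetchHtml($url)` and wrap the call in a try/catch.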
phpfour
2009-12-08 20:02:10