views: 31

answers: 2

How can I get a list of all of the URLs in a website using JavaScript?

I'm curious as to how DownThemAll does this. Do they use JavaScript?

+3  A: 

Links: document.links (href)

Images: document.images (src)

Newer method: document.getElementsByTagName('img')
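For the link URLs themselves, a minimal sketch (the function name collectUrls is made up here; in a browser you would pass document.links, which holds every <a> and <area> element that has an href):

```javascript
// Collect the href of every link-like object in an array-like collection.
// The function itself is DOM-free so it works on any array of objects
// with an `href` property; in a browser, pass document.links.
function collectUrls(links) {
  var urls = [];
  for (var i = 0, l = links.length; i < l; i++) {
    urls.push(links[i].href);
  }
  return urls;
}

// Browser usage:
// var pageUrls = collectUrls(document.links);
```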

Bookmarklet:

javascript:var x=function(){var imgs=document.getElementsByTagName('img');var t="";for(var i=0,n=imgs.length;i<n;i++)t+='<br><a href="'+imgs[i].src+'"><img src="'+imgs[i].src+'" width="100"></a>';var w=window.open('','_blank');w.document.write(t);w.document.close();};x();

Now you can save the new page, and you will get a directory full of the images.

mplungjan
So document.getElementsByTagName('img') returns an array?
Serg
Ah now with your edit I see. Thanks for the help. :)
Serg
A: 
var list=[];
var a=document.getElementsByTagName('img');
for (var i=0,l=a.length;i<l;i++)
{
    if (/\.(jpg|gif|png|jpeg)$/im.test(a[i].getAttribute('src')))
    {
        list.push(a[i].getAttribute('src'));
    }
}

This code generates a list of the picture URLs used in <img> tags.
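The extension filter above can be factored into a small reusable function. A sketch (the name filterImageSrcs is made up here; it accepts any array-like of objects with a src property, so in a browser you could pass document.getElementsByTagName('img') directly):

```javascript
// Return only the src values that end in a common image extension.
// The case-insensitive regex skips sources such as PHP-generated
// images whose URLs do not end in an image extension.
function filterImageSrcs(imgs) {
  var list = [];
  for (var i = 0, l = imgs.length; i < l; i++) {
    if (/\.(jpg|gif|png|jpeg)$/i.test(imgs[i].src)) {
      list.push(imgs[i].src);
    }
  }
  return list;
}

// Browser usage:
// var pictures = filterImageSrcs(document.getElementsByTagName('img'));
```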

Bick
Why bother filtering if you only get img tags?
mplungjan
To skip links to PHP-generated images.
Bick
And those would not download? If their MIME type is image/something, they are also images... Perhaps it would be more interesting to filter on size
mplungjan
Usually you don't want to save such pictures (do you really want things like captchas?). In addition, the filter is easy to customize, for example to parse images from other tags or sources.
Bick
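Filtering on size, as suggested in the comments, could look like this sketch (the function name filterBySize and the thresholds are made up; in a browser you would pass document.images and read each image's naturalWidth/naturalHeight, which are only available once the image has loaded):

```javascript
// Keep only images whose natural dimensions meet a minimum size,
// which skips tracking pixels, icons, and most captcha-sized graphics.
function filterBySize(imgs, minWidth, minHeight) {
  var list = [];
  for (var i = 0, l = imgs.length; i < l; i++) {
    var img = imgs[i];
    if (img.naturalWidth >= minWidth && img.naturalHeight >= minHeight) {
      list.push(img.src);
    }
  }
  return list;
}

// Browser usage (hypothetical thresholds):
// var big = filterBySize(document.images, 200, 200);
```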