tags:
views: 71
answers: 2
Does anyone know a way to get all the URLs in a website using JavaScript? I only need the links starting with the same domain name; no need to consider other links.

A: 

Using jQuery you can find all the links on the page that match specific criteria. The attribute-starts-with selector (`[href^=...]`) matches links whose href begins with the given domain:

$("a[href=^domain.com]").each(function(){
      alert($(this).attr("href"));
});
Muhammad Adeel Zahid
+6  A: 

Well this will get all the same-host links on the page:

var urls = [];
// Walk document.links backwards; keep only links pointing at the same host
for (var i = document.links.length; i-- > 0;)
    if (document.links[i].hostname === location.hostname)
        urls.push(document.links[i].href);

If by "site" you mean you want to recursively get the links inside linked pages, that's a bit trickier. You'd have to download each link into a new document (for example in an `<iframe>`), and then in its onload handler check the iframe's own document for more links to add to the list to fetch. You'd need to keep a lookup of which URLs you'd already spidered, to avoid fetching the same document twice. It probably wouldn't be very fast.
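A minimal sketch of that approach, assuming every page is same-origin (otherwise the iframe's document isn't readable) and using a hidden iframe; the variable and function names here are illustrative, not from the answer above:

var seen = {};                       // set-like lookup of URLs already spidered
var queue = [location.href];
seen[location.href] = true;

function crawlNext() {
    if (queue.length === 0) return;  // nothing left to fetch
    var url = queue.shift();
    var iframe = document.createElement('iframe');
    iframe.style.display = 'none';   // keep the loading pages out of view
    iframe.onload = function () {
        var links = iframe.contentDocument.links;
        for (var i = links.length; i-- > 0;) {
            var href = links[i].href;
            if (links[i].hostname === location.hostname && !seen[href]) {
                seen[href] = true;   // never fetch the same document twice
                queue.push(href);
            }
        }
        document.body.removeChild(iframe);
        crawlNext();                 // fetch the next queued page
    };
    document.body.appendChild(iframe);
    iframe.src = url;
}

crawlNext();

Each page is scanned from its onload handler before the next one is fetched, so the pages load one at a time rather than all at once.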

bobince
+1 Even though you aren't using any `{}` :P
Josh Stodola
Yeah, I couldn't decide whether to use an object like a set or just an array. You'd definitely want the set-like lookup once you start crawling multiple pages and need to check whether a URL has already been done.
bobince
+1 for not using jQuery
Andris
So that means I have to load each page in the website, right? And by the way, how can I avoid showing those loading pages to the user?
netha
Yes. You can load the pages into a hidden `<iframe>`.
bobince