Does anyone know a way to get all the URLs in a website using JavaScript? I only need the links that start with the same domain name; there's no need to consider other links.
A:
Using jQuery you can find all the links on the page that match a specific criterion:
$("a[href=^domain.com]").each(function(){
alert($(this).attr("href"));
});
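Note that relative hrefs won't start with the domain name, so an attribute selector can miss them. A sketch of a more robust variant (using jQuery's standard `filter`/`each`) compares each link's resolved `hostname` instead:

$("a").filter(function () {
    // the browser resolves relative hrefs, so hostname is always absolute
    return this.hostname === location.hostname;
}).each(function () {
    alert(this.href);
});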
Muhammad Adeel Zahid
2010-09-29 17:30:47
+6
A:
Well this will get all the same-host links on the page:
var urls = [];
// document.links contains every <a> and <area> in the page that has an href
for (var i = document.links.length; i-- > 0;)
    if (document.links[i].hostname === location.hostname)
        urls.push(document.links[i].href);
If by site you mean you want to recursively get the links inside linked pages, that's a bit trickier. You'd have to download each link into a new document (for example in an `<iframe>`), and then in the iframe's `onload` handler check its document for more links to add to the list to fetch. You'd need to keep a lookup of which URLs you'd already spidered, to avoid fetching the same document twice. It probably wouldn't be very fast.
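A minimal sketch of that approach, assuming all pages are same-origin (otherwise the iframe's document isn't accessible) and using illustrative names like `crawl` and `visited`:

var visited = {};   // object used as a set of already-spidered URLs
var urls = [];      // every same-host URL found so far

function crawl(url) {
    if (visited[url]) return;          // don't fetch the same document twice
    visited[url] = true;

    var iframe = document.createElement('iframe');
    iframe.style.display = 'none';     // hide the loading pages from the user
    iframe.onload = function () {
        var links = iframe.contentDocument.links;
        for (var i = links.length; i-- > 0;) {
            var u = links[i].href;
            // note: URLs differing only by #fragment count as distinct here
            if (links[i].hostname === location.hostname && !visited[u]) {
                urls.push(u);
                crawl(u);              // recurse into the linked page
            }
        }
        document.body.removeChild(iframe);
    };
    iframe.src = url;
    document.body.appendChild(iframe);
}

crawl(location.href);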
bobince
2010-09-29 17:31:42
+1 Even though you aren't using any `{}` :P
Josh Stodola
2010-09-29 17:34:48
Yeah, I couldn't decide whether to use an object like a set or just an array. You'd definitely want the set-like one once you start crawling multiple pages and need to look up whether a URL had been done before.
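For comparison, a tiny sketch of the two lookups (illustrative names only): the array needs a linear scan, while the object is a single property check:

// array: O(n) membership test on every lookup
var done = [];
if (done.indexOf(url) === -1) done.push(url);

// object-as-set: O(1) membership test
var seen = {};
if (!seen[url]) seen[url] = true;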
bobince
2010-09-29 18:09:25
+1 for not using jQuery
Andris
2010-09-29 19:26:20
So that means I have to load each page in the website, right? And by the way, how can I keep those loading pages from being shown to the user?
netha
2010-10-01 03:09:56
Yes. You can load the pages into a hidden `<iframe>`.
bobince
2010-10-01 11:27:31