Hi,
I am doing a project for which I need to know all the Wikipedia article names (I don't need the content). Is there a place where I can download this data?
Thank you
Bala
Check out this page here on Wikipedia - there is an option to download an archive containing just the names of the articles. Here's the actual path to the download page:
Edit:
You may notice non-English titles (and some profanity, be advised) appearing in the list contained in enwiki-latest-all-titles-in-ns0.gz. This is because by default most people create content on the main English wiki (language code en). If you investigate other language dumps, you will observe different sets of articles.
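If that dump file suits your project, you can stream the titles straight out of the gzipped archive without unpacking it to disk first. Here is a minimal Python sketch; the dump URL is my assumption of the conventional location for the latest English-wiki titles file (check the download page for the current path), and the helper name is my own:

```python
# Sketch: stream article titles out of the gzipped titles dump.
# DUMP_URL is an assumption; verify it against the download page.
import gzip
import io
import urllib.request

DUMP_URL = ("https://dumps.wikimedia.org/enwiki/latest/"
            "enwiki-latest-all-titles-in-ns0.gz")

def read_titles(raw):
    """Yield one title per line from a gzip-compressed byte stream.

    `raw` is any file-like object of gzip bytes: an HTTP response,
    an open file, or an in-memory buffer.
    """
    with gzip.open(raw, mode="rt", encoding="utf-8") as text:
        for line in text:
            title = line.rstrip("\n")
            if title:  # skip blank lines
                yield title

# Example usage (requires network; the file is large):
# with urllib.request.urlopen(DUMP_URL) as resp:
#     for title in read_titles(resp):
#         print(title)
```

Reading the archive as a stream keeps memory use flat, which matters here because the full title list runs to millions of lines.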
Reading the main download page, there are references to using the Wikipedia API to perform some types of queries against Wikipedia, but I'm not sure this will solve your problem (the taxonomy of the pages doesn't seem to provide a simple way to distinguish English-language content from content that merely lives on the English wiki).
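For the API route, the public MediaWiki query module does expose a list=allpages query that pages through titles with apcontinue continuation tokens. A hedged sketch (the endpoint and parameter names come from the public API; the function names and the injectable fetch callable are my own, the latter so the paging logic can be exercised without network access):

```python
# Sketch: enumerate namespace-0 article titles via the MediaWiki API's
# list=allpages query, following continuation tokens.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def iter_titles(fetch, limit=500):
    """Yield article titles from namespace 0.

    `fetch` takes a dict of query parameters and returns the decoded
    JSON response. We follow the 'apcontinue' token until the API
    stops returning one.
    """
    params = {
        "action": "query",
        "list": "allpages",
        "apnamespace": 0,
        "aplimit": limit,
        "format": "json",
    }
    while True:
        data = fetch(params)
        for page in data["query"]["allpages"]:
            yield page["title"]
        cont = data.get("continue", {}).get("apcontinue")
        if cont is None:
            return
        params["apcontinue"] = cont

def http_fetch(params):
    """Fetch one page of results over HTTP."""
    url = API + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"User-Agent": "title-lister/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage (requires network): print the first 1000 titles.
# for i, title in enumerate(iter_titles(http_fetch)):
#     if i >= 1000:
#         break
#     print(title)
```

Note that walking the whole wiki this way means millions of requests at 500 titles each, so for a complete list the dump file is the more practical choice; the API is better suited to sampling or to resuming a partial listing.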
I'm not aware of any central list of articles. But if you just need a large number of them rather than a complete list (bearing in mind that any complete list will always be out of date anyway), you could probably put something together with wget to recursively follow links within Wikipedia from the main page and store the URLs you encounter.