views: 54
answers: 1
Intro:
I am a BI addict and would like to develop a project to drill down into Wikipedia's data.
I would write scripts to extract data from DBpedia (probably starting with articles about people) and load it into a people table.
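A minimal sketch of what such a loading script might look like, assuming a simplified N-Triples line format and using the FOAF `name` predicate (the exact predicates in a real DBpedia dump may differ):

```python
import re
import sqlite3

# Hypothetical sketch: pull person names out of DBpedia-style
# N-Triples lines and load them into a "people" table.
# The triple pattern below is a simplification for illustration.
TRIPLE_RE = re.compile(r'<([^>]+)> <([^>]+)> "([^"]*)"')

NAME_PREDICATE = "http://xmlns.com/foaf/0.1/name"

def load_people(lines, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS people (uri TEXT, name TEXT)")
    for line in lines:
        m = TRIPLE_RE.match(line)
        if m and m.group(2) == NAME_PREDICATE:
            conn.execute("INSERT INTO people VALUES (?, ?)",
                         (m.group(1), m.group(3)))
    conn.commit()

if __name__ == "__main__":
    sample = [
        '<http://dbpedia.org/resource/Remi_Example> '
        '<http://xmlns.com/foaf/0.1/name> "Remi Example" .',
    ]
    conn = sqlite3.connect(":memory:")
    load_people(sample, conn)
    print(conn.execute("SELECT name FROM people").fetchall())
```

A real script would of course need to handle literals with language tags, escaped quotes, and the other predicates in the dataset.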

My question is:
Has anyone done this before? Even better, is there a community dedicated to this?
If such scripts already exist somewhere, I would rather contribute to them than rewrite them.

Just an example:
In the OLAP cube of people, I can drill down by first name, drill through on "Remi", check in which areas this name is used, then for each area drill down on gender to see where this name is popular for girls and where for boys. For each of these, I can then drill down through time to see the trends. You cannot do this kind of investigation without a BI tool, or it will take days instead of seconds.
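The drill-down above boils down to GROUP BY queries over the people table. A toy illustration, with an invented table layout and made-up rows (not a real DBpedia extract):

```python
import sqlite3

# Invented example data: a tiny "people" cube with the three
# dimensions used in the drill-down (area, gender, birth year).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE people "
    "(first_name TEXT, area TEXT, gender TEXT, birth_year INTEGER)"
)
conn.executemany("INSERT INTO people VALUES (?, ?, ?, ?)", [
    ("Remi", "France", "M", 1980),
    ("Remi", "France", "F", 1995),
    ("Remi", "Japan",  "F", 1990),
])

# Drill-down: in which areas is the name "Remi" used?
by_area = conn.execute(
    "SELECT area, COUNT(*) FROM people "
    "WHERE first_name = 'Remi' GROUP BY area"
).fetchall()

# Drill-down further: gender split within each area.
by_gender = conn.execute(
    "SELECT area, gender, COUNT(*) FROM people "
    "WHERE first_name = 'Remi' GROUP BY area, gender"
).fetchall()

print(by_area)
print(by_gender)
```

A BI tool generates and caches exactly this kind of query behind the drill-down clicks; doing it by hand for every question is what turns seconds into days.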

+1  A: 

Check out Mahout, a distributed machine learning library. One of its examples uses a dump of Wikipedia:

https://cwiki.apache.org/MAHOUT/wikipedia-bayes-example.html http://mahout.apache.org

I'm not familiar with the exact details of business intelligence; however, machine learning is about finding relevant patterns and clustering information together. At the very least, this should give you an example of loading Wikipedia data into memory and doing some simple and not-so-simple things with it.

steve
If I have to load Wikipedia data, I will do it through DBpedia. They did all of the parsing work and provide convenient, ready-to-use datasets. See a preview at http://downloads.dbpedia.org/preview.php?file=3.5.1_sl_en_sl_persondata_en.nt.bz2
Nicolas Raoul
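Since the DBpedia datasets ship as bzip2-compressed N-Triples files, they can be streamed line by line without unpacking the whole dump first; a sketch (the path is whatever you downloaded the file as):

```python
import bz2

# Sketch: iterate over an .nt.bz2 dump lazily, skipping the
# comment lines that N-Triples files may contain.
def iter_triples(path):
    with bz2.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            if not line.startswith("#"):
                yield line.rstrip()
```

Feeding these lines straight into a loading script keeps memory use flat even for the multi-gigabyte datasets.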