views:

229

answers:

2

Ok guys, I've downloaded the Wikipedia XML dump and it's a whopping 12 GB of data :\ for one table, and I wanted to import it into a MySQL database on my localhost. However, it's a humongous 12 GB file, and obviously Navicat is taking its sweet time importing it - or, more likely, it has hung :(.

Is there a way to import this dump, or at least do it partially - you know, bit by bit?


Let me correct that: it's 21 GB of data - not that it helps :\ - does anyone have any idea how to import humongous files like this into a MySQL database?

+1  A: 

Take a look at a SAX parser; it allows you to read the corpus piece by piece rather than reading the whole 12 GB into memory. I'm not too sure how you would interface it with MySQL, though.
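For example, here is a minimal sketch of that idea using Python's standard-library xml.sax module. The table name, column layout, and dump filename are assumptions, and the commented-out insert is just one possible way to push the batches into MySQL:

    import xml.sax

    class PageHandler(xml.sax.ContentHandler):
        """Streams <title>/<text> pairs out of the dump without loading it all into memory."""

        def __init__(self, on_page):
            super().__init__()
            self.on_page = on_page   # callback invoked with (title, text) for every page
            self._tag = None
            self._buffer = []
            self._title = None

        def startElement(self, name, attrs):
            self._tag = name
            if name in ("title", "text"):
                self._buffer = []

        def characters(self, content):
            # characters() can fire several times per element, so accumulate the pieces
            if self._tag in ("title", "text"):
                self._buffer.append(content)

        def endElement(self, name):
            if name == "title":
                self._title = "".join(self._buffer)
            elif name == "text":
                self.on_page(self._title, "".join(self._buffer))
            self._tag = None

    def import_dump(path, batch_size=1000):
        batch = []

        def flush():
            # Hypothetical sink: with mysql-connector-python this could be something like
            # cursor.executemany("INSERT INTO pages (title, body) VALUES (%s, %s)", batch)
            print("flushing %d pages, last title: %r" % (len(batch), batch[-1][0]))
            batch.clear()

        def on_page(title, text):
            batch.append((title, text))
            if len(batch) >= batch_size:
                flush()

        xml.sax.parse(path, PageHandler(on_page))
        if batch:
            flush()

    if __name__ == "__main__":
        import_dump("enwiki-latest-pages-articles.xml")   # filename is a placeholder

The same handler structure also works if you write the batches to a flat file and load it with LOAD DATA INFILE, which is usually much faster than row-by-row inserts.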

PCBEEF
+1  A: 

Use the command line instead; Navicat is horrible for importing large files and will likely take 20x longer than the CLI.
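The usual shell form is something like mysql -u youruser yourdb < dump.sql. Below is a small Python wrapper around that same idea, in case you want to script it - the database name, user, and dump filename are placeholders, and it assumes the dump is one of Wikipedia's per-table .sql files, that the mysql client is on your PATH, and that credentials come from ~/.my.cnf:

    # Rough equivalent of: mysql --user=youruser yourdb < enwiki-latest-page.sql
    import subprocess

    def import_sql_dump(dump_path, database, user):
        # Stream the file straight into the mysql client; nothing is buffered in Python,
        # so a 21 GB dump is fine.
        with open(dump_path, "rb") as dump:
            subprocess.run(
                ["mysql", "--user=" + user, database],
                stdin=dump,
                check=True,   # raise CalledProcessError if the client exits with an error
            )

    if __name__ == "__main__":
        import_sql_dump("enwiki-latest-page.sql", "wikidb", "wikiuser")   # placeholders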

Ian