More specifically, large XML webpages (RSS feeds). I am using the excellent Rome library to parse them, but the page I am currently trying to fetch is so large that Java runs out of memory before the whole document is read.

How can I split up the webpage so that I can pass it to XMLReader? Should I just do it myself, and pass the feed in parts after wrapping each part in my own opening and closing XML?

A: 

First off, learn to set the Java command-line options -Xms and -Xmx to appropriate values; all of the DOM-based parsers eat loads of memory. Second, look at using a pull parser: it does not have to load the entire XML into a document before processing it.
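A minimal sketch of both suggestions, using the StAX pull parser that ships with the JDK (javax.xml.stream). The feed URL and the element being handled are placeholders, not anything from the original question; the heap options go on the java command line, shown in the comment:

    import java.io.InputStream;
    import java.net.URL;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    // Run with explicit heap bounds, e.g.:  java -Xms64m -Xmx512m RssPullParserDemo
    public class RssPullParserDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical URL; substitute the large feed you are fetching.
            URL feedUrl = new URL("http://example.com/large-feed.xml");

            XMLInputFactory factory = XMLInputFactory.newInstance();
            // Coalesce adjacent character data so each text node arrives in one event.
            factory.setProperty(XMLInputFactory.IS_COALESCING, Boolean.TRUE);

            try (InputStream in = feedUrl.openStream()) {
                XMLStreamReader reader = factory.createXMLStreamReader(in);
                String current = null;
                while (reader.hasNext()) {
                    switch (reader.next()) {
                        case XMLStreamConstants.START_ELEMENT:
                            current = reader.getLocalName();
                            break;
                        case XMLStreamConstants.CHARACTERS:
                            // Handle each <title> as it streams past; the rest of
                            // the document is never materialized in memory.
                            if ("title".equals(current)) {
                                System.out.println(reader.getText().trim());
                            }
                            break;
                        case XMLStreamConstants.END_ELEMENT:
                            current = null;
                            break;
                    }
                }
                reader.close();
            }
        }
    }

Because the pull parser only ever holds the current event, the feed no longer has to fit in the heap; -Xmx then only needs to cover whatever your own code keeps. Rome itself builds the whole feed in memory (it parses via JDOM), so for a feed this size you would either stream it yourself as above or raise -Xmx high enough for the full document.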

fuzzy lollipop