Hi.
Well, we have a web app running on JBoss, and we're hitting an "OutOfMemoryError" when trying to insert a lot of rows into several tables of a Postgres DB.
This is the complete environment for this error:
* JBoss 4.3.x GA
* Java 1.6.0
* Hibernate 3.0
* PostgreSQL 8.3 (JDBC driver)
About the actual code/workflow:
* The heavy part is that we're parsing huge numbers of XML documents, each one downloaded separately from a specific URL (1 URL = 1 XML). We accomplish that with an EJB that distributes the generated URLs to a queue; then a pool of MDBs connects via streams and generates the documents (note that we've actually had to raise the stack size due to the XML document sizes, and we're stuck with having to fetch each document in a single stream). Once a document is generated it goes to another queue, where another MDB pool listens.
Those MDBs parse the doc, storing the information in several entities (5 at least) that are then persisted to the DB (note that transaction management is set to "BEAN", and a transaction is begun and committed during each MDB's work). Processing the URLs sequentially is not an option because of the number of URLs to process; it would take like 2 months or so... lol
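For context, the parsing side of the MDB looks roughly like the following StAX sketch. The class name, element names ("doc", "feature") and the sample payload are made-up stand-ins for the real documents; the point is just that we pull elements one at a time from the stream rather than building a DOM tree:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class FeatureParser {

    // Pulls "feature" elements one at a time instead of materializing the whole tree.
    public static List<String> parseFeatures(String xml) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(xml));
        List<String> features = new ArrayList<String>();
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "feature".equals(reader.getLocalName())) {
                features.add(reader.getElementText());
            }
        }
        reader.close();
        return features;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<doc><feature>a</feature><feature>b</feature></doc>";
        System.out.println(parseFeatures(xml)); // prints [a, b]
    }
}
```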
The trouble is that we parse and store around 200 URLs and then start getting an out-of-memory error for PostgreSQL. Any ideas??
Thanks in advance!!
ALSO: It may be of use to know that this error wasn't showing up before (I did parse a few thousand of these XMLs before); just generating the documents and parsing a small part of each into some entities didn't seem to cause trouble. The trouble started when we began parsing more and more of each doc into its corresponding entities (e.g. one entity holding a list of "features" [another entity parsed from the same XML]).
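For reference, this is the kind of Hibernate 3 batching configuration that could be in play here; the property names are from the Hibernate docs, and the values are illustrative only, not what we currently have set:

```properties
# Illustrative values only -- batching knobs for large write workloads
hibernate.jdbc.batch_size=50
# These entities are write-heavy, so second-level caching may just add overhead
hibernate.cache.use_second_level_cache=false
```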