I'm using JDBC to get a large amount of data. The call completes successfully, but when resultSet.next() is called, I get the following error:

java.lang.OutOfMemoryError: allocLargeObjectOrArray - Object size: 15414016, Num elements: 7706998

I've attempted to increase the JVM memory size, but this does not fix the problem. I'm not sure this problem can even be addressed, since I'm not using JDBC to access a database directly; rather, the system is accessing a BEA AquaLogic service through JDBC.

Has anyone run into this error?

A: 

Beware that until the first resultSet.next() call, the results may not yet have been read from the database, or may still be sitting in some caching structure along the way.

You should try to limit your SELECT to return a sane number of results, and if you need all the data, repeat the call until no results are left, as in the sketch below.
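
For illustration, a minimal paging sketch in plain JDBC. The big_table table and payload column are made up, and it assumes a backend that understands LIMIT/OFFSET syntax, which an AquaLogic data service may not:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class PagedFetch {
        // Pull rows in fixed-size chunks instead of one huge result set.
        static void fetchAll(Connection conn) throws SQLException {
            final int pageSize = 1000;
            int offset = 0;
            while (true) {
                // Hypothetical table/columns; paging syntax varies by backend.
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT id, payload FROM big_table LIMIT ? OFFSET ?")) {
                    ps.setInt(1, pageSize);
                    ps.setInt(2, offset);
                    int rows = 0;
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            rows++;
                            // process rs.getString("payload") here
                        }
                    }
                    if (rows < pageSize) {
                        return; // last (partial or empty) page reached
                    }
                    offset += pageSize;
                }
            }
        }
    }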

Increasing the JVM memory size won't help unless you can be sure there is an absolute limit on the amount of data your JDBC call will return.

Furthermore, accessing any service through JDBC essentially boils down to using JDBC :)

Another (unlikely) possibility could be that there is a bug in the JDBC driver you're using. Try a different implementation if it is possible and check if the problem persists.

Kosi2801
I wouldn't doubt that the JDBC driver is the culprit. I cannot, however, try any other implementation, as this one is BEA-specific. There are other ways of getting at the information I'm looking for, but those would be a big step back at this point.
Nick
I haven't really had time to figure out the issue. In the meantime, I seem to be able to get the data returned without any errors. I believe my case was an edge case anyway. Accepting this answer because it was the most robust.
Nick
A: 

You can try calling setFetchSize(int rows) on your statement.
But setFetchSize is only a hint to the driver, which means it may not be honored.
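
A minimal sketch of what that looks like, assuming an open Connection and a hypothetical big_table; again, the fetch size is only a hint and the driver is free to ignore it:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class FetchSizeExample {
        static void streamRows(Connection conn) throws SQLException {
            try (Statement stmt = conn.createStatement()) {
                // Hint: fetch about 100 rows per round trip instead of
                // buffering the entire result set in memory at once.
                stmt.setFetchSize(100);
                try (ResultSet rs = stmt.executeQuery("SELECT id FROM big_table")) {
                    while (rs.next()) {
                        // process one row at a time
                    }
                }
            }
        }
    }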

Charlie
This won't work as only one row is being returned.
Nick
A: 

Try increasing the memory size to 1.2 GB, e.g. -Xmx1200m, or something just below the physical memory of your machine. You may find it is reading more data at once than you think.
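
If you want to confirm how much heap the JVM actually received, a quick check is to print Runtime.getRuntime().maxMemory(), launching with e.g. java -Xmx1200m MaxHeap:

    public class MaxHeap {
        public static void main(String[] args) {
            // Reports the maximum heap the JVM will try to use, which
            // should reflect the -Xmx value passed on the command line.
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }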

Peter Lawrey
I increased the memory so far that the JVM wouldn't start the application, saying the specified heap was too large. My machine isn't the beefiest, either.
Nick
A: 

First, figure out whether you really need that much data in memory at once. RDBMSs are good at aggregating, sorting, and filtering large data sets, and you should take advantage of that where possible; see the sketch below.
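
As an illustration, here is a sketch of pushing an aggregation down to the database so that only one summary row crosses the wire; the orders table and amount column are made up:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class AggregateInDb {
        // Let the database compute the sum and return a single row,
        // rather than pulling every row into the JVM to add them up.
        static long totalAmount(Connection conn) throws SQLException {
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT SUM(amount) AS total FROM orders")) {
                return rs.next() ? rs.getLong("total") : 0L;
            }
        }
    }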

If not (and you really, really do need that much data in working memory for some reason), and bumping up the JVM's memory arguments doesn't raise the bar enough, look into an in-memory distributed caching solution such as Coherence (COTS) or Terracotta (open source).

ikelly
A: 

How many rows are you returning from the database? Like Kosi2801, I would suggest fetching only a subset of the data: start with a reasonable number and then increase it to find the threshold.

rafn