tags:
views: 417
answers: 10

Right now I need to load a huge amount of data from a database into a Vector, but when I had loaded 38,000 rows, the program threw an OutOfMemoryError. What can I do to handle this?

I think there may be a memory leak in my program; what are good methods to detect it? Thanks.

A: 

Let your program use more memory or, much better, rethink the strategy. Do you really need that much data in memory at once?

Boris Pavlović
@JoshJordan Thanks for the edit
Boris Pavlović
+1  A: 

You can try increasing the heap size:

 java -Xms<initial heap size> -Xmx<maximum heap size>

Default is

java -Xms32m -Xmx128m
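
As a quick sanity check that the flags took effect, you can print the limits the JVM actually picked up (a minimal sketch using the standard Runtime API):

// Sketch: print the heap limits the running JVM actually got.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap (-Xmx): " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("committed heap:  " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("free (of committed): " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}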
Tommy
The maximum would be about 1024m, depending on your platform (it seems to be bounded by the largest allocatable contiguous chunk of memory).
Mercer Traieste
He said the program threw an OutOfMemoryError after 38,000 rows of data; I think there are more rows than that, and IMHO he can't keep increasing the heap size to keep up.
Alberto Zaccagni
As far as I know, there is no maximum. I run with a 4 GB heap.
Tommy
True. But if he's still going to use the brute-force approach, he should also know the limits.
Mercer Traieste
@Tommy, yes, that's what I've read. But on my machine (32-bit Windows XP, 4 GB installed but only 3.2 GB usable) I've only managed to set Xmx to ~1024m.
Mercer Traieste
@Tommy OK, I can believe there is no fixed maximum, but do you agree that just increasing the heap every time to hold this huge object is not a clean solution?
Alberto Zaccagni
@Tommy. There is always a maximum. For example, there is a limit to the number / size of memory sticks you can stuff into your motherboard.
Stephen C
+7  A: 

Provide more memory to your JVM (usually using -Xmx/-Xms) or don't load all the data into memory.

For many operations on huge amounts of data there are algorithms which don't need access to all of it at once. One class of such algorithms is divide-and-conquer algorithms.
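
For example, if each row can be handled on its own, you can stream the ResultSet and process rows one at a time instead of copying everything into a Vector first. A minimal sketch, assuming a hypothetical employee table and a per-row process() step (the fetch size is just a hint to the driver):

import java.sql.*;

public class StreamingLoad {
    // Sketch: handle one row at a time so at most a small batch is in memory.
    static void load(Connection con) throws SQLException {
        Statement st = con.createStatement();
        st.setFetchSize(500); // ask the driver to fetch rows in small batches
        ResultSet rs = st.executeQuery("SELECT id, name FROM employee");
        while (rs.next()) {
            process(rs.getInt("id"), rs.getString("name")); // use the row, then let it go
        }
        rs.close();
        st.close();
    }

    static void process(int id, String name) { /* per-row work goes here */ }
}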

Joachim Sauer
+1  A: 

Do you really need to have such a large object stored in memory?

Depending on what you have to do with that data, you might want to split it into smaller chunks.

Alberto Zaccagni
+1  A: 

Load the data section by section. This won't let you work on all of the data at the same time, but you won't have to change the amount of memory provided to the JVM.
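
A sketch of that section-by-section idea using LIMIT/OFFSET paging; the paging syntax is database-specific, and the table, columns, and page size here are invented:

import java.sql.*;

public class PagedLoad {
    // Sketch: fetch and process the table in fixed-size pages.
    static void loadInPages(Connection con) throws SQLException {
        final int pageSize = 1000; // placeholder batch size
        PreparedStatement ps = con.prepareStatement(
                "SELECT id, name FROM employee ORDER BY id LIMIT ? OFFSET ?");
        for (int offset = 0; ; offset += pageSize) {
            ps.setInt(1, pageSize);
            ps.setInt(2, offset);
            ResultSet rs = ps.executeQuery();
            int rows = 0;
            while (rs.next()) {
                rows++;
                // work on rs.getInt("id"), rs.getString("name"), then move on
            }
            rs.close();
            if (rows < pageSize) break; // last (possibly partial) page
        }
        ps.close();
    }
}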

xav0989
+1  A: 

Maybe optimize your data classes? I've seen a case where someone used Strings in place of primitive types such as int or double for every class member, which caused an OutOfMemoryError when storing a relatively small number of data objects in memory. Check that you aren't duplicating your objects. And, of course, increase the heap size:

java -Xmx512M (or whatever you deem necessary)
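
To make the first point concrete, here's a rough sketch (field names invented) of the kind of change meant: parse numeric columns into primitives once at load time instead of storing every value as a String:

// Wasteful: every field is a separate String object plus its char[] array.
class RowAsStrings {
    String id;     // e.g. "38217"
    String salary; // e.g. "52000.0"
}

// Leaner: primitives live inline in the object, with no extra allocations.
class Row {
    final int id;
    final double salary;

    Row(String idCol, String salaryCol) {
        this.id = Integer.parseInt(idCol);           // parse once, at load time
        this.salary = Double.parseDouble(salaryCol);
    }
}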

Daniil
+2  A: 

If you must have all the data in memory, try caching commonly appearing objects. For example, if you are looking at employee records and they all have a job title, use a HashMap when loading the data and reuse the job titles already found. This can dramatically lower the amount of memory you're using.
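
A minimal sketch of such a canonicalizing map (the class and the usage line are invented for illustration):

import java.util.HashMap;
import java.util.Map;

// Sketch: keep a single String instance per distinct job title.
class TitleCache {
    private final Map<String, String> titles = new HashMap<String, String>();

    String canonical(String title) {
        String cached = titles.get(title);
        if (cached == null) {
            titles.put(title, title); // first occurrence: remember it
            cached = title;
        }
        return cached;
    }
}

// While loading: employee.jobTitle = titleCache.canonical(rs.getString("job_title"));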

Also, before you do anything, use a profiler to see where memory is being wasted, and to check whether objects that could be garbage collected still have references floating around. Again, String is a common example: if you keep only the first 10 chars of a 2000-char string, and you used substring instead of allocating a new String, what you actually hold is a reference to the full char[2000] array, with two indices marking offsets 0 and 10. Again, a huge memory waster.
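
On the JVMs of that era the usual fix was to copy the slice explicitly; a one-line sketch (JDK 7u6 and later changed substring() to copy anyway, so this matters mainly on older runtimes):

// Sketch: substring() on old JVMs shares the original char[2000];
// wrapping it in new String(...) copies just the 10 chars you keep.
static String firstTenChars(String big) {
    return new String(big.substring(0, 10));
}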

+1  A: 

You could run your code with a profiler to understand how and why the memory is being eaten up. Step through the loop in a debugger and watch what is being instantiated. There are any number of profilers: JProfiler, the Java Memory Profiler (see the list of profilers here), and so forth.
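
If a full profiler feels heavyweight, a heap dump taken at the moment of failure often answers the question too. On a HotSpot JVM (a sketch; the jar name and pid are placeholders):

java -XX:+HeapDumpOnOutOfMemoryError -jar yourapp.jar
jmap -dump:live,format=b,file=heap.hprof <pid>

The first writes an .hprof file automatically when the OutOfMemoryError hits; the second dumps a running process. Either file can then be opened in a heap analyzer to see which objects are holding the references.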

Alex Feinman
A: 

I know you are trying to read the data into a vector; otherwise, if you were trying to display it, I would have suggested you use NatTable. It is designed for reading huge amounts of data into a table.

I believe it might come in handy for another reader here.

Helen Neely
A: 

Use a memory-mapped file. Memory-mapped files can basically grow as big as you want, without hitting the heap. They do require that you encode your data in a decoding-friendly way. (For example, it would make sense to reserve a fixed size for every row in your data, so you can quickly skip a number of rows.)
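
A minimal sketch of that fixed-size-row layout with java.nio (the file name, record size, and field layout are all invented):

import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedRows {
    static final int ROW_SIZE = 64; // fixed size reserved per row

    public static void main(String[] args) throws Exception {
        RandomAccessFile raf = new RandomAccessFile("rows.dat", "r");
        FileChannel ch = raf.getChannel();
        // The OS pages this in and out; the Java heap stays small.
        // Note that a single mapping is limited to 2 GB.
        MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());

        // Fixed-size rows let you jump straight to row n without reading the rest.
        int n = 37999;
        buf.position(n * ROW_SIZE);
        int id = buf.getInt(); // decode the row's fields from here
        System.out.println("row " + n + " id = " + id);

        raf.close();
    }
}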

Preon allows you to deal with that easily. It's a framework that aims to do for binary-encoded data what Hibernate has done for relational databases, and JAXB/XStream/XmlBeans for XML.

Wilfred Springer