Dear all, I am using MATLAB 2009b and I am running into an out of memory error. I have read other posted solutions, but they have not helped in my case. I am fairly sure I am doing things correctly, but I have to work with very large arrays. I think the problem is that MATLAB cannot spread an array across more than one contiguous block of OS memory. I am using Windows 7. Is there a way around this problem? For example, can I increase the size of the memory block that MATLAB can use on Windows 7?
Thanks...Hani Almousli

+2  A: 

If the largest available block (as shown by memory) is much smaller than the maximum amount of memory available to Matlab, a restart of Matlab (or the system) can help.
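For reference, something along these lines shows both numbers (the memory function with an output argument is Windows-only):

user = memory;                           % Windows only
fprintf('Largest contiguous block:  %.0f MB\n', user.MaxPossibleArrayBytes / 2^20);
fprintf('Total available to MATLAB: %.0f MB\n', user.MemAvailableAllArrays / 2^20);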

Otherwise, you need to either rewrite your code or buy more RAM (and/or use the 64-bit version of Win7).

I suggest you try rewriting your code. It is very often possible to work around memory issues.

EDIT

From your comment on @Richie Cotton's post, I see that you want to do classification on a huge amount of data. If there are a small number of classes, none of which is very sparse, you can solve the problem by running kmeans on, say, 10 randomly chosen subsets of, say, 30% of your data each. This should find the cluster centers just fine. To assign your data to the clusters, all you have to do is calculate, for each data point, the distance to each cluster center and associate the point with the closest one.
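A rough sketch of that approach, assuming the Statistics Toolbox kmeans function and data stored in an N-by-d matrix X (the number of clusters k, the subset count, the fraction, the chunk size and the consolidation step are my own illustrative choices, not part of the answer):

k        = 5;                            % assumed number of clusters
nSubsets = 10;                           % random subsets to run kmeans on
frac     = 0.3;                          % fraction of the data per subset
[N, d]   = size(X);
subC     = zeros(nSubsets * k, d);       % centers found on each subset
for s = 1:nSubsets
    rows = randperm(N);
    sub  = X(rows(1:round(frac * N)), :);
    [~, C] = kmeans(sub, k);             % cluster the small subset only
    subC((s-1)*k + (1:k), :) = C;
end
[~, centers] = kmeans(subC, k);          % consolidate into k final centers

labels    = zeros(N, 1);                 % assign every point to its nearest center
chunkSize = 1e5;                         % process the big array in chunks
for i = 1:chunkSize:N
    j = min(i + chunkSize - 1, N);
    % squared Euclidean distances from this chunk of points to the k centers
    D = bsxfun(@plus, sum(X(i:j, :).^2, 2), sum(centers.^2, 2)') ...
        - 2 * X(i:j, :) * centers';
    [~, labels(i:j)] = min(D, [], 2);
end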

Jonas
+2  A: 

If you think your array sizes are not big enough to warrant such an error, maybe your previous operations fragmented the available memory. MATLAB requires each array to occupy a contiguous block of memory, so fragmentation can lead to such errors.

So before the point in your code where an out of memory error occurs, try running the pack command. That's all I can think of apart from the usual fixes.
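For instance (pack is best run from the command line or a script rather than from inside a function, and it temporarily saves the workspace to disk, so it can take a while):

before = memory;                         % largest contiguous block before packing
pack;                                    % consolidate workspace memory
after  = memory;
fprintf('Largest block: %.0f MB -> %.0f MB\n', ...
        before.MaxPossibleArrayBytes / 2^20, after.MaxPossibleArrayBytes / 2^20);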

Jacob
+1  A: 

EDIT: MathWorks give advice on this problem.


You can view memory usage with the commands system_dependent memstats and system_dependent dumpmem (as well as simply memory, as noted by Jonas).

The command pack (which in effect defragments your workspace) may also come in useful.

If you are dealing with objects containing > 10 million or so values, then memory can easily become an issue. Throwing hardware at the problem (i.e. buying more RAM) may be an option, but there is a limit to what you can achieve.

The way I suggest you approach recoding things to make them more memory efficient is:

See if there are any variables you don't need to allocate. A classic example of this is when a function returns a value of the same size as its input.

function x = XPlus1(x)
x = x + 1;
end

is more memory efficient than

function y = XPlus1(x)
y = x + 1;
end
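A small usage example of the difference (as far as I know, MATLAB only performs the in-place optimization when the call is made from inside another function and the result is assigned back to the same variable):

x = rand(5000);          % roughly 200 MB of doubles
x = XPlus1(x);           % same variable on both sides: no second copy is needed
% y = XPlus1(x);         % would need memory for both x and y at once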

Next, try to split your problem into smaller chunks. At the simplest level, this may involve performing an operation on rows instead of on the whole matrix, or on individual elements instead of a vector. (The cost of looping is less than the cost of the code not running at all due to memory constraints.) Then you reconstruct your answer from the pieces, as in the sketch below.

This step is essentially the philosophy behind map-reduce, so as a bonus, your code will be more easily parallelizable.
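As an illustrative sketch of the chunking idea (my own example, assuming an N-by-d data matrix X; if X itself does not fit in memory, each block would be read from a file instead of indexed out of X), here is how X' * X and the column means, the ingredients of a covariance matrix, can be accumulated one block of rows at a time:

[N, d]    = size(X);
XtX       = zeros(d);                    % small d-by-d accumulator
colSum    = zeros(1, d);
chunkSize = 1e4;                         % tune to the memory you have
for i = 1:chunkSize:N
    j      = min(i + chunkSize - 1, N);
    block  = X(i:j, :);                  % only this block of rows is touched
    XtX    = XtX + block' * block;
    colSum = colSum + sum(block, 1);
end
mu = colSum / N;
C  = (XtX - N * (mu' * mu)) / (N - 1);   % sample covariance, rebuilt from the pieces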

Richie Cotton
I think none of these solutions is effective for me. One of the problems I want to solve is to compute the covariance matrix of the huge matrix. I also want to run the k-means algorithm on the huge data array. For both of these problems, I don't think I can use any of the suggested solutions. Thanks very much for your interest.
Hani
@Hani: For k-means, there can be a simple solution that may take a bit longer, but that will get the job done on your large array. See my edit.
Jonas
