Before doing the block processing, you can use the MEMORY function (available only on Windows systems) to see how much memory is already in use and how much remains available for any additional variables the block processing may need to create. If you can estimate the total amount of memory the block processing steps will need as a function of the block size, you can figure out how large the block size can get before you run out of available memory. This may be easier said than done, since I don't know exactly how you are doing the block processing.
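For instance, here's a minimal sketch of that calculation, assuming (hypothetically) that processing each blockSize-by-blockSize block of doubles requires roughly nCopies temporary copies of the block:

>> uV = memory;  %# Get the current memory statistics
>> nCopies = 3;  %# Assumed number of temporary block copies (hypothetical)
>> maxBlockSize = floor(sqrt(uV.MaxPossibleArrayBytes/(8*nCopies)))  %# Largest block size that should still fit

The factor of 8 is the number of bytes per double; adjust nCopies (and that factor) to match however your block processing actually works.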
Here's a simple example. I'll start by clearing the workspace and creating 2 large matrices:
>> clear all
>> mat1 = zeros(8000); %# An 8000-by-8000 matrix of doubles
>> mat2 = zeros(8000); %# Another 8000-by-8000 matrix of doubles
Now, let's say I know I will have to allocate an N-by-N matrix of doubles, which will require 8*N*N bytes of memory (8 bytes per double). I can do the following to find out how large I can make N:
>> uV = memory %# Get the memory statistics
uV =
MaxPossibleArrayBytes: 314990592
MemAvailableAllArrays: 643969024
MemUsedMATLAB: 1.2628e+009
>> maxN = floor(sqrt(uV.MaxPossibleArrayBytes/8)) %# Compute the maximum N
maxN =
6274
>> mat3 = ones(maxN); %# Works fine
>> mat3 = ones(maxN+1); %# Tanks! Too large!
??? Out of memory. Type HELP MEMORY for your options.
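As an aside, MaxPossibleArrayBytes is the size of the largest contiguous block of free memory (i.e. the largest single array you can create), while MemAvailableAllArrays is the total memory available across all arrays. A quick guard before a big allocation might look like this sketch (N is just a proposed size):

>> N = 7000;                            %# A proposed matrix dimension
>> uV = memory;                         %# Refresh the memory statistics
>> if 8*N^2 > uV.MaxPossibleArrayBytes  %# Would an N-by-N double exceed the largest free block?
       warning('A %d-by-%d matrix of doubles won''t fit in memory.', N, N);
   end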
If you routinely run into out-of-memory errors, here are a couple of things you can do (both are illustrated below):
- Use single precision (or integer types) for large matrices instead of the default double precision.
- Be sure to clear variables you don't need anymore (especially if they are large).
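For example, here's a quick sketch of both tips (the variable name bigMat is hypothetical):

>> bigMat = zeros(8000, 'single');  %# 8000-by-8000 singles: 4 bytes per element instead of 8
>> whos bigMat                      %# Reports 256000000 bytes, half the 512000000 a double matrix needs
>> clear bigMat                     %# Release the memory as soon as the matrix is no longer needed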