This question is related to these two:
http://stackoverflow.com/questions/2867901/introduction-to-vectorizing-in-matlab-any-good-tutorials
http://stackoverflow.com/questions/2561617/filter-that-uses-elements-from-two-arrays-at-the-same-time

Based on the tutorials I read, I have been trying to vectorize a procedure that takes a really long time.

I've rewritten this:

function B = bfltGray(A,w,sigma_r)
dim = size(A);
B = zeros(dim);
for i = 1:dim(1)
    for j = 1:dim(2)

        % Extract local region.
        iMin = max(i-w,1);
        iMax = min(i+w,dim(1));
        jMin = max(j-w,1);
        jMax = min(j+w,dim(2));
        I = A(iMin:iMax,jMin:jMax);

        % Compute Gaussian intensity weights.
        F = exp(-0.5*(abs(I-A(i,j))/sigma_r).^2);
        B(i,j) = sum(F(:).*I(:))/sum(F(:));

    end
end

into this:

function B = rngVect(A, w, sigma)
W = 2*w+1;                                % full window width
I = padarray(A, [w,w], 'symmetric');      % mirror-pad so every pixel gets a full window
I = im2col(I, [W,W]);                     % one column of W*W neighbours per pixel
% Gaussian intensity weights; A(:)' holds the centre value of each column.
H = exp(-0.5*(abs(I - repmat(A(:)', size(I,1), 1))/sigma).^2);
B = reshape(sum(H.*I,1)./sum(H,1), size(A, 1), []);  % weighted mean per pixel

Where:

A is a 512x512 matrix
w is half of the window size, usually 5
sigma is a parameter in the range [0, 1] (usually 0.1, 0.2 or 0.3)

So the I matrix is 121 x (512*512), i.e. 121*512*512 = 31719424 elements.
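As an aside, a quick sanity check of the rewrite on a small input (a sketch; note that the two versions treat borders differently, the loop clips the window at the image edge while the vectorized version mirror-pads, so only interior pixels are compared):

n = 64; w = 5;
A = rand(n);
B1 = bfltGray(A, w, 0.1);
B2 = rngVect(A, w, 0.1);
in = w+1 : n-w;                         % interior pixels, unaffected by border handling
max(max(abs(B1(in,in) - B2(in,in))))    % should be on the order of eps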

But this version seems to be as slow as the first one, and in addition it uses a lot of memory and sometimes causes out-of-memory errors.

I suppose I've done something wrong, probably some logic mistake in the vectorization. In fact, I'm not surprised: this method creates really big matrices, and the computations probably take proportionally longer.

I have also tried to write it using nlfilter (similar to the second solution given by Jonas), but that seems hard since I use Matlab 6.5 (R13), which lacks sophisticated function handles such as anonymous functions.

So, once again, I'm not asking for a ready-made solution, but for ideas that would help me solve this in a reasonable time. Maybe you can point out what I did wrong.

Edit:
As Mikhail suggested, the results of profiling are as follows:

- 65% of the time was spent in the line H = exp(...)
- 25% of the time was used by im2col
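For reference, numbers like these come from MATLAB's built-in profiler; a minimal sketch of gathering them (using the typical parameter values from above):

profile on
B = rngVect(A, 5, 0.1);
profile off
profile report    % per-line timings, including the exp(...) and im2col lines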

Answer (+1):

How big are I and H (i.e. numel(I)*8 bytes each)? If you start paging (swapping memory to disk), the performance of your second solution is going to be affected very badly.

To test whether you really have a problem due to overly large arrays, you can measure the speed of the calculation using tic and toc for arrays A of increasing size (see the sketch below). If the execution time grows faster than the square of the side length of A, or if it jumps at some size of A, you can try splitting the padded I into a number of sub-arrays and performing the calculations on those chunks.
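A minimal sketch of such a measurement (the sizes and parameter values are arbitrary examples):

for n = [64 128 256 512]
    A = rand(n);
    tic
    B = rngVect(A, 5, 0.1);
    t = toc;
    % with w = 5, I holds 121*n^2 doubles
    fprintf('n = %4d: %7.3f s, I needs %6.1f MB\n', n, t, 121*n^2*8/2^20);
end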

Otherwise, I don't see any obvious places where you could be losing lots of time. Well, maybe you could skip the reshape by replacing B with A in your function (which saves a little memory as well) and writing A(:) = sum(H.*I,1)./sum(H,1); (see the sketch below).
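A sketch of that variant (the name rngVect2 is just for illustration):

function A = rngVect2(A, w, sigma)
% Same as rngVect, but reuses A as the output, which skips the final
% reshape and saves one temporary array of the size of A.
W = 2*w+1;
I = padarray(A, [w,w], 'symmetric');
I = im2col(I, [W,W]);
H = exp(-0.5*(abs(I - repmat(A(:)', size(I,1), 1))/sigma).^2);
A(:) = sum(H.*I,1)./sum(H,1);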

You may also want to look into upgrading to a more recent version of Matlab - they've worked hard on improving performance.

Jonas
Jonas, I updated my question. As I wrote, the generated matrices get really big, and I'm afraid my solution is not the best for this kind of problem.
Gacek
Oh, I only saw this now. I have updated my answer as well, with a way to find out whether the calculations can be sped up.
Jonas
The `I` matrix having 31 million elements is big, but not huge... I would second the suggestion to check whether smaller sizes of I give much better performance. If so, it may be that Matlab is having a lot of trouble finding a contiguous chunk of memory that large (over 240 MB for 31.7 million doubles). Is this an old computer? Try rebooting immediately before running the program; memory is less fragmented at boot. Or get more RAM?
rescdsk
