Hello,

[Edit: The whole thing has a very simple solution: the matrix used the single datatype instead of the default double]

I have just noticed a somewhat peculiar (I think) behaviour in MATLAB and wonder what's causing it. I have a 10000x500 matrix M with values ranging from

min(min(M)) = -226.9723 to 
max(max(M)) =  92.8173

and

exp(-227) =  2.6011e-99
exp(93) = 2.4512e+40
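For context on the magnitudes involved: the largest finite IEEE-754 single-precision value is about 3.4e38, so exp(x) already overflows a single for x above log(3.4e38) ≈ 88.72, well below 93. A quick Python sketch of this threshold (Python floats are doubles, so the float32 limit is written out as a constant):

```python
import math
import sys

# Largest finite IEEE-754 single-precision (float32) value
FLT_MAX = 3.4028234663852886e38

# exp(x) overflows float32 once x exceeds log(FLT_MAX)
print(math.log(FLT_MAX))                   # ~88.72

# exp(93) is finite as a double but exceeds the float32 range
print(math.exp(93) > FLT_MAX)              # True -> would be Inf in single
print(math.exp(93) < sys.float_info.max)   # True -> finite as a double
```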

but if I apply exp to the entire matrix, the result contains Inf values:

ii = isinf(exp(M));
sum(sum(ii))
ans =
     2

How does Matlab store the values in the matrix so that operations on individual elements can give a different result than when doing the same operation on the matrix itself?

I.e.

expM = exp(M);
exp(M(1)) == expM(1) ; %can be false, which I find surprising

I know I have to change the algorithm anyway, as the high exponents will give inexact results even if I can avoid Inf values. It happens in a formula in an artificial neural network calculation like:

 sum(log(1+exp(ones(numcases,1)*b_h + data*w_vh)),2);

so my plan is to split this up into two cases: where the exponent is small I do the calculation as above; for high values I approximate

log(1+exp(ones(numcases,1)*b_h + data*w_vh))

with

ones(numcases,1)*b_h + data*w_vh

Does that sound reasonable? My reasoning of course is that

log(1+exp(x)) ≈ log(exp(x)) ≈ x, for large x
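The case split can also be avoided entirely with the standard numerically stable rewrite of this function (often called softplus): log(1+exp(x)) = max(x,0) + log1p(exp(-|x|)), which never evaluates exp of a large positive argument. A Python sketch (the function name `softplus` is my own):

```python
import math

def softplus(x):
    """Numerically stable log(1 + exp(x)).

    exp(x) overflows for large positive x, but exp(-|x|) never does:
    log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|)).
    """
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(softplus(0))     # log(2) ~= 0.6931
print(softplus(1000))  # 1000.0 -- the naive log(1 + exp(1000)) overflows
```

For large x this reduces exactly to the x approximation above, and for small x it keeps full accuracy, so no threshold has to be chosen.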

btw: is there a better way to get the maximum element of a matrix other than doing max twice as in max(max(M))?

+2  A: 

Ok, I found the error: my matrix was of type single, but when I copied a value into a new variable, that variable was a double, which of course has a different maximum value. I'm answering this myself so the question won't stay unanswered. Thanks for the tips, I found the cause while trying to build a repro case :)

Ben Schwehn
I can confirm that `exp(single(93))` results in `Inf`, while `exp(93)` does not.
gnovice
And it makes perfect sense now, but it had me confused for quite some time. I didn't even consider that the matrix might not be a "normal" matrix with double precision...
Ben Schwehn
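gnovice's `exp(single(93))` check can be reproduced outside MATLAB as well. A Python sketch that simulates single precision by round-tripping a double through IEEE-754 float32 (the `to_single` helper is my own, not a library function):

```python
import math
import struct

def to_single(x):
    """Round-trip a Python float (a double) through IEEE-754 float32.

    Values beyond the float32 range become +/-Inf, mimicking what
    happens when a MATLAB single-precision value overflows.
    """
    try:
        return struct.unpack('f', struct.pack('f', x))[0]
    except OverflowError:  # struct refuses finite doubles above float32 max
        return math.inf if x > 0 else -math.inf

print(math.isinf(math.exp(93)))             # False: fine as a double
print(math.isinf(to_single(math.exp(93))))  # True: Inf in single precision
```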
