Hi, I am trying to run a NN learner on the same data in 2 different programs. Although everything (the data and parameters) is the same, and I manually set the initial weights to the same value (0), I end up with different weights in the 2 programs. The interesting thing is that running each program repeatedly gives the same result each time, so each program is deterministic on its own. What I have debugged and checked so far:

1) Input data,

2) NN Parameters

3) Init. weights

4) Class labels (output nodes)

and these are all the same. What else should I check?

+1  A: 

I think what is happening is that the NN training algorithm is choosing a random seed, and because of that your weight matrices end up with different values in the two trainings.

Second, as long as you used the same training set, the resulting classifications from the two networks should be practically the same.
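If you want bit-identical weights across the two programs, fix the random seed in both. A minimal NumPy sketch (the `init_weights` helper is hypothetical, just for illustration; your library will have its own seeding mechanism):

```python
import numpy as np

def init_weights(n_in, n_out, seed=42):
    # Fixing the seed makes the "random" initial weights reproducible
    # across runs and across programs.
    rng = np.random.default_rng(seed)
    return rng.uniform(-0.5, 0.5, size=(n_in, n_out))

w1 = init_weights(4, 3)
w2 = init_weights(4, 3)
print(np.array_equal(w1, w2))  # same seed -> identical weight matrices
```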

Andres
A: 

As answer 1 said, it's probably initializing the weights to random values. In general, you should not initialize NN weights to zero: with all-zero weights, every hidden unit computes the same output and receives the same gradient, so the units can never differentiate from one another, and the zero point is always a saddle point of the error surface. That means a typical backpropagation training algorithm will fail to make progress.
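A quick way to see this (a minimal NumPy sketch, not the asker's actual code): in a tiny two-layer net with zero weights and no biases, every backpropagation gradient is exactly zero, so training never moves away from the starting point.

```python
import numpy as np

x = np.array([[1.0, 2.0]])     # one training sample
y = np.array([[1.0]])          # target output
W1 = np.zeros((2, 3))          # input -> hidden weights, zero-initialized
W2 = np.zeros((3, 1))          # hidden -> output weights, zero-initialized

for _ in range(100):
    h = np.tanh(x @ W1)                     # hidden activations: all zero
    out = h @ W2                            # network output: zero
    d_out = out - y                         # squared-error gradient
    dW2 = h.T @ d_out                       # zero, because h is zero
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)   # zero, because W2 is zero
    dW1 = x.T @ d_h                         # zero as well
    W2 -= 0.1 * dW2
    W1 -= 0.1 * dW1

print(np.abs(W1).max(), np.abs(W2).max())  # both 0.0: training never moved
```

Even a tiny random initialization (e.g. uniform in [-0.5, 0.5]) breaks this symmetry and lets the gradients become nonzero.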

Jive Dadson