I'm trying to train an ANN (I use this library: http://leenissen.dk/fann/ ) and the results are somewhat puzzling - basically, if I run the trained network on the same data used for training, the output is not what was specified in the training set but some seemingly random number.

For example, the first entry in the training file is something like

88.757004 88.757004 104.487999 138.156006 100.556000 86.309998 86.788002 
1

with the first line being the input values and the second line the desired output neuron's value. But when I feed the exact same data to the trained network, I get different results on each training attempt, and they are quite different from 1, e.g.:

Max epochs   500000. Desired error: 0.0010000000.
Epochs            1. Current error: 0.0686412785. Bit fail 24.
Epochs          842. Current error: 0.0008697828. Bit fail 0.
my test result -4052122560819626000.000000

and then on another attempt:

Max epochs   500000. Desired error: 0.0010000000.
Epochs            1. Current error: 0.0610717005. Bit fail 24.
Epochs          472. Current error: 0.0009952184. Bit fail 0.
my test result -0.001642

I realize that the training set size may be inadequate (I only have about 100 input/output pairs so far), but shouldn't the network at least produce the right output value for the training data itself? The same code works fine for the "getting started" XOR function described on FANN's website (I've already used up my 1 link limit).
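
For reference, my train-and-test code is essentially FANN's XOR example adapted to 7 inputs, roughly like this (a minimal sketch; the hidden-layer size and the file name "train.data" are placeholders, not my exact values):

    #include <stdio.h>
    #include "fann.h"

    int main(void)
    {
        /* 7 inputs, 1 output as in my data; 5 hidden neurons is a placeholder */
        struct fann *ann = fann_create_standard(3, 7, 5, 1);

        /* train until the MSE drops below 0.001, as in the logs above */
        fann_train_on_file(ann, "train.data", 500000, 1000, 0.001f);

        /* feed the first training pattern back through the trained network */
        fann_type input[7] = {88.757004f, 88.757004f, 104.487999f, 138.156006f,
                              100.556000f, 86.309998f, 86.788002f};
        fann_type *output = fann_run(ann, input);
        printf("my test result %f\n", output[0]);

        fann_destroy(ann);
        return 0;
    }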

+3  A: 

Short answer: No

Longer answer (but possibly not as correct):

1st: A training run only moves the weights of the neurons towards a position where they produce output close to that in the training data. After some/many iterations the output should be close to the expected output, if the neural network is up to the task at all, which brings me to
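
To illustrate what a single step does, here is the delta rule for one linear neuron, a minimal sketch and not FANN's actual internals: each weight is only nudged in the direction that reduces the error, so the output drifts towards the target over many iterations instead of matching it exactly.

    /* one gradient step for a single linear neuron (the delta rule);
       illustrative only, not FANN's implementation */
    float train_step(float *w, const float *x, int n, float target, float lr)
    {
        float y = 0.0f;
        for (int i = 0; i < n; i++)   /* forward pass: weighted sum */
            y += w[i] * x[i];

        float err = target - y;       /* how far off the output still is */
        for (int i = 0; i < n; i++)   /* nudge each weight towards the target */
            w[i] += lr * err * x[i];

        return err * err;             /* squared error for this pattern */
    }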

2nd: Not every neural network works for every problem. For a single neuron it is pretty easy to come up with a simple function that cannot be approximated by it (XOR is the classic example). Though not as easy to see, the same kind of limit applies to every neural network. In such cases your results will very likely look like random numbers. Edit after comment: In many cases this can be fixed by adding neurons to the network, as in the sketch below.
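
With FANN, capacity is just the layer sizes passed to fann_create_standard, so trying a bigger network is a one-line change (the sizes below are arbitrary examples, not a recommendation for your data):

    #include "fann.h"

    int main(void)
    {
        /* 7 inputs and 1 output in all cases; hidden sizes are examples */
        struct fann *small  = fann_create_standard(3, 7, 3, 1);    /* 3 hidden neurons  */
        struct fann *bigger = fann_create_standard(3, 7, 12, 1);   /* 12 hidden neurons */
        struct fann *deeper = fann_create_standard(4, 7, 8, 8, 1); /* two hidden layers */

        fann_destroy(small);
        fann_destroy(bigger);
        fann_destroy(deeper);
        return 0;
    }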

3rd: Actually the first point is a strength of a neural network, because it allows the network to handle outliers nicely.

4th: I blame point 3 for my lacking understanding of music. It just doesn't fit my brain ;-)

Jens Schauder
Thank you! Now I at least have some direction.
7macaw
RE your second point: it depends on how you define 'approximate'. If you vary the number of internal neurons, you can approximate functions better (kinda like adding more terms to a Taylor approximation).
Jeremy Powell
+2  A: 

No, if you get your ANN to work perfectly on the training data, you either have a really easy problem or you're overfitting.
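
A quick way to check which case you're in with FANN is to hold back some of your pairs in a second file and compare the error on both sets, a rough sketch ("train.data" and "holdout.data" are assumed file names, and the layer sizes are placeholders):

    #include <stdio.h>
    #include "fann.h"

    int main(void)
    {
        struct fann *ann = fann_create_standard(3, 7, 5, 1);
        fann_train_on_file(ann, "train.data", 500000, 1000, 0.001f);

        struct fann_train_data *train   = fann_read_train_from_file("train.data");
        struct fann_train_data *holdout = fann_read_train_from_file("holdout.data");

        /* fann_test_data runs a whole data set through the network and
           returns the MSE; a much lower error on the training set than
           on the held-out set is the classic sign of overfitting */
        printf("train MSE:   %f\n", fann_test_data(ann, train));
        printf("holdout MSE: %f\n", fann_test_data(ann, holdout));

        fann_destroy_train(train);
        fann_destroy_train(holdout);
        fann_destroy(ann);
        return 0;
    }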

dsimcha