I have a yes/no classification problem, where false positives are worse than false negatives.

Is there a way to incorporate this asymmetry into a neural network, in particular using MATLAB's Neural Network Toolbox?

+4  A: 

What you need is a cost-sensitive meta-classifier (a meta-classifier wraps any underlying classifier, be it an ANN, an SVM, or any other model).

This can be done in two ways:

  • re-weighting training instances according to a cost matrix. This is done by resampling the data so that the class whose misclassification is more expensive is over-represented; the resulting model is therefore more sensitive to that class than to the others.
  • predicting the class with minimum expected misclassification cost (rather than the most likely class). The idea here is to minimize the total expected costs by making cheap mistakes more often and expensive mistakes less often.
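The second approach can be sketched in a few lines. The following is an illustration in Python/NumPy rather than MATLAB, and the 10:1 cost ratio is an assumption made up for the example, not something from the question:

```python
import numpy as np

# Hypothetical cost matrix: rows = true class, columns = predicted class.
# A false positive (true 0, predicted 1) is assumed to cost 10x more than
# a false negative (true 1, predicted 0).
COST = np.array([[0.0, 10.0],   # true negative, false positive
                 [1.0,  0.0]])  # false negative, true positive

def min_expected_cost_class(class_probs, cost=COST):
    """Pick the class whose expected misclassification cost is lowest.

    class_probs[k] is the classifier's estimated P(true class = k).
    The expected cost of predicting j is sum_k class_probs[k] * cost[k][j].
    """
    expected = class_probs @ cost  # expected cost of each possible prediction
    return int(np.argmin(expected))
```

With P(positive) = 0.6 the most likely class is the positive one, but the expected cost of predicting positive is 0.4 * 10 = 4.0 versus 0.6 * 1 = 0.6 for predicting negative, so the cost-sensitive decision is the negative class: expensive mistakes are made less often.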

One algorithm that implements the first approach is SECOC, which uses error-correcting codes; an example of the second approach is MetaCost, which uses bagging to improve the classifier's probability estimates.
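The re-weighting idea itself needs no particular toolkit and can be sketched as plain resampling; the cost values and class encoding below are illustrative assumptions:

```python
import random

def cost_resample(instances, labels, fn_cost=1, fp_cost=10, seed=0):
    """Re-weight training data by duplicating instances of the class whose
    misclassification is more expensive.

    Misclassifying a negative (label 0) produces a false positive, assumed
    here to cost 10x a false negative, so negatives are over-sampled 10x.
    A model trained on the result is biased against predicting positive.
    """
    weights = {0: fp_cost, 1: fn_cost}  # integer duplication factors
    resampled = []
    for x, y in zip(instances, labels):
        resampled.extend([(x, y)] * weights[y])
    random.Random(seed).shuffle(resampled)  # avoid long runs of one class
    return resampled
```

Real implementations (e.g. cost-sensitive wrappers in ML toolkits) typically use fractional instance weights or weighted bootstrap sampling instead of integer duplication, but the effect on the learned decision boundary is the same in spirit.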

Amro
+1  A: 

You can use a custom cost function. This is what I did recently:

cost(true negative) = 0
cost(true positive) = 0
cost(false positive) = infinity
cost(false negative) = L

This can be encoded, for example, by the formula:

cost(y, t) = -(1 - t) log(1 - y) + L * t * (1 - y)

where y is the network output and t the target. The signs are chosen so the cost is non-negative and minimized during training: for t = 0 the cost grows without bound as y → 1 (false positive), while for t = 1 it falls linearly from L at y = 0 (false negative) to 0 at y = 1.

This requires some derivation and implementation work, of course, and is not available out of the box in MATLAB's toolbox.
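Concretely, the cost and the derivative needed for backpropagation might look like the following Python/NumPy sketch. The value of L, the eps clipping, and the sign convention (cost non-negative, to be minimized) are assumptions of this illustration:

```python
import numpy as np

L = 5.0  # assumed false-negative cost; any positive value works

def asymmetric_cost(y, t, L=L, eps=1e-12):
    """Asymmetric cost: y = network output in (0, 1), t = target in {0, 1}.

    A confident false positive (t = 0, y -> 1) is punished without bound;
    a false negative (t = 1, y -> 0) costs L; correct confident answers
    cost ~0.
    """
    y = np.clip(y, eps, 1.0 - eps)  # avoid log(0)
    return -(1.0 - t) * np.log(1.0 - y) + L * t * (1.0 - y)

def asymmetric_cost_grad(y, t, L=L, eps=1e-12):
    """Derivative w.r.t. y, for backpropagation:
    d/dy [ -(1-t) log(1-y) + L t (1-y) ] = (1-t)/(1-y) - L t
    """
    y = np.clip(y, eps, 1.0 - eps)
    return (1.0 - t) / (1.0 - y) - L * t
```

Plugging this in as a custom performance function (in place of the default mean squared error) is the part that has to be wired up by hand in the toolbox.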

bayer
