Hi all,

I have a very simple linear classification problem: work out a linear classifier for the following three classes of points in the plane:

Class 1: (0, 1), (1, 0)
Class 2: (-1, 0), (1, 0)
Class 3: (0, -1), (1, -1)

I manually applied each iteration over the six samples, starting from an arbitrary initial weight matrix [[1, 0], [0, 1]] (the 2x2 identity) and an arbitrary initial bias [1, 1]. I end up with the decision boundaries x = -1 and y = -1: when x and y are both > -1 it is class 1; if x <= -1 and y > -1 it is class 2; if x > -1 and y <= -1 it is class 3.
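
A minimal sketch of such a setup in Python, assuming a perceptron update rule with a hard-limit activation and a two-output class code (class 1 = [1, 1], class 2 = [0, 1], class 3 = [1, 0]); the post does not state these details, so treat them as assumptions:

    import numpy as np

    # Six labelled samples; note (1, 0) appears with two different labels.
    X = np.array([[0, 1], [1, 0],      # class 1
                  [-1, 0], [1, 0],     # class 2
                  [0, -1], [1, -1]])   # class 3
    # Assumed two-output target code:
    # class 1 = [1, 1], class 2 = [0, 1], class 3 = [1, 0].
    T = np.array([[1, 1], [1, 1],
                  [0, 1], [0, 1],
                  [1, 0], [1, 0]])

    W = np.array([[1.0, 0.0], [0.0, 1.0]])  # initial weights (2x2 identity)
    b = np.array([1.0, 1.0])                # initial bias

    def hardlim(a):
        return (a >= 0).astype(float)       # hard-limit activation

    for epoch in range(10):
        for p, t in zip(X, T):              # one pass = one iteration over the six samples
            e = t - hardlim(W @ p + b)      # per-sample error
            W += np.outer(e, p)             # perceptron rule: W <- W + e p^T
            b += e                          # b <- b + e
    print(W, b)

Because (1, 0) carries two conflicting labels, the first output can never be right on both copies of that point, so the updates for it never settle.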

After plotting this on a graph, I think there is a problem: the decision boundaries pass through samples in classes 2 and 3, and I wonder whether that is acceptable. Looking at the graph, I would say the ideal boundaries are x = -1/2 and y = 1/2, but I really cannot get that result from the calculation.

Please share your thoughts; thanks in advance.

A: 

The results are not acceptable. Classes 2 and 3 are linearly separable, so you shouldn't accept any classifier that fails to separate them perfectly.

As far as I know, with these samples and a feed-forward network trained with backpropagation, you are unlikely to get your desired boundaries at x = -1/2 and y = 1/2. You need a maximum-margin classifier for that.

I recommend looking at a linear SVM classifier. You can check SVMlight for multiclass problems.
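
SVMlight is a command-line tool, but purely as an illustration of what a maximum-margin linear classifier does with these points, here is a sketch using scikit-learn's LinearSVC instead (a substitute for SVMlight, not its actual interface):

    import numpy as np
    from sklearn.svm import LinearSVC

    # The six points from the question, with their class labels.
    X = np.array([[0, 1], [1, 0], [-1, 0], [1, 0], [0, -1], [1, -1]])
    y = np.array([1, 1, 2, 2, 3, 3])

    clf = LinearSVC(C=1.0)  # soft-margin linear SVM, one-vs-rest for 3 classes
    clf.fit(X, y)
    print(clf.predict(X))   # the duplicated point (1, 0) cannot satisfy both labels

The soft margin matters here: because (1, 0) appears under two labels, no classifier can get all six points right, and the SVM simply takes the hit on one copy.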

smmv
+1  A: 

I'd say the results are acceptable. All the points are correctly classified except for the point at (1, 0), which is labelled as class 2 but classified as class 1. The problem is that there is also a point at (1, 0) labelled as class 1, so it's impossible to separate classes 1 and 2.

Of course, the model will quite probably be awful when evaluated on a test set. If you want the decision boundaries placed equidistant between the points, you need to look at maximum-margin classifiers.
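
To make the misclassification concrete, here is a quick check of the questioner's x = -1, y = -1 boundaries against the labels (the classify helper below is hypothetical; it just encodes the rule stated in the question):

    # The six labelled points from the question.
    points = [((0, 1), 1), ((1, 0), 1), ((-1, 0), 2),
              ((1, 0), 2), ((0, -1), 3), ((1, -1), 3)]

    def classify(x, y):
        # Decision rule from the question (boundaries x = -1, y = -1).
        if x > -1 and y > -1:
            return 1
        if x <= -1:
            return 2
        return 3

    for (x, y), label in points:
        pred = classify(x, y)
        if pred != label:
            print((x, y), "labelled", label, "classified as", pred)
    # Prints only: (1, 0) labelled 2 classified as 1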

StompChicken