
I am trying to use the kernlab R package to do Support Vector Machine (SVM) classification. For my very simple example, I have two pieces of training data, A and B.

(A and B are of type matrix - they are adjacency matrices for graphs.)

So I wrote a function which takes A and B and generates a kernel matrix.
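
A minimal sketch of what such a function might look like (graph_kernel() below is a hypothetical placeholder, not my actual kernel computation):

# Hypothetical sketch: build a symmetric Gram matrix from a list of graphs;
# graph_kernel() stands in for the real pairwise graph-kernel computation.
make_km <- function(graphs, graph_kernel) {
  n <- length(graphs)
  km <- matrix(0, n, n)
  for (i in 1:n) {
    for (j in i:n) {
      km[i, j] <- graph_kernel(graphs[[i]], graphs[[j]])
      km[j, i] <- km[i, j]  # kernel matrices are symmetric
    }
  }
  km
}
# km <- make_km(list(A, B), graph_kernel)

The resulting kernel matrix: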

> km
         [,1]     [,2]
[1,] 14.33333 18.47368
[2,] 18.47368 38.96053

Now I use kernlab's ksvm function to generate my predictive model. Right now, I'm just trying to get the darn thing to work - I'm not worried about training error, etc.

So, Question 1: Am I generating my model correctly? Reasonably?

# y are my classes. In this case, A is in class "1" and B is in class "-1"
> y
[1]  1 -1

> model2 =  ksvm(km, y, type="C-svc", kernel = "matrix");
> model2
Support Vector Machine object of class "ksvm" 

SV type: C-svc  (classification) 
 parameter : cost C = 1 

[1] " Kernel matrix used as input."

Number of Support Vectors : 2 

Objective Function Value : -0.1224 
Training error : 0

So far so good. We created our custom kernel matrix, and then we created a ksvm model using that matrix. We have our training data labeled as "1" and "-1".

Now to predict:

> A
     [,1] [,2] [,3]
[1,]    0    1    1
[2,]    1    0    1
[3,]    0    0    0

> predict(model2, A)
Error in as.matrix(Z) : object 'Z' not found

Uh-oh. This is okay. Kind of expected, really. "Predict" wants some sort of vector, not a matrix.

So let's try some things:

> predict(model2, c(1))
Error in as.matrix(Z) : object 'Z' not found
> predict(model2, c(1,1))
Error in as.matrix(Z) : object 'Z' not found
> predict(model2, c(1,1,1))
Error in as.matrix(Z) : object 'Z' not found
> predict(model2, c(1,1,1,1))
Error in as.matrix(Z) : object 'Z' not found
> predict(model2, km)
Error in as.matrix(Z) : object 'Z' not found

Some of the above tests are nonsensical, but that is my point: no matter what I do, I just can't get predict() to look at my data and do a prediction. Scalars don't work, vectors don't work. A 2x2 matrix doesn't work, nor does a 3x3 matrix.

What am I doing wrong here?

(Once I figure out what ksvm wants, then I can make sure that my test data can conform to that format in a sane/reasonable/mathematically sound way.)

+2  A: 

First off, I have not used kernlab much. But simply looking at the docs, I do see working examples for the predict.ksvm() method. Copying and pasting, and omitting the prints to screen:

 ## example using the promotergene data set
 data(promotergene)

 ## create test and training set
 ind <- sample(1:dim(promotergene)[1],20)
 genetrain <- promotergene[-ind, ]
 genetest <- promotergene[ind, ]

 ## train a support vector machine
 gene <- ksvm(Class~., data=genetrain, kernel="rbfdot",
              kpar=list(sigma=0.015), C=70, cross=4, prob.model=TRUE)

 ## predict gene type probabilities on the test set
 genetype <- predict(gene,genetest,type="probabilities")

That seems pretty straightforward: use random sampling to generate a training set genetrain and its complement genetest, fit via ksvm(), then call the predict() method with the fit and new data in a matching format. This is very standard.

You may find the caret package by Max Kuhn useful. It provides a general evaluation and testing framework for a variety of regression, classification and machine learning methods and packages, including kernlab, and contains several vignettes plus a JSS paper.
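
For instance, a minimal sketch reusing genetrain and genetest from above (method = "svmRadial" is caret's wrapper for kernlab's RBF SVM; the resampling settings are illustrative):

 library(caret)
 ## caret tunes and fits kernlab's SVM through one interface;
 ## trainControl() handles the resampling scheme
 ctrl <- trainControl(method = "cv", number = 4)
 fit  <- train(Class ~ ., data = genetrain,
               method = "svmRadial", trControl = ctrl)
 predict(fit, genetest)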

Dirk Eddelbuettel
Right - I was reading ?predict too. In this example, you pass a kernel function ("rbfdot") and the training data ("genetrain") into ksvm(). In my case, my input is a kernel matrix, so ksvm() never gets a "data" parameter, and there isn't a clear mapping between the structure of my training data and the structure of my test data.
rascher
If you are curious, I am trying to implement graph kernels in R - so rather than classifying vector data, I am looking at graph data. Thus, my kernel function counts the random walks that are equivalent between two graphs to determine their "distance" (see the sketch after this comment).
rascher
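
A minimal sketch of such a walk-counting kernel (hypothetical code, assuming unlabeled graphs given as adjacency matrices; not rascher's actual implementation):

## Hypothetical sketch: count common walks up to length p via the direct
## (Kronecker) product graph; lambda downweights longer walks.
rw_kernel <- function(A1, A2, p = 3, lambda = 0.5) {
  Ax <- kronecker(A1, A2)   # adjacency matrix of the product graph
  walks <- diag(nrow(Ax))   # Ax^0, the identity
  total <- 0
  for (k in 1:p) {
    walks <- walks %*% Ax   # entries of Ax^k count length-k common walks
    total <- total + lambda^k * sum(walks)
  }
  total
}
## e.g. km[1, 2] <- rw_kernel(A, B)
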
+4  A: 

If you think about how the support vector machine might "use" the kernel matrix, you'll see that you can't really do this in the way you're trying (as you've seen :-)

I actually struggled a bit with this when I first was using kernlab + a kernel matrix ... coincidentally, it was also for graph kernels!

Anyway, let's first realize that since the SVM doesn't know how to calculate your kernel function, it needs these values already calculated between your new (testing) examples and the examples it picks out as the support vectors during the training step.
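
(Recall the SVM decision function: f(x) = sum_i alpha_i y_i k(x_i, x) + b, where the sum runs over the support vectors x_i. Predicting on a new example x therefore requires exactly the kernel values between x and each support vector.)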

So, you'll need to calculate the kernel matrix for all of your examples together. You'll later train on some and test on the others by removing rows + columns from the kernel matrix when appropriate. Let me show you with code.

We can use the example code in the ksvm documentation to load our workspace with some data:

library(kernlab)
example(ksvm)

You'll need to hit return a few (2) times in order to let the plots draw, and let the example finish, but you should now have a kernel matrix in your workspace called K. We'll need to recover the y vector that it should use for its labels (as it has been trampled over by other code in the example):

y <- matrix(c(rep(1,60),rep(-1,60)))

Now, pick a subset of examples to use for testing

holdout <- sample(1:ncol(K), 10)

From this point on, I'm going to:

  1. Create a training kernel matrix named trainK from the original K kernel matrix.
  2. Create an SVM model from my training set trainK
  3. Use the support vectors found from the model to create a testing kernel matrix testK ... this is the weird part. If you look at the code in kernlab to see how it uses the support vector indices, you'll see why it's being done this way. It might be possible to do this another way, but I didn't see any documentation/examples on predicting with a kernel matrix, so I'm doing it "the hard way" here.
  4. Use the SVM to predict on these features and report accuracy

Here's the code:

trainK <- as.kernelMatrix(K[-holdout,-holdout])  # 1
m <- ksvm(trainK, y[-holdout], kernel='matrix')  # 2
testK <- as.kernelMatrix(K[holdout, -holdout][,SVindex(m), drop=F]) # 3
preds <- predict(m, testK)  # 4
sum(sign(preds) == sign(y[holdout])) / length(holdout) # == 1 (perfect!)

That should just about do it. Good luck!

Responses to comment below

what does K[-holdout,-holdout] mean? (what does the "-" mean?)

Imagine you have a vector x, and you want to retrieve elements 1, 3, and 5 from it, you'd do:

x.sub <- x[c(1,3,5)]

If you want to retrieve everything from x except elements 1, 3, and 5, you'd do:

x.sub <- x[-c(1,3,5)]

So K[-holdout,-holdout] returns all of the rows and columns of K except for the rows and columns we want to hold out.

What are the arguments of your as.kernelMatrix - especially the [,SVindex(m),drop=F] argument (which is particularly strange because it looks like that entire bracket is a matrix index of K)?

Yeah, I inlined two commands into one:

testK <- as.kernelMatrix(K[holdout, -holdout][,SVindex(m), drop=F])

Now that you've trained the model, you want to give it a new kernel matrix with your testing examples. K[holdout,] would give you only the rows of K which correspond to the testing examples, and all of the columns of K.

SVindex(m) gives you the indices of your support vectors from your original training matrix -- remember, those rows/cols have the holdout removed. So for those column indices to be correct (i.e. reference the correct support-vector columns), I must first remove the holdout columns.

Anyway, perhaps this is more clear:

testK <- K[holdout, -holdout]
testK <- testK[,SVindex(m), drop=FALSE]

Now testK only has the rows of our testing examples and the columns that correspond to the support vectors. testK[1,1] will have the value of the kernel function computed between your first testing example, and the first support vector. testK[1,2] will have the kernel function value between your 1st testing example and the second support vector, etc.

Steve Lianoglou
Okay, so I ran your code as written, and it works! But I'm having a little bit of trouble understanding what it does. Could you possibly help me understand: what does K[-holdout,-holdout] mean? (what does the "-" mean?) What are the arguments of your as.kernelMatrix - especially the [,SVindex(m),drop=F] argument (which is particularly strange because it looks like that entire bracket is a matrix index of K)?
rascher
Since it was long, I responded to your comment in my original post -- read the bottom half. If you have more questions, you can always come to the r-help list to ask more questions. Lastly, if my response does answer your question, don't forget to mark it as such ;-)
Steve Lianoglou
Dude. You rock so much. Thank you!
rascher