Hi.

I've set up an ANN with back-propagation as a book recommendation system.

I only have one hidden layer in the network, and both the input and output layers represent books, since you should be able to enter a book and receive recommendations for more books.

I already have a lot of data about users and their ratings (1 to 5) for books. I'd like to load this data into the neural network (i.e., set all the weights) without going through the back-propagation process; in other words, set the weights somehow more mechanically.

Say each user has rated about 100 books in their library; how do I set the weights for them? And how do the books in a user's library relate to the other books in that same library?

Thanks.

+1  A: 

So it sounds like you don't want to use a neural network. I have found that neural networks are more difficult to work with and take longer to train in sparse feature spaces like this one. What you want to predict (I believe) is books a person might like, given their own ratings and the ratings of people who made similar purchases.

You should look at collaborative filtering. I am not saying that you can't solve the problem with neural networks, but I am not sure you would really want to.
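To make the suggestion concrete, here is a minimal sketch of item-based collaborative filtering: compute cosine similarity between the books' rating columns, then recommend the books most similar to the one entered. The rating matrix and the number of recommendations are invented for illustration.

```python
import numpy as np

# rows = users, cols = books; 0 means "not rated", ratings are 1-5
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 2],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

def item_similarity(r):
    """Cosine similarity between the book (column) rating vectors."""
    norms = np.linalg.norm(r, axis=0)
    norms[norms == 0] = 1.0          # guard against unrated books
    unit = r / norms
    return unit.T @ unit

def recommend(book, r, top_n=2):
    """Indices of the books most similar to `book`, excluding itself."""
    sim = item_similarity(r)
    order = np.argsort(sim[book])[::-1]   # most similar first
    return [b for b in order if b != book][:top_n]

print(recommend(0, ratings))  # books rated similarly to book 0
```

With this toy data, users who liked book 0 also liked book 1 (and to a lesser extent book 3), so those come back first.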

Steve
Agreed. A neural network is not going to "have data put into it" easily, outside of using a training algorithm such as back-propagation. Not saying it's not possible, but that's more of a research question for someone interested in NNs, methinks. A decision tree (or boosting with decision stumps) might be suitable for your application. WEKA has these, and a nice tutorial about how to use them in WEKA can be found here: http://www.ibm.com/developerworks/opensource/library/os-weka1/index.html
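For what a decision stump amounts to here, a toy sketch: pick the single (feature, threshold) split over a user's other ratings that best separates "liked the target book" from "didn't". The feature matrix, labels, and polarity handling below are invented for the example; WEKA's DecisionStump does the real thing.

```python
def best_stump(X, y):
    """Return (errors, feature, threshold) for the best single split."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            pred = [1 if row[f] >= t else 0 for row in X]
            err = sum(p != label for p, label in zip(pred, y))
            err = min(err, len(y) - err)    # allow flipped polarity
            if best is None or err < best[0]:
                best = (err, f, t)
    return best

# features: a user's ratings of two other books; label: liked target book?
X = [[5, 1], [4, 2], [1, 5], [2, 4]]
y = [1, 1, 0, 0]
print(best_stump(X, y))
```

Boosting would fit many such stumps in sequence, reweighting the misclassified users each round.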
Craig W. Wright
I have tried something like this before, where I ran an expensive pass over all the examples to estimate the weights and then trained from there. For an NN that pass tends to be too expensive. It is cheaper to use a distribution-based estimate of the weights and start from there.
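One way to sketch that "estimate first, then train" idea: seed the input-to-hidden weights of the one-hidden-layer net from item-item rating correlations instead of random values, then let back-propagation refine them. The rating matrix below is invented, and sizing the hidden layer to the number of books is purely for illustration.

```python
import numpy as np

# rows = users, cols = books; tiny invented example
ratings = np.array([
    [5, 4, 1],
    [4, 5, 2],
    [1, 2, 5],
    [2, 1, 4],
], dtype=float)

# centre each book's ratings and compute pairwise correlations
centred = ratings - ratings.mean(axis=0)
cov = centred.T @ centred
std = np.sqrt(np.diag(cov))
corr = cov / np.outer(std, std)

# seed the input->hidden weights from the correlations rather than
# random initialisation; back-propagation would then fine-tune them
w_input_hidden = corr.copy()
print(w_input_hidden.shape)
```

Books rated alike start out strongly connected, so the untrained network already produces sensible recommendations.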
Steve