views:

1489

answers:

13

I'm really interested in Neural nets, but I'm looking for a place to start.

What resources are out there and what is a good starting project(s)?

+6  A: 

Here are some examples of neural net programming: http://www.codeproject.com/KB/recipes/neural_dot_net.aspx

You can start reading here: http://www.geocities.com/CapeCanaveral/Lab/3765/neural.html

For my part, I took a course on the subject and worked through some of the literature.

Regards Friedrich

Friedrich
Geocities went down a few days ago but there's an archived version at http://web.archive.org/web/20071025010456/http://www.geocities.com/CapeCanaveral/Lab/3765/neural.html (at least for now...)
RCIX
+11  A: 

I'd highly recommend this excellent series by Anoop Madhusudanan on Code Project.

He takes you through the fundamentals of how they work in an easy-to-understand way, and shows you how to use his brainnet library to create your own.

Ben Daniel
Wow, cool. You can read it from my blog as well, http://amazedsaint.blogspot.com/2008/01/neural-networks-part-i-simple.html
amazedsaint
Nice documentation!
Cédric Boivin
A: 

I think a good starting point would always be Wikipedia. There you'll find some useful links to documentation and to projects that use neural nets, too.

Peter
+1  A: 

If you don't mind spending money, The Handbook of Brain Theory and Neural Networks is very good. It contains 287 articles covering research in many disciplines. It starts with an introduction and theory and then highlights paths through the articles to best cover your interests.

As for a first project, Kohonen maps are interesting for categorization: find hidden relationships in your music collection, build a smart robot, or solve the Netflix prize.
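To get a feel for what a Kohonen map does before tackling a project like those, here is a deliberately tiny, illustrative sketch (toy code, not from the answer): a one-dimensional self-organizing map whose units spread out to cover scalar inputs, with each input pulling its best-matching unit and that unit's neighbours toward it.

```python
import random

random.seed(0)
n_units = 5
# Each unit has a single weight; real SOMs use weight vectors and a 2-D grid.
weights = [random.random() for _ in range(n_units)]

def train(samples, epochs=50, lr=0.3, radius=1):
    for _ in range(epochs):
        for x in samples:
            # Best-matching unit: the unit whose weight is closest to the input.
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            # Pull the BMU and its grid neighbours toward the input.
            for i in range(n_units):
                if abs(i - bmu) <= radius:
                    weights[i] += lr * (x - weights[i])

train([random.random() for _ in range(100)])
print(sorted(weights))  # the units spread out to cover the input range
```

A production SOM would also shrink the learning rate and neighbourhood radius over time, which is what makes the "hidden relationships" in real data emerge as a topology-preserving map.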

Corbin March
+6  A: 

Neural networks are kind of déclassé these days. Support vector machines and kernel methods are better for more classes of problems than back propagation. Neural networks and genetic algorithms capture the imagination of people who don't know much about modern machine learning, but they are not state of the art.

If you want to learn more about AI/machine learning, I recommend buying and reading Russell and Norvig's Artificial Intelligence: A Modern Approach. It's a broad survey of AI covering many modern techniques. It goes over the history and older techniques too, and will give you a more complete grounding in the basics of AI/machine learning.

Neural networks are pretty easy, though, especially if you use a genetic algorithm to determine the weights rather than proper back propagation.
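As an illustrative sketch of that idea (hypothetical toy code, not the answerer's actual approach): evolve the weights of a single sigmoid neuron with a bare-bones genetic algorithm, using only elitist selection and Gaussian mutation, here to learn the AND function.

```python
import math
import random

random.seed(1)
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def output(w, x):
    # w = [w1, w2, bias]; a single sigmoid neuron.
    s = w[0] * x[0] + w[1] * x[1] + w[2]
    return 1.0 / (1.0 + math.exp(-s))

def fitness(w):
    # Negative squared error over the dataset: higher is better.
    return -sum((output(w, x) - y) ** 2 for x, y in DATA)

pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]  # keep the fittest (elitism)
    # Refill the population with mutated copies of the survivors.
    pop = survivors + [
        [w + random.gauss(0, 0.3) for w in random.choice(survivors)]
        for _ in range(20)
    ]

best = max(pop, key=fitness)
for x, y in DATA:
    print(x, round(output(best, x)))
```

No gradients anywhere: the GA only ever evaluates the error, which is exactly why it is easier to get running than back propagation (and also why it scales poorly to large networks).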

Chad Okere
I totally agree with you there! Neural networks sound way better than they actually are, especially compared to other methods.
Ray Hidayat
+1 for the Russell and Norvig book - it was my University AI text! I found Neural Nets quite hard, though...
alastairs
Neural networks do not consist only of backpropagation; there are tons of other architectures: associative memories, Kohonen SOFMs, adaptive resonance-based networks, and so on. MLPs with backpropagation are the most popular networks, but not the most performant.
lmsasu
Thanks, I didn't know there were other better options.
PRINCESS FLUFF
+6  A: 

First of all, give up any notion that artificial neural networks have anything to do with the brain beyond a passing similarity to networks of biological neurons. Learning biology won't help you effectively apply neural networks; learning linear algebra, calculus, and probability theory will. You should at the very least familiarize yourself with basic differentiation of functions, the chain rule, partial derivatives (the gradient, the Jacobian, and the Hessian), and with matrix multiplication and diagonalization.

Really, what you are doing when you train a network is optimizing a large, multidimensional function (minimizing your error measure with respect to each of the weights in the network), so an investigation of techniques for nonlinear numerical optimization may prove instructive. This is a widely studied problem with a large base of literature outside of neural networks, and there are plenty of lecture notes in numerical optimization available on the web. To start, most people use simple gradient descent, but this can be much slower and less effective than more nuanced methods.
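To make that concrete, here is a minimal illustrative sketch (not from the original answer) of the optimization view: fit a single weight w in y = w·x by repeatedly stepping along the negative gradient of the squared error, which is the chain-rule computation that backpropagation generalizes to many weights.

```python
# Toy dataset generated by y = 2x, so the optimal weight is w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def error(w):
    # The multidimensional error surface, here collapsed to one dimension.
    return sum((w * x - y) ** 2 for x, y in data)

def gradient(w):
    # d/dw of sum (w*x - y)^2 = sum 2*(w*x - y)*x, by the chain rule.
    return sum(2 * (w * x - y) * x for x, y in data)

w, lr = 0.0, 0.01
for _ in range(100):
    w -= lr * gradient(w)  # step downhill on the error surface

print(w)  # converges toward 2.0
```

A real network just repeats this for thousands of weights at once, which is why step size, conditioning, and the choice of optimizer matter so much.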

Once you've got the basic ideas down you can start to experiment with different "squashing" functions in your hidden layer, adding various kinds of regularization, and various tweaks to make learning go faster. See this paper for a comprehensive list of "best practices".

One of the best books on the subject is Chris Bishop's Neural Networks for Pattern Recognition. It's fairly old by this stage but is still an excellent resource, and you can often find used copies online for about $30. The neural network chapter in his newer book, Pattern Recognition and Machine Learning, is also quite comprehensive. For a particularly good implementation-centric tutorial, see this one on CodeProject.com which implements a clever sort of network called a convolutional network, which constrains connectivity in such a way as to make it very good at learning to classify visual patterns.

Support vector machines and other kernel methods have become quite popular because you can apply them without knowing what the hell you're doing and often get acceptable results. Neural networks, on the other hand, are huge optimization problems which require careful tuning, although they're still preferable for lots of problems, particularly large scale problems in domains like computer vision.

dwf
+2  A: 

I second dwf's recommendation of Neural Networks for Pattern Recognition by Chris Bishop, although it's perhaps not a starter text. Norvig or an online tutorial (with code in Matlab!) would probably be a gentler introduction.

A good starter project would be OCR (optical character recognition). You can scan in pages of text and feed each character through the network in order to perform classification. (You would have to train the network first, of course!)
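As a hedged sketch of how such a classifier might begin (toy code, not the answerer's actual project): a single perceptron learning to tell two tiny 3×3 "characters" apart. Real OCR would use scanned images and a larger network, but the train-then-classify loop is the same.

```python
# Two hand-made 3x3 bitmaps standing in for scanned characters.
X = [1, 0, 1,
     0, 1, 0,
     1, 0, 1]          # an 'X' shape, label +1
O = [1, 1, 1,
     1, 0, 1,
     1, 1, 1]          # an 'O' shape, label -1

samples = [(X, 1), (O, -1)]
w = [0.0] * 9          # one weight per pixel
b = 0.0                # bias

def predict(pixels):
    s = sum(wi * p for wi, p in zip(w, pixels)) + b
    return 1 if s > 0 else -1

# Perceptron learning rule: nudge the weights whenever a sample is misclassified.
for _ in range(10):
    for pixels, label in samples:
        if predict(pixels) != label:
            for i in range(9):
                w[i] += label * pixels[i]
            b += label

print([predict(p) for p, _ in samples])
```

From here the natural upgrades are more character classes (one output per letter), real scanned bitmaps, and a hidden layer trained with backpropagation.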

graveca
A: 

Two books that were used during my studies:

Introductory course: An Introduction to Neural Computing by Igor Aleksander and Helen Morton.

Advanced course: Neurocomputing by Robert Hecht-Nielsen

Gamecat
A: 

I found Fausett's Fundamentals of Neural Networks a straightforward and easy-to-get-into introductory textbook.

chaos
+1  A: 

For a better understanding of neural networks, I recommend this free software:

Sharky Neural Network - Neural networks in action

Many network structures are available, and you can watch the network's learning progress live, like a movie. SNN is very interesting software.

SharkTime
+1: excellent resource, thank you!
lmsasu
A: 

http://www.ai-junkie.com/ann/evolved/nnt1.html is a clear introduction to multi-layer perceptrons, although it does not describe the backpropagation algorithm.

You can also have a look at generation5.org, which provides a lot of articles about AI in general and some great texts about neural networks.

A: 

I found the textbook "Computational Intelligence" to be incredibly helpful.

Bradley Powers
A: 

Programming Collective Intelligence discusses this in the context of search and ranking algorithms. Also, the code available here (ch. 4) illustrates the concepts discussed in the book with a Python example.

jamesaharvey