views: 2908

answers: 18

I'd like to know about specific problems you - the SO reader - have solved using artificial neural network techniques and what libraries/frameworks you used if you didn't roll your own.

Questions:

  • What problems have you used artificial neural networks to solve?
  • What libraries/frameworks did you use?

I'm looking for first-hand experiences, so please do not answer unless you have that.

+2  A: 

I know 3 good Neural-Network libraries:

  1. AForge is my favorite. It is written in .NET (there is a CodeProject article on AForge)

  2. FANN is a good, well-established library (see the sketch at the end of this answer)

  3. NeuronDotNet is another good library.

There is also more information about neural networks in these articles on CodeProject
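
For anyone who hasn't tried any of these, a minimal FANN program (closely following the XOR example in FANN's own documentation) looks roughly like the sketch below; the layer sizes, file names and training parameters are illustrative only.

    #include "fann.h"

    int main(void)
    {
        /* 3-layer network: 2 inputs, 3 hidden neurons, 1 output */
        struct fann *ann = fann_create_standard(3, 2, 3, 1);

        /* symmetric sigmoids give activations in the range -1..1 */
        fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
        fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

        /* train from a file in FANN's plain-text data format:
           at most 500000 epochs, report every 1000, stop at MSE 0.001 */
        fann_train_on_file(ann, "xor.data", 500000, 1000, 0.001f);

        fann_save(ann, "xor_float.net");
        fann_destroy(ann);
        return 0;
    }

Compile and link against the library (e.g. gcc example.c -lfann) and point it at a data file in FANN's format.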

DreamWalker
What problems did you solve using ANNs?
Hannson
+5  A: 

I used FANN to do audio signal classification (not speech recognition). It was a background toy project, mainly to learn about NNs. I enjoyed it and learnt a lot; FANN was dead easy to understand and get running (considering I knew nothing beforehand).

Unfortunately there are patents covering a vast range of neural net applications and it's easy to fall foul of them, so bear that in mind if it's anything commercial.

Stuart
Just adding this comment to remember this answer because you cannot star answers, only questions.
clyfe
+13  A: 
KushalP
55-65% accuracy? You must now be a very wealthy man.
Mick
@Mick Haha, sadly not. I don't have the kind of money required to use a high-rate brokerage system, and those figures were from one of the currency pairs. Not all of them fared so well.
KushalP
+8  A: 

I used AForge to decide if multiple-choice answer bubbles had been filled-in, checked, crossed-out, etc.

Doug McClean
+7  A: 

I've used artificial neural networks to predict the shear strength of reinforced concrete columns, as well as their rotational deformation capacity. This is a problem that has many independent variables and extremely nonlinear results, perfect for ANNs.

I compiled a large database of tests from many sources to train the network and judge its accuracy. Applications of neural networks to solve Civil Engineering problems are not uncommon.

I had used an application called Trajan to do some of this work, but I intend to roll my own solution and revisit the problem because I need more control over the data.

snicker
This sounds like an awesome application of NN, but I'm surprised you were able to get so much training data for this. Doesn't a "shear strength" data point imply that something has been "broken" to get the data? Or do people in labs spend all day breaking stuff to get the numbers? Just curious...
Caffeine Coma
People actually *do* spend days, weeks, months, and even years forming and casting concrete specimens in structural engineering labs for the sole purpose of breaking them. Concrete is such a "random" material with so many variables that its behavior is extremely hard to predict, and the only way to verify predictions is with tests. Some of the data I have used is from as far back as 1968.
snicker
+3  A: 

My PhD investigated using ANNs for a couple of different image-processing-related problems. The abstract and references to publications are here:

http://www.ademiller.com/tech/reports/thesis%5Fabstract.htm http://www.ademiller.com/blogs/tech/about-2/

The work addressed image processing and data analysis problems using back propagation networks and Kohonen self organizing maps:

1) Classifying objects in an image taken using a large astronomical telescope. Objects are either stars (point-like and spherical) or galaxies (diffuse and axially asymmetric). A system based on a back propagation network with image pre-processing achieved results comparable to other existing methods.

2) Characterising images of chest cavities captured by an electrical impedance tomography system. The goal was to automatically estimate lung volume using a neural network. This was shown to work with simulated data but never proven in a clinical trial.

3) Reconstructing incoming gamma ray trajectories from events within a gamma ray telescope. This is a very hard problem to solve by any means, and ANNs didn't produce good results.

I used a package called PlaNet along with some home-grown code I wrote as part of my work. It looks like PlaNet is still around:

http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/planet/0.html

As is Kohonen's software for maps:

http://www.cis.hut.fi/somtoolbox/theory/somalgorithm.shtml
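
For readers unfamiliar with Kohonen maps, a single SOM training step boils down to finding the best-matching unit and pulling it (and its grid neighbours) towards the input. Here is a minimal sketch in C with an arbitrary 10x10 map, 3-dimensional inputs and illustrative learning-rate/radius values; it is a generic illustration, not code from the thesis.

    #include <math.h>

    #define GRID 10   /* 10x10 map of units       */
    #define DIM  3    /* dimensionality of inputs */

    /* weight vector of each map unit */
    static double w[GRID][GRID][DIM];

    /* one SOM training step for a single input vector x */
    static void som_step(const double x[DIM], double lr, double radius)
    {
        /* 1. find the best-matching unit (smallest squared Euclidean distance) */
        int bi = 0, bj = 0;
        double best = INFINITY;
        for (int i = 0; i < GRID; i++)
            for (int j = 0; j < GRID; j++) {
                double d = 0.0;
                for (int k = 0; k < DIM; k++) {
                    double diff = x[k] - w[i][j][k];
                    d += diff * diff;
                }
                if (d < best) { best = d; bi = i; bj = j; }
            }

        /* 2. pull every unit towards the input, weighted by a Gaussian
              of its grid distance from the best-matching unit */
        for (int i = 0; i < GRID; i++)
            for (int j = 0; j < GRID; j++) {
                double gd2 = (double)((i - bi) * (i - bi) + (j - bj) * (j - bj));
                double h = exp(-gd2 / (2.0 * radius * radius));
                for (int k = 0; k < DIM; k++)
                    w[i][j][k] += lr * h * (x[k] - w[i][j][k]);
            }
    }

In practice both lr and radius are decayed over the course of training, so the map first orders itself globally and then fine-tunes locally.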

Ade Miller
+1  A: 

I was on a team that used neural networks for sound recognition (think underwater and land-based military problems) and for tracking objects moving through time and space. All very cool stuff. We built our own hardware and software that blew away the speed of traditional computers. We also worked on most of the traditional NN problems.

When you get down to it, NNs just need to multiply really, really fast.
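
To illustrate that point, the forward pass through one fully connected layer is essentially a matrix-vector multiply followed by a cheap nonlinearity; this is a generic sketch, not the hardware-specific code described above.

    #include <math.h>

    /* y = tanh(W*x + b) for one fully connected layer,
       with the n_out x n_in weight matrix W stored row-major */
    static void layer_forward(int n_in, int n_out,
                              const double *W, const double *b,
                              const double *x, double *y)
    {
        for (int i = 0; i < n_out; i++) {
            double sum = b[i];
            for (int j = 0; j < n_in; j++)
                sum += W[i * n_in + j] * x[j];  /* the multiply-accumulates dominate  */
            y[i] = tanh(sum);                   /* the nonlinearity is relatively cheap */
        }
    }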

TNT
+73  A: 
Nate Kohl
Very interesting.
David
+1 Really interesting stuff, especially the NEAT algorithms. That link will have my spare time for breakfast ;)
Cloud
+1 I also found that really interesting. Many thanks
Basiclife
The NEAT algorithms are really cool. A bit difficult to implement (essentially tracking the history of a mutation, IIRC, and then using the history in the crossover), but quite biologically sound.
jamesh
+8  A: 

I did a PhD in neural networks. In it I solved several problems related to time series. For example, I modeled a mechanism for recalling sequences of patterns (rather like remembering a phone number). We do this with a system where the part of the sequence recited so far reminds you of what pattern comes next (that's why it's very hard to recall your phone number backwards). My PhD is online here.

I wrote all my own simulation software in C.

Mick
+3  A: 

My thesis was about the prediction and evaluation of textile elasticity from an industrial robot (RV-4A) under tension. At first, about 20 different textile pieces were evaluated by an expert and assigned an empirical elasticity value in the range of 0 to 1. Then the robot performed a tension test on all of these pieces, and the elasticity curve (elongation-force) was recorded. Those data were then fed into a neural network, with the purpose of predicting the textile's empirical elasticity value as soon as possible, without being fed the whole curve. The final trained neural network was used by the robot to classify fabrics and textiles in real time without finishing the tensile test, which modelled the real empirical way an expert classifies textiles. The results were very good. For more info see

Intelligent evaluation of fabrics' extensibility from robotized tensile test, Panagiotis N. Koustoumpardis, John S. Fourkiotis, Nikos A. Aspragathos, International Journal of Clothing Science and Technology, Year: 2007, Volume: 19, Issue: 2, Pages: 80-98

I didn't use any NN library; the algorithms used were Resilient Propagation (RPROP) and LM, programmed from scratch.
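
For anyone curious what "programmed from scratch" involves, the core of RPROP is a per-weight step size that grows while the gradient keeps its sign and shrinks when it flips. Below is a minimal sketch of the update for one weight, using the usual textbook constants rather than anything from the thesis (it is the iRPROP- variant, which simply skips the update after a sign change).

    #include <math.h>

    #define ETA_PLUS  1.2
    #define ETA_MINUS 0.5
    #define DELTA_MAX 50.0
    #define DELTA_MIN 1e-6

    /* One RPROP update for a single weight.
       grad      : dE/dw from the current (batch) epoch
       prev_grad : dE/dw from the previous epoch (updated in place)
       delta     : per-weight step size (updated in place)
       w         : the weight itself (updated in place)            */
    static void rprop_update(double grad, double *prev_grad,
                             double *delta, double *w)
    {
        double change = grad * (*prev_grad);

        if (change > 0.0)                /* same sign: speed up  */
            *delta = fmin(*delta * ETA_PLUS, DELTA_MAX);
        else if (change < 0.0) {         /* sign flip: slow down */
            *delta = fmax(*delta * ETA_MINUS, DELTA_MIN);
            grad = 0.0;                  /* and skip this update */
        }

        /* step against the sign of the gradient by the adapted amount */
        if (grad > 0.0)      *w -= *delta;
        else if (grad < 0.0) *w += *delta;

        *prev_grad = grad;
    }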

John
+2  A: 

I used MATLAB's Neural Networks Toolbox for some rudimentary handwritten-digit classification. My impression is that the Neural Networks Toolbox is very powerful/useful, but the 900-something page documentation is a real beast (both for size and quality), so I can't say I enjoyed working with it a whole lot.

MarkLu
+1  A: 

About 12 years ago, working at CEDAR, I used SNNS to train a fast machine-print alphanumeric character recognizer and to discriminate between handwritten envelopes and machine-printed ones. Matlab was useful for generating different kinds of image features.

Actually writing neural network code is not difficult. Usually the kinds of problems where you want to use NN are very data driven. The issues are more about playing with the data, generating useful features and finding statistical models that work. The software issues are more about efficiently generating features and executing models.

Sean McCauliff
+5  A: 

Although I've used NNs (with variable success) to recognize text patterns (like part numbers and such), the coolest neural net implementation I did was for a very simple game which I developed in the context of a challenge/contest for users of the Numenta NuPIC framework.

I didn't submit this game for the contest owing to its incomplete user interface and general "roughness around the edges"; the neural network portion of the project, however, was functional and worked rather well.

I realize that Numenta's Hierarchical Temporal Memory (HTM) concept implemented in NuPIC (I was using version 1.3 at the time) is somewhat of a departure from traditional neural network frameworks, but it may be worthy of notice in this SO posting.

The game is one where the player has to learn to communicate with a "pet" (or an "alien being"...) implemented as an HTM network. The means of communication is exchanging [imperfect] messages drawn on a small square grid and "acting" accordingly by pressing a particular action button. The idea is to develop a "language" of sorts to express basic concepts (food, water, inside, outside, playing, ball, stick, "I need to sleep", etc.) in a consistent fashion so that the other party understands them.

The neural net portion of the project was derived from the image recognition demo which ships with NuPIC, but included a few twists, such as automatically erasing the dots that make up the image a certain amount of time after they are drawn, and ongoing mixed-mode learning/recognition, whereas the demo keeps these two phases well separated.

The interesting part of this project was how it leveraged the extreme resilience of HTMs to noise and imprecision in the messages submitted for recognition; HTMs are well known for this feature.

Maybe I should rekindle this (again, very basic/geeky) game and release it, open-source fashion, on Numenta's site or elsewhere. Another project for when I retire ;-)

mjv
+3  A: 

I'm using neural networks at the heart of a proprietary library that performs handwriting recognition on pen-based input.

Gregory Pakosz
+21  A: 

In 2007 I was part of a group of master's students tasked with classifying ground (vs. buildings, cars, trees, etc.) in photographs.

The project was focused on image processing and understanding, where the task was to attempt to extrapolate parts of panoramic 360° photographs. For example, we would take the photograph below (taken with a customized vehicle) and attempt to discover the ground cover (i.e. the road, sidewalk, etc.) in the photo.

Panoramic photograph of a street in Utrecht

If we extrapolate the ground plane of the previous image by hand, we would probably agree upon an image resembling:

Manual segmentation of a panoramic photograph

We can then consider this the ground truth.

The application our research group developed, Ground Plane Classification (GPC), uses a six-step taxonomy (proposed by M. Egmont-Petersen et al., 2002) comprising pre-processing, data reduction, segmentation, object detection and image understanding (with optimization throughout). The classification occurs in the image understanding phase, which features a feed-forward artificial neural network specially trained using a training set of panoramic photographs.

Our results typically give a margin of error of about 3 to 4%. The automatically classified image below boasts an error rate of only 1.1%.

Automatic segmentation of a panoramic photograph

Originally, we planned on taking GPS coordinates into account, but that didn't work out in the end because (a) they aren't accurate enough and (b) we didn't have a map that represents structures at the desired level of detail.

Feel free to read more about it!

Paul Lammertsma
+1  A: 

I've implemented a system for iris recognition using backpropagation neural networks in Matlab. I did use the Matlab ANN toolbox, and if you're interested, you can download a brief paper I wrote describing my experiences.

The paper includes not only material on the EBNN but also on the other techniques used in the whole system, like image segmentation and feature extraction using independent component analysis (ICA).

Padu Merloti
A: 

@DreamWalker I also plan to use NeuronDotNet for my PhD thesis, and in this regard I need your help/guidance.

Before I get down to modeling my real problem using NeuronDotNet, I just want to build some small solutions to get used to the DLL structure. The first problem that I want to model using backward propagation is the height-weight ratio. I have some height and weight data, and I want to train my NN so that if I put in some weight then I should get the correct height as an output. I have 1 input, 1 hidden and 1 output layer.

Now here is the first of many things I can't get around :) 1. My height data is in the form 1.422, 1.5422 ... etc. and the corresponding weight data is 90, 95, but the NN takes the input as 0/1 or -1/1 and gives the output in the same range. How do I address this problem?
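
The usual way to handle this (a general sketch, not NeuronDotNet-specific advice) is to rescale each input and target variable linearly into the network's working range before training, and to invert the mapping on the network's outputs afterwards; the example values below are purely illustrative.

    /* map v from [lo, hi] into [-1, 1] before feeding it to the network ... */
    static double scale_to_net(double v, double lo, double hi)
    {
        return 2.0 * (v - lo) / (hi - lo) - 1.0;
    }

    /* ... and map a network output in [-1, 1] back to the original units */
    static double scale_from_net(double y, double lo, double hi)
    {
        return lo + (y + 1.0) * (hi - lo) / 2.0;
    }

    /* e.g. a weight of 90 with observed weights spanning [40, 150]:
       scale_to_net(90.0, 40.0, 150.0) is about -0.09 */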

You should post this as a separate question - StackOverflow has questions and answers, not threads (as is typical of Internet forums).
Matt Parker
+4  A: 

I rolled my own neural-network-based autopilot for an autonomous helicopter. It used cascade correlation and online reinforcement learning (positive reinforcement for things like flying level or being near a GPS waypoint; negative reinforcement for things like crashing, etc.).
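
A rough sketch of the kind of shaped reward this describes (the terms, weights and units here are hypothetical illustrations, not the actual autopilot code):

    #include <math.h>

    /* Hypothetical shaped reward for one control step.
       roll/pitch in radians, waypoint_dist in metres. */
    static double reward(double roll, double pitch,
                         double waypoint_dist, int crashed)
    {
        if (crashed)
            return -100.0;                      /* strong negative reinforcement  */

        double r = 0.0;
        r += 1.0 - (fabs(roll) + fabs(pitch));  /* reward flying level            */
        r += 1.0 / (1.0 + waypoint_dist);       /* reward being near the waypoint */
        return r;
    }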

Bradley Powers
+1. Sounds interesting. I'll have to read up on cascade correlation. Your answer prompted me to ask a related question that you may have some insight on: http://stackoverflow.com/questions/3068658/what-techniques-exist-for-the-software-driven-locomotion-of-a-bipedal-robot
Drew Noakes
BTW how successful was the NN in the end? Did it deal with unexpected issues (gusts of wind, etc.) OK?
Drew Noakes
Actually, reinforcement learning looks very useful for what I'm trying to do. Thanks for the tip.
Drew Noakes
The NN was actually incredibly effective; it was able to deal with crazy Boston winds, being pushed, etc. Reinforcement learning is incredibly useful, particularly for being relatively simple. You can get great results from it if you're careful with how you structure the reinforcement. Sometimes you have to be clever, as measuring how well or badly the autopilot is performing can be difficult (in my case, the issue was with GPS performance, in that it tends to drift randomly regardless of the movement of the sensor).
Bradley Powers