views: 232
answers: 3
I recently wrote an extremely basic edge detection algorithm that works on an array of chars. The program is meant to detect the edges of blobs of a single particular value in the array, and it works by looking at the elements to the left, right, above, and below the current element and checking whether any of them holds a different value. The goal was not to produce a mathematical line but rather a set of ordered points representing a discretized closed-loop edge.
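For reference, here is a minimal C++ sketch of that neighbor check (the function and variable names are illustrative, not my actual code; it returns the edge cells unordered, without the loop-ordering step):

    #include <utility>
    #include <vector>

    // A cell is flagged as an edge if it holds the target value and at least
    // one of its 4-connected neighbors (left, right, up, down) differs.
    std::vector<std::pair<int, int>> findEdges(const std::vector<std::vector<char>>& grid,
                                               char target)
    {
        std::vector<std::pair<int, int>> edges;
        const int rows = static_cast<int>(grid.size());
        const int cols = rows ? static_cast<int>(grid[0].size()) : 0;

        for (int r = 0; r < rows; ++r) {
            for (int c = 0; c < cols; ++c) {
                if (grid[r][c] != target) continue;

                const int dr[] = {0, 0, -1, 1};
                const int dc[] = {-1, 1, 0, 0};
                for (int k = 0; k < 4; ++k) {
                    int nr = r + dr[k], nc = c + dc[k];
                    // Treat out-of-bounds neighbors as "different", so blob cells
                    // touching the array border also count as edge cells.
                    if (nr < 0 || nr >= rows || nc < 0 || nc >= cols ||
                        grid[nr][nc] != target) {
                        edges.emplace_back(r, c);
                        break;
                    }
                }
            }
        }
        return edges;
    }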

The algorithm works perfectly fine, except that my data contains a bit of noise, so it randomly produces edges where there should be none. This in turn wreaks havoc on some of my other programs further down the line.

There are two types of noise in the data. The first type is fairly sparse and somewhat random. The second type is a semi-continuous straight line along the x=y axis. I know the source of the first type: it's a feature of the data and there is nothing I can do about it. As for the second type, I know my program is causing it, though I haven't the slightest clue exactly how.

My question is: How should I go about removing the noise completely?

I know that the correct data consists of points that are always adjacent to one another, compact and ordered (with no gaps), forming one or more closed loops. The first type of noise is usually sparse and random, so it could easily be handled by checking whether any point next to the suspected noise point is also counted as an edge. If not, then the point is almost certainly noise and should be removed.
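A minimal sketch of that isolated-point check, assuming the edge points are stored as (row, col) pairs and that "next to" means any of the 8 surrounding cells (names are illustrative):

    #include <set>
    #include <utility>
    #include <vector>

    // Keep an edge point only if at least one of its 8 neighbors is also an
    // edge point; otherwise treat it as sparse noise and drop it.
    std::vector<std::pair<int, int>> removeIsolatedPoints(
        const std::vector<std::pair<int, int>>& edges)
    {
        std::set<std::pair<int, int>> lookup(edges.begin(), edges.end());
        std::vector<std::pair<int, int>> kept;

        for (const auto& p : edges) {
            bool hasNeighbor = false;
            for (int dr = -1; dr <= 1 && !hasNeighbor; ++dr) {
                for (int dc = -1; dc <= 1; ++dc) {
                    if (dr == 0 && dc == 0) continue;
                    if (lookup.count({p.first + dr, p.second + dc})) {
                        hasNeighbor = true;
                        break;
                    }
                }
            }
            if (hasNeighbor) kept.push_back(p);  // isolated points are discarded
        }
        return kept;
    }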

However, the second type of noise, the semi-continuous line along x=y, poses more of a problem. The line runs unbroken for random lengths (the longest stretched halfway across my entire array), and it can even intersect the actual edge.

Any ideas on how to do this?

+3  A: 

Normally in image processing you would use a median filter.

You also often do a dilate (make lines bigger) and then an erode (make lines thinner) to close up any gaps in the lines.
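For concreteness, here is one possible C++ sketch of a 3x3 median filter applied directly to the char grid (border cells are left untouched for brevity; this is illustrative, not the only way to implement it). Thin one-cell noise lines get replaced by the majority value of their neighborhood, while large blobs keep their interior and most of their boundary:

    #include <algorithm>
    #include <vector>

    std::vector<std::vector<char>> medianFilter3x3(const std::vector<std::vector<char>>& in)
    {
        std::vector<std::vector<char>> out = in;
        const int rows = static_cast<int>(in.size());
        const int cols = rows ? static_cast<int>(in[0].size()) : 0;

        for (int r = 1; r + 1 < rows; ++r) {
            for (int c = 1; c + 1 < cols; ++c) {
                char window[9];
                int n = 0;
                for (int dr = -1; dr <= 1; ++dr)
                    for (int dc = -1; dc <= 1; ++dc)
                        window[n++] = in[r + dr][c + dc];
                // Partially sort just far enough to find the median element.
                std::nth_element(window, window + 4, window + 9);
                out[r][c] = window[4];  // median of the 3x3 neighborhood
            }
        }
        return out;
    }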

Martin Beckett
I do not need to connect lines. The data is for the most part nearly perfectly clean: any edges that should be edges are picked up perfectly and there are absolutely no gaps. It's the noise that produces gaps, and it's the noise I want to remove.
Faken
You can filter the noise, but you will reduce the contrast in the image - look at median or low pass filters.
Martin Beckett
I just finished an hour-long meeting with an expert in this field, and yes, you're right: a modified median is what I need, considering the noise is always a thin line while the edges to detect are always large blobs. I apologize, I'm a mechanical engineer with zero experience in this field; apparently what I was doing wasn't even edge detection at all. Regardless, the median filter is what I need.
Faken
Generally noise sits at a different frequency than the signal - in this case the noise is high frequency, so you smooth it out with a median filter.
Martin Beckett
A: 

This is the sort of thing that I'll throw into unit tests. Get some minimal datasets that exhibit this problem (something small enough that it can be directly encoded into the test file), run the tests, and with the small dataset just step through and see what's going on.
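As a sketch of what such a test might look like, assuming the findEdges() routine sketched in the question (the function name, the test name, and the hard-coded dataset are purely illustrative; link against your own implementation):

    #include <cassert>
    #include <utility>
    #include <vector>

    // The detection routine under test (declaration only; defined elsewhere).
    std::vector<std::pair<int, int>> findEdges(const std::vector<std::vector<char>>& grid,
                                               char target);

    // A minimal, hard-coded dataset: a clean 2x2 blob of 'x' on a '.' background,
    // small enough to step through in a debugger.
    void testCleanBlobProducesOnlyItsFourCells()
    {
        const std::vector<std::vector<char>> grid = {
            {'.', '.', '.', '.'},
            {'.', 'x', 'x', '.'},
            {'.', 'x', 'x', '.'},
            {'.', '.', '.', '.'},
        };
        const auto edges = findEdges(grid, 'x');
        assert(edges.size() == 4);  // every blob cell touches the background
    }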

dash-tom-bang
+2  A: 

Noise tends to concentrate at higher frequencies, so run a low pass filter over the image before you do edge detection. I've seen this principle used to do sub-pixel edge detection.
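A minimal sketch of a 3x3 box blur as the low pass step, applied before edge detection (this assumes the char values are numeric intensities rather than category labels; for label-like data the median filter from the other answer is the better fit; border cells are left as-is):

    #include <vector>

    std::vector<std::vector<char>> boxBlur3x3(const std::vector<std::vector<char>>& in)
    {
        std::vector<std::vector<char>> out = in;
        const int rows = static_cast<int>(in.size());
        const int cols = rows ? static_cast<int>(in[0].size()) : 0;

        for (int r = 1; r + 1 < rows; ++r) {
            for (int c = 1; c + 1 < cols; ++c) {
                int sum = 0;
                for (int dr = -1; dr <= 1; ++dr)
                    for (int dc = -1; dc <= 1; ++dc)
                        sum += in[r + dr][c + dc];
                out[r][c] = static_cast<char>(sum / 9);  // average of the 3x3 window
            }
        }
        return out;
    }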

Mark Ransom
Depends on what kind of noise it is: http://en.wikipedia.org/wiki/Colors_of_noise
Potatoswatter