views:

539

answers:

6

I need a way to detect adult images on a website where users upload pictures, so that I can determine when a picture isn't acceptable for my site. Can anyone suggest a method for doing this?

I'm looking for open-source code (PHP) that could be integrated into the website and stop the user from uploading such a picture. I already found an image filter class at http://www.phpclasses.org/browse/package/3269.html, but I want code that is similar to this or, ideally, more advanced.

+10  A: 

Sorry mate, I think the bottom line here is that you aren't going to get around manual checking. There are approaches, but none is really reliable, and they all produce a lot of false positives.

Check out this question for inspiration: What is the best way to programmatically detect porn images?

If you ask me, though, it's a waste of time to look for an automated solution. Given half an hour, I could find you twenty images that would trigger a "nude" alarm even though they are perfectly innocent, and the same number the other way around. Also, nudity is not the only thing you don't want on your site.

Better to spend the time on a system that makes it really easy to manually verify the content users upload to your site, e.g. as a desktop widget, on a mobile phone, or whatever suits you best.

Pekka
There's lots of good stuff on that linked question. I especially liked the link to Amazon's Mechanical Turk.
ar
@ar Yes, there are approaches, but the Mechanical Turk costs money. In *most* cases, for small sites and projects, manual checking is really the way to go. Correct me if I'm wrong, of course - I'd be interested to hear whether anybody is using the Turk or other services successfully to fight bad uploaded content.
Pekka
+1 For acknowledging right off the bat that it can't really be done.
C. Ross
See my comment below about Crowdsifter. That's essentially Dolores Labs' business model: they are an intelligent, cheaper wrapper around the Turk.
mcpeterson
+2  A: 

The only way I know to do this is to have uploads moderated. Images go into a moderation queue and are then "passed" or "rejected" by a human being.
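Something along these lines, for example (a minimal sketch; the table and column names are only illustrative, adapt them to your own schema):

```php
<?php
// Minimal moderation-queue sketch (illustrative only).
// Assumes an "uploads" table with columns: id, path, status ('pending'|'approved'|'rejected').
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

// 1. On upload: store the file and mark it as pending.
function queueUpload(PDO $pdo, string $tmpFile, string $destDir): void {
    $dest = $destDir . '/' . uniqid('img_', true) . '.jpg';
    move_uploaded_file($tmpFile, $dest);
    $stmt = $pdo->prepare("INSERT INTO uploads (path, status) VALUES (?, 'pending')");
    $stmt->execute([$dest]);
}

// 2. In the moderation UI: a human passes or rejects each image.
function moderate(PDO $pdo, int $id, bool $approve): void {
    $stmt = $pdo->prepare("UPDATE uploads SET status = ? WHERE id = ?");
    $stmt->execute([$approve ? 'approved' : 'rejected', $id]);
}

// 3. On the public site: only ever show approved images.
$approved = $pdo->query("SELECT path FROM uploads WHERE status = 'approved'")->fetchAll();
```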

Robusto
+2  A: 

Just an idea: you could come up with a solution that uses TinEye to see where else an image is found on the web, and then pair those results with a website-filtering list. If the image mostly turns up on pornographic sites, that could give you some level of filtering.
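Roughly like this (the reverse-search call is a placeholder, not TinEye's real API; check their commercial API documentation or another reverse image search service for actual endpoints, and treat the blocklist as something you maintain yourself):

```php
<?php
// Sketch: reverse-search the image, then check which domains it appears on.
function reverseImageSearch(string $imagePath): array {
    // Hypothetical call; replace with a real TinEye (or similar) API request.
    // Should return a list of URLs where the image has been seen.
    return []; // placeholder
}

function looksSuspicious(string $imagePath, array $blockedDomains): bool {
    $hits = 0;
    foreach (reverseImageSearch($imagePath) as $url) {
        $host = parse_url($url, PHP_URL_HOST);
        if (!is_string($host)) {
            continue;
        }
        foreach ($blockedDomains as $domain) {
            if (str_ends_with($host, $domain)) { // PHP 8+
                $hits++;
            }
        }
    }
    // If the image shows up on blocklisted sites, flag it for human review.
    return $hits > 0;
}

$blockedDomains = ['example-adult-site.com']; // your own filtering list
if (looksSuspicious('upload.jpg', $blockedDomains)) {
    // send to manual moderation rather than rejecting outright
}
```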

NebuSoft
+1  A: 

All current automated methods rely on skin-color detection. This is never very accurate and is not even close to advanced.

If anything truly advanced existed, it would already be used to stop child pornography.
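For what it's worth, a naive skin-tone ratio check with PHP's GD extension looks roughly like this; the RGB thresholds and the 50% cut-off are arbitrary assumptions and will misfire on faces, beaches, wood, and so on:

```php
<?php
// Naive skin-color heuristic using GD (illustrative; expect many false positives).
function skinRatio(string $jpegPath): float {
    $img = imagecreatefromjpeg($jpegPath);
    $w = imagesx($img);
    $h = imagesy($img);
    $skin = 0;
    $total = 0;
    // Sample every 4th pixel to keep it cheap.
    for ($x = 0; $x < $w; $x += 4) {
        for ($y = 0; $y < $h; $y += 4) {
            $rgb = imagecolorat($img, $x, $y);
            $r = ($rgb >> 16) & 0xFF;
            $g = ($rgb >> 8) & 0xFF;
            $b = $rgb & 0xFF;
            // Crude "skin tone" rule of thumb (assumed thresholds).
            if ($r > 95 && $g > 40 && $b > 20 && $r > $g && $r > $b && abs($r - $g) > 15) {
                $skin++;
            }
            $total++;
        }
    }
    imagedestroy($img);
    return $total ? $skin / $total : 0.0;
}

// Flag for manual review if more than half the sampled pixels look like skin.
if (skinRatio('upload.jpg') > 0.5) {
    // queue for human moderation
}
```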

SysAdmin
+3  A: 

I agree with Pekka's comments about automation. That's a rather difficult problem that is not quite solved yet.

Have you thought about http://crowdsifter.com/ or Dolores Labs? They can crowdsource a moderation queue for you, or check your existing images on the cheap. If it's a business website, that might do the trick.

Note: I am not affiliated with dolores labs.

mcpeterson
Nice link, interesting approach!
Pekka
A: 

This is a general principle regardless of language or technique: the best way to combat this is to let a human check. In short, there would be a lot of work involved. How can you tell whether an image is nude or child pornography just by examining the pixels? It cannot really be done; a level of sophistication would be required that does not exist yet. For instance, there is a trial in my country at the moment where mobile operators automatically block images being sent, in order to clamp down on pornography. Don't ask me how, but reports say it is successful. I would not trust whatever algorithm they use, as it may well generate false positives.

This is quite similar to the way CAPTCHAs block spammers: a human has to enter the magic word(s) or number(s), because only a human can analyse the image and deduce what it contains, and that is what stops the flow of spammers.

In your case you could prevent the upload of images altogether, but that is a bit too restrictive. Do what this site does: moderate. Hold the images first (perhaps in a queue or a safe holding directory) and decide whether they are suitable or not, as in the sketch below.
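A minimal sketch of the "safe holding directory" idea: keep uploads outside the public web root and only move them into the public directory once a moderator approves them (the paths here are illustrative):

```php
<?php
// Holding-directory sketch: uploads sit outside the web root until approved.
$quarantineDir = '/var/www/quarantine';   // not served by the web server
$publicDir     = '/var/www/html/uploads'; // publicly visible

// On upload: move the file into quarantine, not the public directory.
function holdUpload(string $tmpFile, string $quarantineDir): string {
    $name = uniqid('img_', true) . '.jpg';
    move_uploaded_file($tmpFile, $quarantineDir . '/' . $name);
    return $name;
}

// Moderator decision: publish the file or delete it.
function decide(string $name, bool $approve, string $quarantineDir, string $publicDir): void {
    $held = $quarantineDir . '/' . $name;
    if ($approve) {
        rename($held, $publicDir . '/' . $name);
    } else {
        unlink($held);
    }
}
```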

If an image is not suitable, then depending on its seriousness this could involve contacting the ISP and local law enforcement (this is where the grey area starts: how would you know, and how do you avoid over-reacting?).

It would pay to be wise and prudent and simply alert the authorities in that case. If it is of a pornographic nature, inform the ISP, pass the buck on to them, and let them decide the best course of action, or inform a local authority website that handles reports of this kind. In my country, for example, there is a hotline website where people can anonymously send an email to report that they have encountered a pornographic image.

I am not a lawyer, but IIRC (and this is where the matter can get messy) you can view such material, yet downloading it is illegal. Again the disclaimer: I am not a lawyer, you need to check first. This is what I mean by the grey area.

So, in a nutshell: a computer is not above a human, and there is a smarter entity than the computer itself. Only a human moderator can check and ascertain what an image contains before it goes live.

Hope this helps, Best regards, Tom.

tommieb75