views:

100

answers:

4

Ok, let's suppose we have a form that allows people to post comments. As you know, "bad people" can easily spam me with the help of PHP cURL. How can I protect my website against that, without using a captcha? Thank you.

+4  A: 

If you mean "how do I protect my website against being accessed by a program other than a browser", the answer is pretty much "you can't." A browser is just a program that sends HTTP requests. You can play the game of trying to reject HTTP requests that look like they don't come from a browser, but it's trivially easy for an arbitrary program (one using curl, or Perl/Python/Ruby libraries) to mimic the headers that a "real" browser sends.
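To see why header checks don't help, here is a sketch using Python's standard `urllib` (the endpoint URL, POST body, and header values are all made up for illustration): a few lines are enough to build a request that, at the header level, is indistinguishable from one a browser would send.

```python
# Sketch: any HTTP client can send the same headers a browser does.
# The URL, body, and header values below are illustrative only.
from urllib.request import Request

req = Request(
    "http://example.com/comments.php",  # hypothetical form endpoint
    data=b"comment=hello",              # a forged POST body
    headers={
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/120.0 Safari/537.36",
        "Referer": "http://example.com/comments.php",
        "Accept": "text/html,application/xhtml+xml",
    },
)
# Server-side checks on User-Agent or Referer cannot tell this
# request apart from one sent by a real browser.
```

The same spoofing is a one-liner in curl (`-A` for the user agent, `-e` for the referer), which is exactly what the question's "bad people" would use.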

Sarfraz
The question is not about this, Sarfraz.
hey
@hey The answer is there: "you can't". And that's it.
Maerlyn
@hey: ***As you know, "bad people" can spam me with help of PHP Curl easily.***?
Sarfraz
The answer is yes, I can, for example by using a captcha. But can I still protect my website without one? I guess you didn't get my question.
hey
@hey: Even captcha is not 100% secure in that case, see http://ejohn.org/blog/ocr-and-neural-nets-in-javascript/
Sarfraz
@hey I think Sarfraz got the question perfectly. If this is not what you mean, you need to rephrase it.
Pekka
+1 It's the only actually correct answer.
Wrikken
+1  A: 

What usually works is to generate a random value and store it in the users' session. When I output the form, I add this value as a hidden input.

When the form is posted, compare the value that is posted to the value in the session. Use every value only once (so change it before outputting the form).

Most spambots do not store cookies, which means that when the POST is performed there is no current session and thus no value to compare against, so you know something is going on. If they do happen to store cookies, they will run into the form value changing on every post, and only the first submission will succeed. If they reload the page each time and parse the form as well, this method doesn't work, but I don't think there is a way to protect against that.

Naturally this means that you need to allocate sessions for every visitor on your site, which also means storing cookies. It's either that or a captcha; I don't know of other ways.
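The one-time token described above can be sketched in a few lines. This is Python for brevity (the `session` dict and function names are illustrative); in PHP the same pattern uses `$_SESSION` for storage and a hidden `<input>` in the form.

```python
# Minimal sketch of the one-time form token described above.
# `session` stands in for per-user server-side session storage
# (e.g. $_SESSION in PHP); the names here are illustrative.
import secrets

def render_form(session):
    # Generate a fresh random token on every render and remember it.
    token = secrets.token_hex(16)
    session["form_token"] = token
    return f'<input type="hidden" name="token" value="{token}">'

def handle_post(session, posted_token):
    # One-time use: always discard the stored token, even on failure.
    expected = session.pop("form_token", None)
    if expected is None or not secrets.compare_digest(posted_token, expected):
        return False  # no session, stale token, or forged token -> reject
    return True
```

Because `handle_post` pops the token, a bot that replays the same form body only ever gets its first submission through; every later attempt finds no matching token.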

Blizz
Wrikken
+1 A fine solution to get the ball rolling.
middaparka
@Wrikken Being realistic, if someone wants to emulate a full browser you can't win anyway - I'm guessing this is just about raising the bar.
middaparka
Well, it highly depends on those _"bad people"_, are they just spambots or savvy people with a grudge? I stand by my first comment: yes, you will get some major spambots out of the way, but it's like locking the door but keeping the key outside under the mat.
Wrikken
+1  A: 

Well, you could try something along the lines of having a hidden field on the form that contains a valid session ID. This session ID would be validated on POST, and only if validation succeeded would the post be accepted.

This would work with or without JavaScript. jim

jim
+1  A: 

The easiest way would be to use some form of captcha protection (e.g. reCaptcha) or to have users log in before they can post comments (this doesn't actually prevent spamming, but it makes it easier to take action against spammers by blocking their accounts).

Aside from that you can also take some simple preventive measures, e.g. limit the number of comments a user can post within a certain timeframe, and maybe only show a captcha once they exceed this limit, to verify they are human (like here on SO).
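The rate limit mentioned above can be done with a sliding window per user. A minimal sketch in Python (class name, limits, and the per-user key are all illustrative; a real site would key on the account or IP and store the timestamps server-side):

```python
# Sketch of per-user comment rate limiting: allow at most `limit`
# comments per `window` seconds; once exceeded, the caller would
# show a captcha instead of accepting the comment.
import time
from collections import defaultdict, deque

class CommentLimiter:
    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        # user -> timestamps of their recent comments
        self.history = defaultdict(deque)

    def allow(self, user, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[user]
        while q and now - q[0] > self.window:
            q.popleft()  # drop timestamps that fell out of the window
        if len(q) >= self.limit:
            return False  # over the limit: time to show a captcha
        q.append(now)
        return True
```

This is the "only captcha when they exceed the limit" behaviour: regular users never see a challenge, while a bot posting in bursts hits `allow() == False` almost immediately.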

wimvds
@wimvds: Even captcha isn't 100% secure, see http://ejohn.org/blog/ocr-and-neural-nets-in-javascript/
Sarfraz
I know, but [reCaptcha](http://www.recaptcha.com) is pretty safe from OCR attacks and it's even accessible to blind users :p.
wimvds