views: 1226
answers: 4
I'm working on a fairly large web site built in PHP that will potentially have a lot of users. I'm looking into a way to protect the login screen from automated attempts. I have already included a CAPTCHA check on the registration form, yet want to harden the site more.

I know there have been similar questions on StackOverflow, and I'm capable of implementing this myself from scratch (storing login attempts and their timestamps in the db), yet I dislike that path:

  • Conceptually, I think this kind of logic belongs at the web server/infrastructure level, not the application level. I dislike having this logic and complexity in my application.
  • I worry about performance, particularly at the database level.
  • I'm lazy, in a good way: I don't want to build a common utility like this from scratch.

Any advice is appreciated. I think I'm particularly looking for some kind of Apache module that can do this. My platform is PHP 5 (using CodeIgniter), Apache 2, MySQL 5.

+4  A: 

a good start would be to just sleep(1); after a failed login attempt: easy to implement, almost bug-free.

1 second isn't much for a human (especially because login attempts by humans don't fail too often), but at 1 sec/try, brute force gets ... sloooow! dictionary attacks may be another problem, but they're in the same domain.

if the attacker starts too many connections to circumvent this, you're dealing with a kind of DoS attack. problem solved (but now you've got another problem ^^).

some stuff you should consider:

  • if you lock accounts solely on a per-IP basis, there may be problems with private networks.
  • if you lock accounts solely on a username basis, denial-of-service attacks against known usernames would be possible.
  • locking on an IP/username basis (where the username is the one under attack) could work better.

my suggestion: complete locking is not desirable (DoS), so a better alternative would be: count the login attempts for a certain username from a unique IP. you could do this with a simple table failed_logins: IP/username/failed_attempts
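a minimal MySQL sketch of that table (column names and sizes are an assumption, not from the answer; the UNIQUE key on ip_usr is what lets a single INSERT ... ON DUPLICATE KEY UPDATE do the counting):

```sql
-- hypothetical schema; names chosen to match the pseudocode in this answer
CREATE TABLE failed_logins (
    ip_usr          VARCHAR(255)  NOT NULL,
    failed_attempts INT UNSIGNED  NOT NULL DEFAULT 1,
    UNIQUE KEY uniq_ip_usr (ip_usr)
) ENGINE=InnoDB;
```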

if the login fails, sleep(failed_attempts); seconds. every xx minutes, run a cron script that decreases failed_logins.failed_attempts by one.
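as a sketch, the decay step could be modeled like this; here it's a pure PHP function over an array of counters so it's easy to test (in production the cron script would do it with one SQL UPDATE plus a DELETE; the function and key names are made up for illustration):

```php
<?php
// hypothetical in-memory model of the cron job's decay step; in production
// it would be a single statement like
//   UPDATE failed_logins SET failed_attempts = failed_attempts - 1 WHERE failed_attempts > 0
// followed by
//   DELETE FROM failed_logins WHERE failed_attempts = 0

function decayFailedLogins(array $counters)
{
    $result = array();
    foreach ($counters as $ipUsr => $attempts) {
        if ($attempts > 1) {
            $result[$ipUsr] = $attempts - 1; // decrement, keep the row
        }
        // rows that reach zero are dropped entirely, keeping the table small
    }
    return $result;
}

// example: alice's counter shrinks, bob is forgiven completely
$counters = array('1.2.3.4|alice' => 3, '5.6.7.8|bob' => 1);
$counters = decayFailedLogins($counters);
// $counters == array('1.2.3.4|alice' => 2)
?>
```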

sorry, i can't provide a premade solution, but this should be trivial to implement.

okay, okay. here's the pseudocode:

<?php
$login_success = tryToLogIn($username, $password);

if (!$login_success) {
    // some kind of unique key: IP plus username, with a separator
    // so "1.2.3.41"+"ice" can't collide with "1.2.3.4"+"alice"
    $ipusr = getUserIP() . '|' . $username;

    // needs a UNIQUE key on ip_usr: inserts the row, or bumps the counter if it exists
    DB::update('INSERT INTO failed_logins (ip_usr, failed_attempts)
                VALUES (:ipusr, 1)
                ON DUPLICATE KEY UPDATE failed_attempts = failed_attempts + 1',
               array(':ipusr' => $ipusr));

    $failed_attempts = DB::selectCell(
        'SELECT failed_attempts FROM failed_logins WHERE ip_usr = :ipusr',
        array(':ipusr' => $ipusr));

    sleep($failed_attempts);
    redirect('/login', array('errorMessage' => 'login-fail! ur doin it rong!'));
}
?>

disclaimer: this may not work in certain regions. last thing i heard was that in asia there's a whole country NATed (also, they all know kung-fu).

Schnalle
nice and simple solution, I agree, but it still requires database interaction. What if I simply add the 1-second delay between detecting the failed login and re-rendering the login screen? This would force both bots and humans to wait out the delay. For a human it is not much, and few humans will get their login wrong, whilst for a bot it is quite long. What do you think?
Ferdy
that doesn't really help, because i can do N concurrent requests. the 1-second block happens on a per-request basis, so the N concurrent requests finish after a bit more than 1 second (given the server can handle the load). so in principle, i could launch 10,000 requests and have them finish after a few seconds, because they aren't serial.
Schnalle
also, there are > 80,000 seconds in a day. 80,000 isn't much for brute force, but it's probably enough for a dictionary attack.
Schnalle
+1  A: 

A very rough, untested example, but I think you will get the main idea.

// assumes $loginAttempts and $unlockTime were loaded from the users table
if ($unlockTime)
{
    if (time() > $unlockTime)
    {
        // the lock has expired: reset the counters
        query("UPDATE users SET login_attempts = 0, unlocktime = 0 ... ");
        $loginAttempts = 0;
        $unlockTime = 0;
    }
    else
    {
        die('Your account is temporarily locked. Reason: too many wrong login attempts.');
    }
}
if (!$logged_in)
{
    $loginAttempts++;
    $unlockTime = 0;
    if ($loginAttempts > MAX_LOGIN_ATTEMPTS)
    {
        $unlockTime = time() + LOCK_TIMEOUT;
    }
    query("UPDATE users SET login_attempts = $loginAttempts, unlocktime = $unlockTime ... ");
}

Sorry for the mistakes - I wrote it in a few seconds and didn't test it... You can do the same by IP, by nickname, by session_id, etc.
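To make the lock test itself unit-testable, the time comparison can be pulled out into a small pure function; the constant values below are illustrative, not from the answer:

```php
<?php
// an account is locked while an unlock timestamp is set and still in the
// future; unlocktime = 0 means "not locked" (matches the reset in the example)
function isLocked($unlockTime, $now)
{
    return $unlockTime != 0 && $now <= $unlockTime;
}

// illustrative policy values
define('MAX_LOGIN_ATTEMPTS', 5);
define('LOCK_TIMEOUT', 15 * 60); // lock for 15 minutes

// usage: load unlocktime from the users row, then
//   if (isLocked($unlockTime, time())) {
//       die('Your account is temporarily locked.');
//   }
?>
```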

Jet
A: 

Why don't you wait with "hardening" and "scaling" your app until you actually have that problem? The most likely scenario is that the app will never have "a lot of users". This sounds like premature optimization to me, something to avoid.

  • Once you get bots abusing it, harden the signup. I'd actually remove the CAPTCHA until you start getting > 1,000 signups/day.
  • Once you get performance problems, improve performance by fixing real bottlenecks.
Whilst you may be right conceptually, you do not know the background of my project and cannot conclude that it will likely never get these problems. I consider these protections best practices no matter the size of the website. If I do not protect against login attempts, passwords can be stolen. How is it premature to protect against that? The same goes for CAPTCHAs. I am using my own custom blog software, very much a niche. I only have 300 readers and still bots attack it with comment spam. A CAPTCHA is a must-have in my experience.
Ferdy
A: 

The answer above has one major flaw: it doesn't check the number of attempts until AFTER the attempt to log in. So the user can keep retrying, and it'll simply redirect them to the error page after another unsuccessful login... then they simply go back and try again. The check for login attempts has to happen first.
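A sketch of the corrected ordering, reusing the hypothetical helpers from the first answer (tryToLogIn, getUserIP, the DB wrapper and the failed_logins table are all assumed to exist as described there):

```php
<?php
// pseudocode: throttle BEFORE the credentials are checked
$ipusr = getUserIP() . '|' . $username;

// 1. look up earlier failures first, and pay the penalty up front
$failed_attempts = DB::selectCell(
    'SELECT failed_attempts FROM failed_logins WHERE ip_usr = :ipusr',
    array(':ipusr' => $ipusr));
if ($failed_attempts > 0) {
    sleep($failed_attempts);
}

// 2. only then attempt the login; on failure, record it as before
if (!tryToLogIn($username, $password)) {
    DB::update('INSERT INTO failed_logins (ip_usr, failed_attempts)
                VALUES (:ipusr, 1)
                ON DUPLICATE KEY UPDATE failed_attempts = failed_attempts + 1',
               array(':ipusr' => $ipusr));
    redirect('/login', array('errorMessage' => 'login failed'));
}
?>
```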

somegirl