I have the following scenario, and what I am really looking for is help from real people. Suggestions/solutions? Please.

I have an extranet web site, e.g. www.foo.com (ASP.NET 3.5). I am using jQuery 1.3.2 to call the ValidateLogin page method in the Default.aspx page (www.foo.com/default.aspx).

The code looks like this:

// "arg" is assumed to hold the JSON-encoded user name and password, and
// ResetLoginForm() is defined elsewhere in default.js.
$.ajax({
    type: "POST",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    url: "Default.aspx/ValidateLogin",
    data: '{' + arg + '}',
    success: function(data) {
        // The page method returns non-zero for a valid login.
        if (data.d != 0) {
            window.location = "http://www.google.com";
        } else {
            alert("Invalid UserName/Password.");
            ResetLoginForm();
        }
    },
    error: function(xhr, status, error) {
        var strerror = xhr.status + error;
        alert("Error Communicating with Server: " + strerror);
        ResetLoginForm();
    }
});

The code is stored in an external JS file, e.g. default.js.
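(For context, the server-side page method the call above targets looks roughly like this; the parameter names and the Membership check are placeholders rather than the real validation code.)

// Default.aspx.cs -- rough shape of the page method, not the real code.
using System.Web.Security;
using System.Web.Services;

public partial class _Default : System.Web.UI.Page
{
    [WebMethod]
    public static int ValidateLogin(string userName, string password)
    {
        // Placeholder credential check -- the real validation is not shown here.
        bool valid = Membership.ValidateUser(userName, password);

        // The client tests data.d != 0, so return non-zero on success.
        return valid ? 1 : 0;
    }
}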

Since this web site is public, anyone can download default.js and see the jQuery call above.

My question is: once a person gets this URL, "Default.aspx/ValidateLogin", he can make a request to the server, and the server will proudly respond to it.

What are my options here? How do I validate the request? How do I prevent this kind of unauthorized request?

A: 

I'd say there is no problem; however, you probably want to do some rate limiting so people can't brute-force it.
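A rough sketch of what that could look like on the server (the cache-key scheme, the 5-failure limit and the 15-minute window are arbitrary examples):

// Sketch of server-side rate limiting for the login page method.
using System;
using System.Web;
using System.Web.Caching;

public static class LoginThrottle
{
    private const int MaxFailures = 5;
    private static readonly TimeSpan Window = TimeSpan.FromMinutes(15);

    // True when this user name has failed too many times recently.
    public static bool IsBlocked(string userName)
    {
        string key = "login-failures:" + userName.ToLowerInvariant();
        return ((HttpRuntime.Cache[key] as int?) ?? 0) >= MaxFailures;
    }

    // Call after a failed credential check; the counter expires on its own.
    public static void RecordFailure(string userName)
    {
        string key = "login-failures:" + userName.ToLowerInvariant();
        int failures = (HttpRuntime.Cache[key] as int?) ?? 0;
        HttpRuntime.Cache.Insert(key, failures + 1, null,
                                 DateTime.UtcNow.Add(Window),
                                 Cache.NoSlidingExpiration);
    }
}

ValidateLogin would call LoginThrottle.IsBlocked(userName) before checking the password and LoginThrottle.RecordFailure(userName) after a bad attempt.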

Chris Shaffer
A: 

You can also put a CAPTCHA after, say, 2-3 login failures, which will deter brute force.
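A sketch of the failure counter that decides when to demand the CAPTCHA (the session keys and the threshold of 3 are made up; the CAPTCHA control itself, e.g. the one linked below, is not shown):

// Sketch: count failed logins in the session and refuse to validate after
// three failures until a CAPTCHA has been solved.
using System.Web;
using System.Web.Services;

public partial class _Default : System.Web.UI.Page
{
    [WebMethod(EnableSession = true)]
    public static int ValidateLogin(string userName, string password)
    {
        var session = HttpContext.Current.Session;
        int failures = (int)(session["loginFailures"] ?? 0);
        bool captchaPassed = (bool)(session["captchaPassed"] ?? false);

        // Past the threshold, don't even check credentials until the CAPTCHA is solved.
        if (failures >= 3 && !captchaPassed)
        {
            return 0;
        }

        bool valid = false; // real credential check goes here
        session["loginFailures"] = valid ? 0 : failures + 1;
        return valid ? 1 : 0;
    }
}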

Vikram
Here is a nice ASP.NET CAPTCHA control on CodeProject: http://www.codeproject.com/KB/custom-controls/CaptchaControl.aspx
Vikram
+2  A: 

Web requests are, by their very nature, public. An attacker doesn't even need to look at the source file; he could simply monitor the HTTP requests and replay them (very easy with a tool like Fiddler, for example).

Problem with Throttling

The throttling solutions are really not viable, though they will reduce the rate of attacks. The problem is that an adversary can write a script that runs over the course of days. Then again, he can use proxies to send parallel requests. And if you throttle per user name, the adversary can achieve a DoS by attempting false logins against legitimate user names; when the actual user tries to log in, he won't know why he's locked out.

Solution

The typical approach is to use nonce keys. It requires some extra work, but it mitigates the problem; it's only worth it if you really anticipate an onslaught of attacks. A simplified version: the server issues a nonce key to the client in the first place, and the client passes it back as a URI query parameter. When the request comes in, the server checks that the nonce key exists in the database and allows the request, then immediately deletes the key so the same one cannot be used again. That does mean the server has to keep issuing fresh nonce keys to the user.
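A rough sketch of the server side (in-memory storage here for brevity instead of the database mentioned above; the IssueNonce/ConsumeNonce names are made up):

// Sketch of one-time nonce keys for the login page method.
using System;
using System.Collections.Generic;

public static class NonceStore
{
    private static readonly Dictionary<Guid, DateTime> issued = new Dictionary<Guid, DateTime>();
    private static readonly object sync = new object();

    // Issued to the client when Default.aspx is rendered; the client sends it
    // back with the page-method request (e.g. as a URI query parameter).
    public static Guid IssueNonce()
    {
        Guid nonce = Guid.NewGuid();
        lock (sync) { issued[nonce] = DateTime.UtcNow; }
        return nonce;
    }

    // Returns true exactly once per issued nonce, then forgets it, so a
    // replayed request carrying the same nonce is rejected.
    public static bool ConsumeNonce(Guid nonce)
    {
        lock (sync) { return issued.Remove(nonce); }
    }
}

Default.aspx would embed a fresh value from IssueNonce() on every render, the $.ajax call would carry it back, and ValidateLogin would call ConsumeNonce first and return 0 if it fails.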

The simpler solution is to only allow authenticated users to make web service requests, but since you are designing a login system, that obviously doesn't apply here.

I wouldn't worry about it unless you've got some nasty adversaries.

aleemb
Thanks a lot man..!
bugBurger
A: 

I'm assuming your page method has session support enabled, so I guess you could set a session variable when the page loads and then check that it exists when the page method is called. It's not the most secure thing in the world, but it'll help.
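A rough sketch of that idea, assuming session-enabled page methods and a made-up session key name "loginPageVisited":

// Sketch: mark the session when Default.aspx is served and reject page-method
// calls that arrive without the mark.
using System;
using System.Web;
using System.Web.Services;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Session["loginPageVisited"] = true;
    }

    [WebMethod(EnableSession = true)]
    public static int ValidateLogin(string userName, string password)
    {
        // A scripted POST that never loaded the page (and so carries no session
        // cookie with the flag) gets rejected before any credential check.
        object visited = HttpContext.Current.Session["loginPageVisited"];
        if (visited == null || !(bool)visited)
        {
            return 0;
        }

        bool valid = false; // real credential check goes here
        return valid ? 1 : 0;
    }
}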

Personally I wouldn't bother trying to make it any more secure - as others have said, web services are inherently public.

Ravi
A: 

How did your implementation go? I'm attempting a similar project for an ASP.NET 2.0 site, and I'm not sure where to start.

GreenEggs
I am using .NET 3.5, and my guess is that only 3.5 supports page methods.
bugBurger