Tuesday, October 23, 2007

The Ajax Control Toolkit NoBot control and session state

You need to prevent hackers from using robots to run dictionaries of user names and passwords against your login pages. The most common defense is the Captcha ("Completely Automated Public Turing test to tell Computers and Humans Apart"), which displays an image containing a code that the user must type into a text box. Captchas are often difficult to read and drive away genuine users with disabilities; more elaborate Captchas include sound, but they are not mainstream.

The NoBot control in the Microsoft Ajax Control Toolkit can be used to protect any type of request, and in particular the sign-in function of a login page:

  1. It ensures that too many requests (to sign in) are not issued from the same IP address;
  2. It provides an automated challenge/response mechanism to ensure the request (to sign in) was issued by the (login) page;
  3. It enforces a minimum delay between the time the (login) page is displayed and the time the request (to sign in) is issued.

The benefit of the NoBot control is that, unlike Captchas, it is completely transparent to the user.

You implement the NoBot control on your ASP.NET page as follows:

<ajaxToolkit:NoBot
    ID="PageNoBot"
    runat="server"
    OnGenerateChallengeAndResponse="PageNoBot_GenerateChallengeAndResponse"
    ResponseMinimumDelaySeconds="3"
    CutoffMaximumInstances="5"
    CutoffWindowSeconds="60"
/>

CutoffMaximumInstances and CutoffWindowSeconds configure the IP address throttling (here, at most 5 sign-in postbacks from the same IP address within 60 seconds), while ResponseMinimumDelaySeconds enforces the minimum delay (here, 3 seconds). You implement the OnGenerateChallengeAndResponse event handler in the code-behind as follows:

protected void PageNoBot_GenerateChallengeAndResponse(object sender, AjaxControlToolkit.NoBotEventArgs e)
{
    // Build a simple arithmetic challenge from two random numbers
    Random r = new Random();
    int iFirst = r.Next(100);
    int iSecond = r.Next(100);

    // The challenge script must evaluate to the required response
    e.ChallengeScript = String.Format("eval('{0}+{1}')", iFirst, iSecond);
    e.RequiredResponse = Convert.ToString(iFirst + iSecond);
}

NoBot evaluates the challenge script in the visitor's browser and compares the posted-back result against RequiredResponse, so a simple robot that does not execute JavaScript fails the check. The documentation then tells you to implement the click event handler of the sign-in button as follows:

if (!PageNoBot.IsValid())
{
    // Display a message that a robot has been detected and the request cannot be processed
}
else
{
    // Process the postback event
}
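
For context, here is a minimal sketch of what the complete click event handler could look like, assuming hypothetical control names (SignInButton, UserNameTextBox, PasswordTextBox, MessageLabel) and the standard ASP.NET Membership and Forms authentication APIs; adapt the names to your own page:

// Requires: using System.Web.Security;
protected void SignInButton_Click(object sender, EventArgs e)
{
    if (!PageNoBot.IsValid())
    {
        // A robot has probably been detected: do not even look at the credentials
        MessageLabel.Text = "Your request appears to originate from a robot and has been invalidated. Please wait a minute and resubmit it.";
        return;
    }

    // Process the postback event: validate the credentials as usual
    if (Membership.ValidateUser(UserNameTextBox.Text, PasswordTextBox.Text))
    {
        FormsAuthentication.RedirectFromLoginPage(UserNameTextBox.Text, false);
    }
    else
    {
        MessageLabel.Text = "Invalid user name or password.";
    }
}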

On most sites, users are redirected to the login page when their session times out. Because the NoBot control stores the calculation in session state, and session state is reset in this case, the challenge/response check fails the next time the user signs in, unless he/she refreshes the page first.

The following check solves the problem from the user's perspective, but it also opens the door to hackers: a robot that simply discards the session cookie always appears to start a new session and bypasses the validation entirely:

if ((!Page.Session.IsNewSession) && (!PageNoBot.IsValid()))

Apart from redesigning the NoBot control not to use session state but the context cache instead, I have not found a really good solution to this issue.
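
For what it is worth, here is a rough sketch of that idea, not the toolkit's actual code: store the expected response in the context cache under a random key carried by a hidden field, so that it survives a session reset (ChallengeKeyHidden is a hypothetical asp:HiddenField declared on the page):

// Illustrative sketch only, not the NoBot control's implementation
protected void StoreChallenge(string requiredResponse)
{
    string key = Guid.NewGuid().ToString();
    ChallengeKeyHidden.Value = key;
    // Keep the expected response in the context cache, which is not tied
    // to the user's session, for a few minutes
    Context.Cache.Insert(key, requiredResponse, null,
        DateTime.UtcNow.AddMinutes(5), System.Web.Caching.Cache.NoSlidingExpiration);
}

protected bool ValidateChallenge(string postedResponse)
{
    string key = ChallengeKeyHidden.Value;
    if (String.IsNullOrEmpty(key)) return false;
    string expected = Context.Cache[key] as string;
    Context.Cache.Remove(key); // challenges are one-time use
    return expected != null && expected == postedResponse;
}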

4 comments:

Anonymous said...

So sorry. I need some help understanding NoBot and would very much appreciate it if you could shed some light, please.
If I have a dictionary attack (100's of requests coming into my site a second, all submitting values that appear to be entered in the username/password textboxes - I can't use the Login control)... does NoBot help prevent that kind of attack? How, please? What to do on NoBot.IsValid() returning false? Redirect I'm guessing won't work. I tried disabling the login button control, but is that enough? TIA
I get the fact that I can check to see if someone appears to be a bot, but what to do afterwards... maybe if the login credentials appear correct, log them out all the same?!
TIA

Jacques L. Chereau said...

1) The only way to really prevent robots is to use Captchas but disabled people complain a lot about them.
2) The NoBot control cannot completely prevent a robot attack, but it makes the attack more complex to develop and dramatically slower to execute, discouraging most attackers, unless your site holds information so valuable that it is worth taking the time.
3) The NoBot control does 3 things:
- First, it invalidates too many login requests coming from the same IP address within a configurable time window, e.g. no more than 5 login requests from the same IP address within 2 minutes.
- Second, you can generate a variable pass-phrase which is checked on postback to ensure that the page was loaded before being posted back; it is as if you loaded a hidden field with a random value and checked that you get this value back on postback (see the sketch below).
- Third, you can ensure that a minimum time is spent between the time the page loads and the time it is posted back, which is what would occur if a human had to read the page and fill in the information before clicking the submit button.
4) When NoBot.IsValid() returns false, display an error message like "Your request appears to originate from a robot and has been invalidated. If this is not the case, please wait 1 minute and resubmit your request", so that a genuine user experiencing an invalidation gets some feedback. This can happen especially when the browser prefills the login information and the login button is triggered by the carriage-return key.
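
Roughly, that pass-phrase idea boils down to something like this (an illustration only, not the NoBot control's actual code; TokenHidden is a hypothetical asp:HiddenField):

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // Load a hidden field with a random value when the page is first served
        string token = Guid.NewGuid().ToString();
        Session["ExpectedToken"] = token;
        TokenHidden.Value = token;
    }
    else if ((string)Session["ExpectedToken"] != TokenHidden.Value)
    {
        // The value did not come back intact: the postback did not originate from the page
        // ... invalidate the request here
    }
}
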
I hope this helps.

Omar said...

Let's say there are X number of people on a network (LAN). For the sake of this example, let's say all of them are trying to access the same site at the exact same time.

Now, since the outside IP address for all of them would be the same, how would NoBot know whether a request originated from a bot or not?

Mike Adewole said...

The first and third things listed have a problem because there is no sensible way to determine the proper parameter values for clients on a public website.

The second thing listed (i.e. passphrase generation) will not defeat many bots because they tend to load the page before doing a postback, so the passphrase will be correctly included in the postback data.

One option is to use bot databases, many of which are freely available on the internet. Commercial bot databases that integrate with asp.net are yet another option (e.g. www.botslist.ca).
