views: 1051
answers: 4

I just ran across a question with an answer suggesting the AntiXss library to avoid cross-site scripting. It sounded interesting, but reading the MSDN blog, it appears to just provide an HtmlEncode() method, and I already use HttpUtility.HtmlEncode().

Why would I want to use AntiXss.HtmlEncode over HttpUtility.HtmlEncode?
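
On the surface the two calls look interchangeable, something like this (a minimal sketch; I'm assuming the Microsoft.Security.Application namespace from the AntiXss download, and the exact entity output may differ by version):

    using System;
    using System.Web;
    using Microsoft.Security.Application; // ships with the AntiXss library

    class EncodeComparison
    {
        static void Main()
        {
            string input = "<script>alert('xss')</script>";

            // Built into System.Web; encodes <, >, & and double quotes,
            // but (in the .NET Framework) leaves the single quote alone.
            Console.WriteLine(HttpUtility.HtmlEncode(input));
            // -> &lt;script&gt;alert('xss')&lt;/script&gt;

            // White-list based: anything not on the known-safe list is
            // encoded, including the single quote (e.g. as &#39;).
            Console.WriteLine(AntiXss.HtmlEncode(input));
        }
    }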

Indeed, I am not the first to ask this question, and Google turns up some answers, mainly:

  • A white-list instead of black-list approach
  • A 0.1ms performance improvement

Well, that's nice, but what does it mean for me? I don't care much about a 0.1 ms performance difference, and I don't really feel like downloading and adding another library dependency for functionality that I already have.

Are there examples of cases where the AntiXss implementation would prevent an attack that the HttpUtility implementation would not?

If I continue to use the HttpUtility implementation, am I at risk? What about this 'bug'?

+9  A: 

I don't have an answer specifically to your question, but I would like to point out that the white-list vs. black-list approach is not just "nice". It's important. Very important. When it comes to security, every little thing matters. Remember that with cross-site scripting and cross-site request forgery, even if your site is not showing sensitive data, an attacker could infect your site by injecting JavaScript and use it to get sensitive data from another site. So doing it right is critical.

The OWASP guidelines specify using a white-list approach. The PCI compliance guidelines also specify this in their coding standards (since they refer to the OWASP guidelines).

Also, the newer version of the AntiXss library has a new function, .GetSafeHtmlFragment(), which is handy for those cases where you want to store HTML in the database and have it displayed to the user as HTML.
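
A sketch of how that looks (in the version I've used the method hangs off the AntiXss class; in later releases it reportedly moved to a Sanitizer class, so check your version):

    using Microsoft.Security.Application;

    class SafeFragmentExample
    {
        static void Main()
        {
            // Hypothetical comment with both harmless and hostile markup.
            string userHtml = "<b>Hello</b><script>alert('xss')</script>";

            // Strips the dangerous elements but keeps the safe markup, so
            // the result can be stored and later rendered as real HTML.
            string safe = AntiXss.GetSafeHtmlFragment(userHtml);

            System.Console.WriteLine(safe); // expected: <b>Hello</b>
        }
    }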

Also, as for the "bug": if you're coding properly and following all the security guidelines, you're using parameterized stored procedures, so the single quotes will be handled correctly. If you're not coding properly, no off-the-shelf library is going to protect you fully. The AntiXss library is meant to be a tool to be used, not a substitute for knowledge. Relying on the library to do it right for you would be like expecting a really good paintbrush to turn out good paintings without a good artist.
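
For example, a parameterized command along these lines (a minimal ADO.NET sketch; the table and column names are made up for illustration):

    using System.Data.SqlClient;

    class CommentRepository
    {
        static void SaveComment(string connectionString, string comment)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "INSERT INTO Comments (Body) VALUES (@body)", conn))
            {
                // The value travels as data, not as SQL text, so a single
                // quote can't terminate the literal and inject SQL.
                cmd.Parameters.AddWithValue("@body", comment);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }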

Edit - Added

As asked in the question, here is an example of where the AntiXss library will protect you and HttpUtility will not:

http://caught-in-a-web.blogspot.com/2007/01/httputilityhtmlencode-and-server.html

That's according to the author, though. I haven't tested it personally.
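
The gist, as I understand it, is that HttpUtility.HtmlEncode leaves the single quote unencoded, so a value dropped into a single-quoted attribute can still break out, while the AntiXss encoder encodes it. A sketch of the idea (my own illustration of the assumed behavior, not a tested reproduction of the post):

    using System;
    using System.Web;

    class SingleQuoteDemo
    {
        static void Main()
        {
            // Attacker-supplied value, destined for a single-quoted attribute.
            string payload = "' onmouseover='alert(1)";

            // HttpUtility.HtmlEncode (.NET Framework) does not encode the
            // single quote, so the markup gains a brand-new attribute:
            string html = "<img src='" + HttpUtility.HtmlEncode(payload) + "' />";
            Console.WriteLine(html);
            // -> <img src='' onmouseover='alert(1)' />  (script runs on hover)

            // AntiXss.HtmlEncode would encode the quote (e.g. as &#39;),
            // keeping the payload inert inside the attribute value.
        }
    }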


It sounds like you're up on your security guidelines, so this may not be something I need to tell you, but just in case a less experienced developer is reading this, the reason I say the white-list approach is critical is this.

Right now, today, HttpUtility.HtmlEncode may successfully block every attack out there, simply by removing/encoding < and >, plus a few other "known potentially unsafe" characters, but someone is always trying to think of new ways of breaking in. Allowing only known-safe (white-list) content is a lot easier than trying to think of every possible unsafe bit of input an attacker could throw at you (the black-list approach).
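
To make that concrete, a toy white-list encoder might look like the following (purely illustrative; this is not how the AntiXss library is actually implemented):

    using System.Text;

    static class ToyEncoder
    {
        // Pass through only characters known to be safe; encode everything
        // else as a numeric entity. A brand-new attack character is blocked
        // by default, because it was never on the list.
        public static string WhiteListEncode(string input)
        {
            var sb = new StringBuilder();
            foreach (char c in input)
            {
                bool safe = (c >= 'a' && c <= 'z') ||
                            (c >= 'A' && c <= 'Z') ||
                            (c >= '0' && c <= '9') ||
                            c == ' ' || c == '.' || c == ',';
                if (safe)
                    sb.Append(c);
                else
                    sb.Append("&#").Append((int)c).Append(';');
            }
            return sb.ToString();
        }
    }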

David Stratton
Sorry. You hit a sore point with me. You're probably very capable and very good at secure coding. I tend to be a bit, um, hardcore in my approach to security after inheriting a very insecure web site in dire need of improvements.
David Stratton
Great to have your input, David. I am particular about security, but had never thought about the implementation of HtmlEncode. I really want to understand the impact that the different implementations can have.
g .
Fair enough. I found the example you asked for and included it in my answer above.
David Stratton
In terms of why you'd use one over the other, consider that the AntiXSS library gets released more often than the ASP.NET framework - since 'someone is always trying to think of new ways of breaking in', when someone does come up with one, the AntiXSS library is more likely to get an updated release to defend against it.
PhilPursglove
@PhilPursglove, you should post that as an answer. @g may have already accepted an answer, but that's a good enough point that I'd give you an upvote on it.
David Stratton
@David Stratton - Done. For me it's the main advantage of AntiXSS.
PhilPursglove
+2  A: 

We use the white-list approach for Microsoft's Windows Live sites. I'm sure that there are any number of security attacks that we haven't thought of yet, so I'm more comfortable with the paranoid approach. I suspect there have been cases where the black-list exposed vulnerabilities that the white-list did not, but I couldn't tell you the details.

Bruce
+3  A: 

In terms of why you'd use one over the other, consider that the AntiXSS library gets released more often than the ASP.NET framework - since, as David Stratton says, 'someone is always trying to think of new ways of breaking in', when someone does come up with one, the AntiXSS library is much more likely to get an updated release to defend against it.

PhilPursglove
+1. Good answer!
David Stratton
+3  A: 

Most XSS vulnerabilities (any type of vulnerability, actually) are based purely on the fact that existing security did not "expect" certain things to happen. Whitelist-only approaches are more apt to handle these scenarios by default.

Chris