I am saving user-submitted HTML (in a database), and I must prevent JavaScript injection attacks. The most pernicious one I have seen is script hidden in a style="expression(...)" attribute.

In addition to this, a fair amount of valid user content will include special characters and XML constructs, so I'd like to avoid a white-list approach (listing every allowable HTML element and attribute) if possible.

Examples of Javascript attack strings are:

1)

"Hello, I have a <script>alert("bad!")</script> problem with the <dog> element..."

2)

"Hi, this <b style="width:expression(alert('bad!'))">dog</b> is black."

Is there a way to prevent such Javascript, and leave the rest intact?

The only solution I have so far is to use a regular expression to remove certain patterns. It solves case 1, but not case 2.

Edit: Sorry, forgot to mention environment - it's essentially the MS stack:

  • SQL Server 2005
  • C# 3.5 (ASP.NET)
  • Javascript (obviously) and jQuery.

I would like the chokepoint to be the ASP.NET layer - anyone can craft a bad HTTP request.

Edit 2:

Thanks for the links everyone. Assuming that I can define my list (the content will include many mathematical and programming constructs, so a whitelist is going to be very annoying), I still have a question here:

What kind of parser will allow me to remove just the "bad" parts? The bad part could be an entire element, but what about scripts that reside in attributes? I can't remove <a href> elements willy-nilly.

+7  A: 

You think that's it? Check this out.

Whatever approach you take, you definitely need to use a whitelist. It's the only way to even come close to being safe about what you're allowing on your site.

EDIT:

I'm not familiar with .NET, unfortunately, but you can check out stackoverflow's own battle with XSS (http://blog.stackoverflow.com/2008/06/safe-html-and-xss/) and the code that was written to parse HTML posted on this site (http://refactormycode.com/codes/333-sanitize-html) - obviously you might need to change this because your whitelist is bigger, but that should get you started.
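To make the whitelist idea concrete (in Python rather than the asker's .NET stack, since that's what I can sketch here; the tag list is just an example, not a recommendation), here is a minimal parser-based sanitizer built on the standard library's `html.parser`. It keeps only whitelisted tags, drops all attributes, and escapes everything else, which handles both the `<script>` case and the `style="expression(...)"` case:

```python
from html import escape
from html.parser import HTMLParser

# Hypothetical whitelist -- adjust to your own needs.
ALLOWED_TAGS = {"b", "i", "u", "p", "code", "pre", "blockquote"}

class WhitelistSanitizer(HTMLParser):
    """Rebuild the input, keeping whitelisted tags (stripped of all
    attributes) and escaping every other tag and all text content."""

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in ALLOWED_TAGS:
            self.out.append(f"<{tag}>")  # drop all attributes
        else:
            # Re-emit the raw tag text, but escaped so it renders inert.
            self.out.append(escape(self.get_starttag_text() or ""))

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.out.append(f"</{tag}>")
        else:
            self.out.append(escape(f"</{tag}>"))

    def handle_data(self, data):
        self.out.append(escape(data))

def sanitize(html_in):
    parser = WhitelistSanitizer()
    parser.feed(html_in)
    parser.close()
    return "".join(parser.out)
```

Because attributes are dropped wholesale, `<b style="width:expression(alert('bad!'))">dog</b>` comes out as plain `<b>dog</b>`, with no need to recognize `expression(...)` at all.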

Paolo Bergantino
Thanks, I'm actually using that site as a test bed. I've successfully removed anything that looks anything like < s c r i p t >, so I need to get the ones that don't... that is, expression:, javascript:, vbscript: etc. Could you suggest a parser that can do that?
Jeff Meatball Yang
If your approach is to remove dangerous things, your code will be vulnerable to injection. The only safe approach is to have a whitelist of specifically allowed elements and attributes.
Miles
Thanks for the feedback. I was afraid that a whitelist was the answer. :)
Jeff Meatball Yang
+3  A: 

A whitelist of elements and attributes is the only acceptable choice in my opinion. Anything not on your whitelist should be stripped out or encoded (change < > & " to entities). Also be sure to check the values within the attributes you allow.

Anything less and you are opening yourself up to problems - known exploits or those that will be discovered in the future.

See also: XSS (Cross Site Scripting) Cheat Sheet
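To illustrate "check the values within the attributes you allow": even a whitelisted attribute like `style` or `href` can carry a payload, so its value needs its own check on top of the name whitelist. A minimal sketch in Python (the attribute names and deny-patterns are illustrative examples, not an exhaustive or safe list on their own):

```python
import re

# Example deny-patterns for values of otherwise-allowed attributes.
# Illustrative only -- real attackers use obfuscation (comments,
# entities, whitespace tricks) that a simple regex will miss.
DANGEROUS_VALUE = re.compile(
    r"(expression\s*\(|javascript\s*:|vbscript\s*:)",
    re.IGNORECASE,
)

# Example attribute-name whitelist.
ALLOWED_ATTRS = {"href", "src", "style", "title", "alt"}

def attribute_is_safe(name, value):
    """Allow only whitelisted attribute names, and reject values that
    try to smuggle script into style/href/src."""
    if name.lower() not in ALLOWED_ATTRS:
        return False
    return not DANGEROUS_VALUE.search(value or "")
```

Note this value check is a defense-in-depth layer inside a whitelist, not a substitute for one; as the other answers point out, pattern-stripping alone is exactly the approach that keeps getting bypassed.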

BarelyFitz
+1  A: 

What server-side code are you using? Depending on which, there are a number of ways you can filter out malicious script, but it's dangerous territory. Even seasoned professionals get caught out: http://www.codinghorror.com/blog/archives/001167.html

Chris Simpson
+1  A: 

Basically, as Paolo said, you should try to focus on what the users are allowed to do, rather than trying to filter out the stuff they're not supposed to do.

Keep a list of allowed HTML tags (things like b, i, u...) and filter out everything else. You probably also want to strip all attributes from the allowed tags (your second example shows why).

Another solution would be to introduce so-called BB code, which is what a lot of forums use. It has similar syntax to HTML, but starts from the idea of a whitelist of allowed codes, which are then transformed into HTML. For example, [b]example[/b] would be transformed into <b>example</b>. When using BB code, make sure you still escape or filter out raw HTML tags beforehand.
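The BB-code approach above can be sketched in a few lines of Python (the tag set is just the usual [b]/[i]/[u] example): escape all raw HTML first, then translate only the allowed codes.

```python
import re
from html import escape

# Illustrative BB-code whitelist: BB tag -> HTML element.
BB_TAGS = {"b": "b", "i": "i", "u": "u"}

def bbcode_to_html(text):
    """Escape all raw HTML, then convert whitelisted [tag]...[/tag]
    pairs into their HTML equivalents."""
    out = escape(text)  # neutralize any raw HTML first
    for bb, html_tag in BB_TAGS.items():
        out = re.sub(
            rf"\[{bb}\](.*?)\[/{bb}\]",
            rf"<{html_tag}>\1</{html_tag}>",
            out,
            flags=re.DOTALL | re.IGNORECASE,
        )
    return out
```

Since escaping happens before any conversion, raw `<script>` tags in the input can never survive into the output.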

Aistina
The content, I fear, will include many, many mathematical and programming constructs (XML, C#, etc.), so I would have loved to avoid a whitelist.
Jeff Meatball Yang
+2  A: 

The only really safe way to go is to use a white-list. Encode everything, then convert the allowed codes back.

I have seen rather advanced attempts to disallow only dangerous code, and they still don't work well. It's quite a feat to safely catch everything anyone can think of, and such filters are prone to making annoying replacements of things that aren't dangerous at all.
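A minimal Python sketch of "encode everything, then convert the allowed codes back" (the allowed-tag set here is an example, not a recommendation): everything is escaped first, and only bare, attribute-free whitelisted tags are restored afterwards.

```python
from html import escape

# Example whitelist of tags to restore after full escaping.
ALLOWED = ("b", "i", "u", "em", "strong")

def encode_then_allow(text):
    """Escape all markup, then un-escape only bare whitelisted tags.
    Any tag carrying attributes (e.g. style="expression(...)") does
    not match the bare form, so it stays harmlessly encoded."""
    out = escape(text)
    for tag in ALLOWED:
        out = out.replace(f"&lt;{tag}&gt;", f"<{tag}>")
        out = out.replace(f"&lt;/{tag}&gt;", f"</{tag}>")
    return out
```

The appeal of this ordering is that the default outcome is safe: anything the restore step doesn't explicitly recognize remains encoded text rather than live markup.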

Guffa
I found out the hard way. We are now using escaping and whitelisting.
Jeff Meatball Yang