views: 166
answers: 2
Hello all.

Note: I take care of SQL injection and output escaping elsewhere - this question is about input filtering only, thanks.

I'm in the middle of refactoring my user input filtering functions. Before passing the GET/POST parameter to a type-specific filter with filter_var(), I do the following:

Now the question: does it still make sense to pass the parameter to a filter like htmLawed or HTML Purifier, or can I consider the input safe? It seems to me that these two differ mostly in the granularity of allowed HTML elements and attributes (which I'm not interested in, as I remove everything), but the htmLawed docs have a section about 'dangerous characters' that suggests there might be a reason to use it. In that case, what would be a sane configuration for it?

A: 

I think what you're doing is safe; at least from my point of view, no HTML code should get through your filter.

Gabriel
-1 Yeah, but JavaScript can still make it through these filters.
Rook
+1  A: 

There are many different approaches to XSS that are secure. The only way to know if your approach holds water is to test it through exploitation. I recommend using the free Acunetix XSS Scanner, or wapiti.

To be honest, I'll never use strip_tags() because you don't always need HTML tags to execute JavaScript! I like htmlspecialchars($var, ENT_QUOTES);.

For instance, this is vulnerable to XSS:

print('<a href="http://www.xssed.com/' . strip_tags($_REQUEST['xss']) . '">link</a>');

You don't need < or > to execute JavaScript in this case, because you can use an onmouseover event handler. Here is an example attack:

$_REQUEST['xss'] = '" onMouseOver="alert(/xss/)"';

The ENT_QUOTES flag makes htmlspecialchars() encode the double quotes as well, which patches this XSS vulnerability.
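To make the fix concrete, here is a minimal sketch of the same link built with htmlspecialchars() instead of strip_tags(); the parameter name 'xss' and the payload are just the ones from the example above:

```php
<?php
// Attacker-controlled value, as in the example attack above.
$xss = '" onMouseOver="alert(/xss/)"';

// ENT_QUOTES encodes both double and single quotes,
// so the payload cannot break out of the href attribute.
$safe = htmlspecialchars($xss, ENT_QUOTES);

print('<a href="http://www.xssed.com/' . $safe . '">link</a>');
// The injected quotes come out as &quot; entities and stay
// inside the href value, so no onMouseOver attribute is created.
```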

Rook
Very good point! Thank you! I found a list of events at http://php.net/manual/en/function.strip-tags.php#82180 and passed the array to a str_ireplace() - this should take care of it.
djn
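The approach described in the last comment can be sketched roughly as follows; the event list here is abbreviated and hypothetical (the linked php.net comment has a fuller one), and note that a blacklist like this is fragile compared to htmlspecialchars():

```php
<?php
// Sketch: strip on* event-handler names with str_ireplace(),
// as described in the comment above. Abbreviated, hypothetical list.
$events = array('onmouseover', 'onclick', 'onerror', 'onload');

$input = '" onMouseOver="alert(/xss/)"';
$clean = str_ireplace($events, '', $input);

// Caveat: str_ireplace() makes a single pass per search term, so a
// crafted input such as 'ononclickclick' re-assembles into 'onclick'
// after the inner match is removed. Output escaping with
// htmlspecialchars($var, ENT_QUOTES) remains the safer default.
```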