Edit: Thank you @mario for pointing out that it all depends on the context. There really is no single way to prevent it all in all situations. You have to adjust accordingly.
Edit: I stand corrected and am very appreciative of both @bobince's and @Rook's input on this issue. It's pretty much clear to me now that strip_tags
will not prevent XSS attacks in any way.
I've scanned all my code prior to answering to see if I was in any way exposed, and all is good thanks to the htmlentities($a, ENT_QUOTES)
I've been using mainly to comply with W3C validation.
That said, I've updated the function below to somewhat mimic the one I use. I still find strip_tags
nice to have before htmlentities, so that when a user does try to enter tags they won't pollute the final output. Say the user entered: <b>ok!</b>
It's much nicer to show it as ok!
than to print out the full htmlentities-converted text.
Thank you both very much for taking the time to reply and explain.
If it's coming from an internet user:
// the text should not carry tags in the first place
function clean_up($text) {
    return htmlentities(strip_tags($text), ENT_QUOTES, 'UTF-8');
}
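To illustrate the point about tags not polluting the output, here's a quick sketch comparing clean_up against htmlentities alone (the $input value is just an example):

```php
<?php
function clean_up($text) {
    return htmlentities(strip_tags($text), ENT_QUOTES, 'UTF-8');
}

// Example of what a user might submit:
$input = '<b>ok!</b>';

echo clean_up($input), "\n";                          // ok!
echo htmlentities($input, ENT_QUOTES, 'UTF-8'), "\n"; // &lt;b&gt;ok!&lt;/b&gt;
```

The first line prints a clean "ok!", while the second prints the entity-encoded tags in full, which is safe either way but much uglier to read.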
If it's coming from the backoffice... don't.
There are perfectly valid reasons why someone at the company may need JavaScript on this or that page. It's much better to be able to log and blame than to shut out your users.