Why does the RegExp /^\w+$/ match undefined?
Example code:
alert(/^\w+$/.test(undefined));
This will display true in Firefox 3 (only browser I tested it on).
When undefined is cast to a string (which the regex test does internally), it produces the string "undefined", which is then matched.
See ECMAScript Specification section 15.10.6.2 for RegExp.prototype.exec(string), which is called internally by the .test method: test simply returns true when exec returns a non-null result.
Here it is, word for word, from the specification: "Performs a regular expression match of string against the regular expression and returns an Array object containing the results of the match, or null if the string did not match. The string ToString(string) is searched for an occurrence of the regular expression pattern as follows:"
As you can see, it converts any input to a string via ToString, so undefined becomes 'undefined', which matches and makes test return true.
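The same ToString step applies to every non-string argument, not just undefined, so other values can produce surprising results too. A few illustrative cases:

```javascript
// ToString is applied to any non-string argument before matching
console.log(/^\w+$/.test(123));  // true  — coerced to "123"
console.log(/^\w+$/.test(null)); // true  — coerced to "null"
console.log(/^\w+$/.test({}));   // false — coerced to "[object Object]",
                                 //         which contains spaces and brackets
```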
I tested this outside the browser as well, using JScript on the command line, and got the same result.