I'm writing a regex to be used with JavaScript. While testing it, I came across some strange behavior and boiled it down to the following:
/^[a-z]/.test("abc"); // <-- returns true as expected
/^[a-z]/.test(null); // <-- returns true, but why?
I was assuming that the last case would return false, since the value is null and therefore does not start with a character in the range. So, can anyone explain to me why this is not the case?
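For what it's worth, other non-string values (these are just extra values I threw at the same regex) behave in a similarly surprising way:

/^[a-z]/.test(undefined); // <-- also true
/^[a-z]/.test(123);       // <-- false
/^[a-z]/.test(true);      // <-- true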
If I do the same test in C#:
var regex = new Regex("^[a-z]");
var res = regex.IsMatch(null); // <-- ArgumentNullException
... I get an ArgumentNullException, which makes sense. So I guess that when testing a regex in JavaScript, you have to do a null check manually?
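In other words, I assume I'd have to wrap the test in something like this (safeTest is just a hypothetical helper name I made up):

function safeTest(regex, value) {
  // Guard against null (and undefined) before handing the value to test()
  if (value == null) {
    return false;
  }
  return regex.test(value);
}

safeTest(/^[a-z]/, "abc"); // <-- true
safeTest(/^[a-z]/, null);  // <-- false, which is what I expected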
I have tried searching for an explanation, but without any luck.