I am creating a regular expression in JavaScript that should require both numbers (at least 1) and letters (at least 1), with a total length between 6 and 10 characters. I came across some unexpected behavior.
My regex is /^[a-z+\d+]{6,10}$/g.
This doesn't work properly because, being a character class, it checks for letters or numbers but not BOTH. By my requirement, I would expect "123456" to fail: while it contains 6 characters and has at least 1 digit, it does not include a single letter.
However, in the code snippet below, when I store the regex in the rgx variable and call .test() on it, it somehow correctly returns false, as shown in the second console.log statement. But on the very next line, when I use the same regex literal directly with .test(), it returns true.
let rgx = /^[a-z+\d+]{6,10}$/g;
// works fine
console.log(rgx.test("abcd12"));
// returns false
console.log(rgx.test("123456"));
// same regex returns true
console.log(/^[a-z+\d+]{6,10}$/g.test("123456"));
What's going on here?
Comments:

- "+" in your character class means "allow the + character", not "one or more of the preceding".
- Try calling the .test() function twice with the same regex object and the same testing string: it won't return the same boolean. And here is why: siderite.blogspot.com/2011/11/…
- Try /^[\d]$/g and rgx.test("1")?
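Following up on that last hint: with the g flag, a regex object keeps state in its lastIndex property, and .test() moves lastIndex to the end of a successful match. A minimal sketch (same regex and inputs as above, just logging lastIndex between the calls) makes the behavior visible:

let rgx = /^[a-z+\d+]{6,10}$/g;

console.log(rgx.lastIndex);      // 0
console.log(rgx.test("abcd12")); // true
console.log(rgx.lastIndex);      // 6 - the "g" flag makes .test() remember where the last match ended
console.log(rgx.test("123456")); // false - this call starts searching at index 6, so nothing matches
console.log(rgx.lastIndex);      // 0 - lastIndex is reset after a failed attempt
console.log(rgx.test("123456")); // true - starting from index 0 again, the string matches

The regex literal on the last line of the original snippet is a brand-new object with lastIndex 0, which is why it returns true even though the stored rgx has just returned false for the same string. For a plain validity check like this, the g flag isn't needed; dropping it leaves lastIndex at 0 and makes .test() behave the same on every call.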
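Separately, as the first comment notes, the + signs inside the class are literal characters, and a character class on its own cannot require "at least one letter AND at least one digit". A common way to express that is a pair of lookaheads in front of the class; here is a minimal sketch, assuming lowercase letters only as in the original pattern (hasLetterAndDigit is just an illustrative name):

// Inside [a-z+\d+], "+" is a literal plus sign, so even a string of
// plus signs satisfies the original pattern:
console.log(/^[a-z+\d+]{6,10}$/.test("++++++")); // true

// Lookaheads can enforce "at least one letter AND at least one digit",
// while the class still limits which characters are allowed overall:
const hasLetterAndDigit = /^(?=.*[a-z])(?=.*\d)[a-z\d]{6,10}$/;

console.log(hasLetterAndDigit.test("abcd12")); // true
console.log(hasLetterAndDigit.test("123456")); // false - no letter
console.log(hasLetterAndDigit.test("abcdef")); // false - no digit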