I tried to do this by looping through a range of letters converted to their ASCII codes, with the range specified by the starting and ending letters of the given string.
But the way I wrote the function gives “undefined”. When I console.log(str.codePointAt(endLetter)), I get “97”, which is the same as the one for startLetter, and I don’t get why. I tried charCodeAt as well and had the same result.
Would someone please tell me what I’m doing wrong here?
My code so far
function fearNotLetter(str) {
  const startLetter = str[0];
  const endLetter = str[str.length - 1];
  let missingLetter;
  for (let i = str.codePointAt(startLetter); i < str.codePointAt(endLetter); ++i) {
    if (!str.includes(String.fromCharCode(i))) {
      missingLetter = String.fromCharCode(i);
    }
  }
  return missingLetter;
}
console.log(fearNotLetter("abce"));
My browser information:
User Agent is: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36 Edg/84.0.522.40.
Digging a bit further and becoming more picky: your code doesn’t have to have quadratic complexity, meaning you don’t really need to search the string for a character inside the for loop. Think about whether you can do it without str.includes.
Try converting the letters into their character codes first and storing those in an array, then just use your counter i to run from 0 to length - 1 as you traverse each letter.
Example: “abcd” →

for (let i = 0; i < str.length; i++) {
  // charCodeAt()
}

→ [97, 98, 99, 100]
Then convert the numbers from that array back into letters and concatenate each onto an empty string, returning the result. A sketch of this idea follows below.
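
Here is a minimal sketch of that hint, keeping the fearNotLetter signature from the original post; reading the gap between neighbouring codes as the missing letter is my own interpretation of the suggestion:

function fearNotLetter(str) {
  // Convert every letter to its character code up front.
  const codes = [];
  for (let i = 0; i < str.length; i++) {
    codes.push(str.charCodeAt(i));
  }
  // Scan neighbouring codes once; a gap bigger than 1 marks the missing letter.
  // No str.includes needed, so the whole thing is a single linear pass.
  for (let i = 0; i < codes.length - 1; i++) {
    if (codes[i + 1] - codes[i] > 1) {
      return String.fromCharCode(codes[i] + 1);
    }
  }
  return undefined; // no letter is missing
}

console.log(fearNotLetter("abce")); // "d"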
Nowhere in your code do you use the actual first letter, only the code point of the first letter. Why extract the first letter and then extract the code of that letter? That’s one unnecessary step.
It’s fairly easy to drop .includes, since you are iterating sequentially until the end of the string.
You don’t need to check the first letter, right? Why not start with the second right away.
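
Putting those hints together, a corrected version might look like the sketch below. The key point is that codePointAt takes an index, not a character: str.codePointAt(startLetter) coerces the non-numeric argument to index 0, which is why both calls in the original code return 97.

function fearNotLetter(str) {
  // codePointAt expects an index: 0 is the first letter, str.length - 1 the last.
  const start = str.codePointAt(0);
  const end = str.codePointAt(str.length - 1);
  // Start from the second letter's code; the first letter needs no check.
  for (let i = start + 1, pos = 1; i <= end; i++, pos++) {
    // Walk the string in step with the expected codes instead of calling .includes.
    if (str.codePointAt(pos) !== i) {
      return String.fromCodePoint(i);
    }
  }
  return undefined;
}

console.log(fearNotLetter("abce")); // "d"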