I know about indexes, but this code uses both var i = 0 and var i = 1. Here it is:
function getLongestWordOfMixedElements(arr) {
  if (arr.length === 0) return ''
  var words = []
  for (var i = 0; i < arr.length; i++) { // <-- here
    if (typeof arr[i] === 'string') {
      words.push(arr[i])
    }
  }
  if (words.length === 0) return ''
  var longestWord = words[0]
  for (var i = 1; i < words.length; i++) { // <-- and here
    if (words[i].length > longestWord.length) { // compare words[i], not words
      longestWord = words[i]
    }
  }
  return longestWord
}
var output = getLongestWordOfMixedElements([3, 'word', 5, 'up', 3, 1]);
console.log(output); // --> 'word'
I don’t understand why anybody would use var i = 1. Why would you start to iterate at the second value?
Ah, that's a different question from your initial one.
In this case the two for loops are doing two different things.
In the first for loop, the array words is being populated with all the string elements of arr, so you need to loop over every element in arr.
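As a side note, that first loop is doing the same job as Array.prototype.filter, so an equivalent (hypothetical) one-liner would be:

```javascript
var arr = [3, 'word', 5, 'up', 3, 1]
// Keep only the string elements, just like the first for loop does
var words = arr.filter(function (el) { return typeof el === 'string' })
console.log(words) // --> ['word', 'up']
```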
In the second for loop, we have assumed that words[0] is the longest word, and we check every other entry in words to see if any of them is longer. It makes no sense to check whether words[0] is longer than itself, so i = 0 is skipped and the loop starts at i = 1.
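This "assume the first element is the best, then compare the rest" pattern shows up anywhere you look for an extreme value. Here is a minimal sketch of the same idea applied to finding the smallest number (getSmallest is a hypothetical helper, not part of your code):

```javascript
function getSmallest(nums) {
  var smallest = nums[0]                    // assume the first value is the winner
  for (var i = 1; i < nums.length; i++) {   // start at 1: no point comparing nums[0] to itself
    if (nums[i] < smallest) {
      smallest = nums[i]
    }
  }
  return smallest
}

console.log(getSmallest([7, 2, 9])) // --> 2
```

Starting at i = 0 here wouldn't be wrong, just wasteful: the first comparison would always be nums[0] < nums[0], which is never true.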