I tossed together a working solution to the Chunky Monkey challenge in the Basic Algorithm Scripting section using a while loop and splice. It seems to work, but I have a question about the while loop, which probably isn't optimal…
function chunkArrayInGroups(arr, size) {
  var chunkArray = [];
  while (arr.length) {
    chunkArray.push(arr.splice(0, size));
  }
  return chunkArray;
}
chunkArrayInGroups(["a", "b", "c", "d", "e", "f", "g", "h", "i", "j", "k", "l"], 4);
The splice eats away at the arr array as it pushes the chunks into chunkArray, until arr.length reaches 0, at which point the while condition becomes falsy and the loop ends. I assume this is poor form, so what would be a better option? A for loop with the length cached up front?
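Here is a small standalone illustration of what I mean, logging the state on each pass (a sketch with a hypothetical six-element array and size 2):

var demo = ["a", "b", "c", "d", "e", "f"];
var chunks = [];
while (demo.length) {
  // splice removes the first two elements from demo and returns them
  chunks.push(demo.splice(0, 2));
  console.log("demo is now:", demo, "| chunks so far:", chunks);
}
// demo is now: ["c","d","e","f"] | chunks so far: [["a","b"]]
// demo is now: ["e","f"]         | chunks so far: [["a","b"],["c","d"]]
// demo is now: []                | chunks so far: [["a","b"],["c","d"],["e","f"]]
// demo.length is now 0, so while (demo.length) is falsy and the loop exits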
Thanks!
I blurred out your working solution so that others who might stumble upon it do not see it immediately.
I do not think your solution has poor form. It is actually concise and elegant. The only thing I would caution against is using the splice function where large arrays are involved… There is quite a bit of work behind the scenes when using splice, because it is removing elements and “reordering” the array to adjust for the missing elements.
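As a quick sketch of the difference: splice mutates the original array, while slice copies elements and leaves it untouched.

var letters = ["a", "b", "c", "d"];
var spliced = letters.splice(0, 2); // removes elements from letters
console.log(spliced); // ["a", "b"]
console.log(letters); // ["c", "d"]  <- original array was mutated

var moreLetters = ["a", "b", "c", "d"];
var sliced = moreLetters.slice(0, 2); // copies elements, removes nothing
console.log(sliced);      // ["a", "b"]
console.log(moreLetters); // ["a", "b", "c", "d"]  <- original array intact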
A more efficient approach would be to use a for loop, as you mentioned, and then make use of the slice function.
function chunkArrayInGroups(arr, size) {
  var chunkArray = [];
  for (var i = 0; i < arr.length; i += size) {
    chunkArray.push(arr.slice(i, i + size));
  }
  return chunkArray;
}
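Called with the same input as in your original post, it produces the same chunks without mutating arr:

chunkArrayInGroups(["a", "b", "c", "d", "e", "f", "g", "h", "i", "j", "k", "l"], 4);
// => [["a","b","c","d"], ["e","f","g","h"], ["i","j","k","l"]]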
Thank you for the feedback!
That makes sense. Also, for larger arrays I had read that it is a good idea to cache the array length up front as well, something like var i = 0, j = arr.length; i < j; i += size. Does that sound right?
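In other words, something like this (just a sketch of what I mean, applied to your slice version):

function chunkArrayInGroups(arr, size) {
  var chunkArray = [];
  // cache the length in j so the length property is read only once
  for (var i = 0, j = arr.length; i < j; i += size) {
    chunkArray.push(arr.slice(i, i + size));
  }
  return chunkArray;
}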
Thanks!
There may or may not be a performance benefit to caching the array length. If there is, it would be negligible. It is an O(1) lookup for the length property of the array and an O(1) lookup on the j variable holding the assigned length, so I just do not see how there could be a performance difference.
You could always run some tests with very large arrays and compare how long the iteration takes in each scenario. I used the following to test on my local machine in my Chrome browser and also ran it with Node on my PC. Basically, my code builds an array of one hundred million numbers and loops through each element. I used console.time and console.timeEnd to track the execution time of each loop. I got essentially the same results across multiple runs.
for (var arr = [], e = 0; e < 100000000; e++) {
  arr.push(e);
}

console.time("caching arr.length");
for (var i = 0, j = arr.length; i < j; i++) {
  // do nothing
}
console.timeEnd("caching arr.length");

console.log(); // space between test results

console.time("not caching arr.length");
for (var i = 0; i < arr.length; i++) {
  // do nothing
}
console.timeEnd("not caching arr.length");