 # Big-O Complexity of concat() vs push.apply() inside Loops

I am studying Big-O complexity and analyzing common JavaScript functions and algorithms to get a good grasp of the topic. Today I am analyzing two ways of merging a large number of arrays, inspired by the Steamroller problem. Is my Big-O analysis of these two methods correct?

Assume that we are given `m` arrays (inside a `parentArray`), each of size `n`. We can flatten these arrays into one via either of the following two loops:

```js
// concat() way:
// Given parentArray of size m:
var res = [], i;
for (i = 0; i < m; ++i) {
  res = res.concat(parentArray[i]);
}
```
```js
// push.apply() way:
// Given parentArray of size m:
var res = [], i;
for (i = 0; i < m; ++i) {
  Array.prototype.push.apply(res, parentArray[i]);
}
```
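(As an aside, a modern equivalent of the `push.apply()` call is `push` with spread syntax; this is only a sketch of the same linear-append idea, not a third case in my analysis. The sample `parentArray` below is made up for illustration.)

```js
// push with spread: spreads the n elements as arguments, so per iteration
// it is still O(n) appends into the existing array, same as push.apply().
var parentArray = [[1, 2], [3, 4], [5, 6]]; // sample input: m = 3, n = 2
var res = [];
for (var i = 0; i < parentArray.length; ++i) {
  res.push(...parentArray[i]); // mutates res in place
}
// res is now [1, 2, 3, 4, 5, 6]
```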

However, according to my analysis, the two algorithms should have different running times. In case 1, per iteration, `concat()` creates a new Array, copies over the previous values and then puts in the new values. In case 2, per iteration, `push.apply()` mutates the existing Array by just putting in the new values.
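The new-array-vs-mutation difference is easy to check directly; a minimal sketch with two small arrays:

```js
// concat() returns a fresh array; push.apply() mutates its target.
var a = [1, 2];
var b = [3, 4];

var viaConcat = a.concat(b);            // new array; a is untouched here
var viaPush = a;
Array.prototype.push.apply(viaPush, b); // appends into a itself

// viaConcat is a different object from a, while viaPush is a itself,
// which is why concat() pays the extra cost of re-copying old elements.
```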

Thus, if my math is correct, `concat()` of `m` Arrays of size `n` should have a complexity of

- `Σ` from `i = 1` to `m` of `[(i − 1)·O(n) + O(n)]` = `O(m²n)`, since at each step `i` we must copy over the previous array of size `(i − 1)n` and then copy in the new values of the current array of size `n`.

On the other hand, `push.apply()` of the same should have a complexity of

- `Σ` from `i = 1` to `m` of `O(n)` = `O(mn)`, since at each step we only copy the new values of the current array of size `n` into the existing array.
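The two sums can be checked numerically by counting element copies instead of timing anything; the helper names below are my own invention for illustration:

```js
// Element copies performed by the concat() loop for m arrays of size n.
// Iteration i (1-based) re-copies the accumulated (i - 1) * n elements
// and then copies the n new ones.
function concatCopies(m, n) {
  var copies = 0;
  for (var i = 1; i <= m; ++i) {
    copies += (i - 1) * n + n;
  }
  return copies; // = n * m * (m + 1) / 2, i.e. quadratic in m
}

// Element copies performed by the push.apply() loop: each iteration
// only appends the n new elements.
function pushCopies(m, n) {
  var copies = 0;
  for (var i = 1; i <= m; ++i) {
    copies += n;
  }
  return copies; // = m * n, i.e. linear in m
}
```

For example, `concatCopies(4, 3)` gives `30` (= 3 · 4 · 5 / 2) while `pushCopies(4, 3)` gives `12` (= 4 · 3), matching the `O(m²n)` vs `O(mn)` totals above.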

Is my analysis correct? I have Googled quite a bit, and all I have found are benchmarks rather than an analysis, which is what I am after.