Multiply Strings - [LeetCode] with JavaScript

I’ve been tackling this problem for far too long and I can’t seem to find what is wrong with my logic.

Prompt:

Given two non-negative integers num1 and num2 represented as strings,
return the product of num1 and num2.

Note:

The length of both num1 and num2 is < 110.

Both num1 and num2 contain only digits 0-9.

Both num1 and num2 do not contain any leading zero.

You must not use any built-in BigInteger library or convert the inputs
to integer directly.
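
So, for example, the behavior I expect is:

multiply("2", "3")                  // expected "6"
multiply("123456789", "987654321")  // expected "121932631112635269"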

Here is my attempt:

var multiply = function(num1, num2) {
  var result = 0;
  // if either number is "0", the product is "0" (no leading zeros allowed)
  if (num1[0] === '0' || num2[0] === '0') { return '0'; }

  var length1 = num1.length - 1;
  var length2 = num2.length - 1;
  var counterI = 0;
  var iBase10 = 1;
  for (var i = length1; i >= 0; i--) {
    // place value of the current digit of num1 (1, 10, 100, ...)
    iBase10 = Math.pow(10, counterI);
    counterI++;
    var counterJ = 0;
    var jBase10 = 1;
    for (var j = length2; j >= 0; j--) {
      // place value of the current digit of num2
      jBase10 = Math.pow(10, counterJ);
      counterJ++;
      // add this digit pair's contribution to the running total
      result += (num1[i] * iBase10) * (num2[j] * jBase10);
    }
  }
  return result.toString();
};

Essentially, the logic is that I keep adding to result each multiplication I make, starting from the right side of the strings and moving left (over all possible digit pairings, hence the nested for loop).

As the indexes move to the left, the base-10 factor is increased by a power of 10 so that each partial product is scaled accordingly.
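
To illustrate, here is how I expect the accumulation to go on a small example (my own hand trace, so this may be exactly where I’m going wrong):

// multiply("12", "34") should return "408"
// i = 1 ('2', place 1):   (2 * 1)  * (4 * 1)  = 8    and  (2 * 1)  * (3 * 10) = 60
// i = 0 ('1', place 10):  (1 * 10) * (4 * 1)  = 40   and  (1 * 10) * (3 * 10) = 300
// 8 + 60 + 40 + 300 = 408, so result.toString() gives "408" as expected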

However, I can’t figure out what is wrong when I call it as follows:

var result = multiply("123456789","987654321")

The result I get is 121932631112635260, yet the actual answer is 121932631112635269.

I’m so close to the answer!
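
One thing I did notice (not sure if it’s related) is that the expected answer is larger than Number.MAX_SAFE_INTEGER:

console.log(Number.MAX_SAFE_INTEGER)                        // 9007199254740991
console.log(121932631112635269 > Number.MAX_SAFE_INTEGER)   // true
console.log(Number("121932631112635269"))                   // 121932631112635260 on my machine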

Source: LeetCode