Any interesting algorithms for testing performance?

I was curious: could you suggest any interesting/simple algorithms that would be useful for testing performance?

A simple example would be one for generating a Fibonacci number. I’m also thinking that fractals are an interesting topic, and rendering the Mandelbrot set might be of use, since it can eat up quite a bit of processing time.
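For instance, the kind of thing I have in mind is a deliberately naive recursive Fibonacci in JavaScript, just as a sketch (the exponential recursion is the point, since it’s slow enough to time):

// Deliberately naive recursive Fibonacci: exponential time, handy as a benchmark.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

console.time("fib(35)");
console.log(fib(35)); // 9227465
console.timeEnd("fib(35)");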

Any ideas or suggestions are much appreciated!

You can also try an algorithm that calculates the Collatz sequence of a number: the intermediate values can grow much larger than the starting number, and the length of the sequence is hard to predict. Here’s an example of how it works:

//    let's say the input is 13
//    if the input is even -> input / 2
//    if the input is odd  -> 3 * input + 1

//    13 -> 40 -> 20 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1

The sequence for 13 has 10 terms, including 13 itself.

It is conjectured, but not proven, that the Collatz sequence of every positive integer eventually reaches 1.

Computing long sequences for many starting values can be resource-consuming.
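A minimal JavaScript sketch of this (the name collatzLength is just my own choice):

// Count the terms in the Collatz sequence of n, including n itself.
function collatzLength(n) {
  let count = 1;
  while (n !== 1) {
    n = n % 2 === 0 ? n / 2 : 3 * n + 1;
    count++;
  }
  return count;
}

console.log(collatzLength(13)); // 10

// A common benchmark: find the longest sequence for starting values below a limit.
let best = 0, bestStart = 0;
for (let i = 1; i < 1000000; i++) {
  const len = collatzLength(i);
  if (len > best) { best = len; bestStart = i; }
}
console.log(bestStart, best); // the longest chain below one million starts at 837799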

1 Like

Generating sequences will take as long as it takes. There isn’t much to optimize.

Finding primes is a good problem for practicing code optimization. Also, any NP-complete problem.
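For example, a rough sketch in JavaScript (the function names are my own) comparing a naive trial-division test with the classic square-root cutoff; counting primes below a limit makes the difference easy to time:

// Naive primality test: trial division by every number below n.
function isPrimeNaive(n) {
  if (n < 2) return false;
  for (let d = 2; d < n; d++) {
    if (n % d === 0) return false;
  }
  return true;
}

// Much faster: only test divisors up to the square root of n.
function isPrimeFaster(n) {
  if (n < 2) return false;
  for (let d = 2; d * d <= n; d++) {
    if (n % d === 0) return false;
  }
  return true;
}

// Count primes below a limit with each version and compare the timings.
function countPrimes(limit, isPrime) {
  let count = 0;
  for (let i = 2; i < limit; i++) {
    if (isPrime(i)) count++;
  }
  return count;
}

console.time("naive");
console.log(countPrimes(50000, isPrimeNaive));  // 5133
console.timeEnd("naive");

console.time("sqrt cutoff");
console.log(countPrimes(50000, isPrimeFaster)); // 5133
console.timeEnd("sqrt cutoff");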

Edit: though there are definitely bad ways to code sequences.

2 Likes

Generating numbers is easy. Finding answers to a problem is where you can pick slow or fast algorithms.

One of the most common examples is sorting an array.

One of the worst ways to sort an array would be to randomly shuffle it and then check whether it is sorted (this approach is sometimes called bogosort). Obviously, if you have a lot of numbers, the odds of a random shuffle landing perfectly in order are tiny. Imagine taking a deck of 52 cards, shuffling it, flipping the cards over, and expecting to find them in perfect order; if they aren’t, you shuffle again. This sort of algorithm could work on the first try, but on average its performance is so completely terrible that it’s very possible the universe will end before it finishes sorting a large array.

On the flip side, you could sort the array using something like merge sort and be able to sort even large arrays within a reasonable amount of time.
Both solve the same problem, but one is practical whereas the other is impractical. It also doesn’t really matter what kind of computing power you use: if the data set is large enough and the algorithm bad enough, even the fastest computer becomes “too slow”.
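A rough JavaScript sketch of that contrast (the helpers shuffle, isSorted and bogosort are just names I picked); keep the bogosort input tiny:

// "Shuffle until sorted" (bogosort). Only safe on very small arrays.
function isSorted(arr) {
  for (let i = 1; i < arr.length; i++) {
    if (arr[i - 1] > arr[i]) return false;
  }
  return true;
}

function shuffle(arr) {
  // Fisher-Yates shuffle, in place.
  for (let i = arr.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [arr[i], arr[j]] = [arr[j], arr[i]];
  }
  return arr;
}

function bogosort(arr) {
  while (!isSorted(arr)) shuffle(arr);
  return arr;
}

console.time("bogosort (7 elements)");
console.log(bogosort([5, 3, 8, 1, 9, 2, 7]));
console.timeEnd("bogosort (7 elements)");

// For comparison, the built-in sort handles far larger arrays with ease.
const big = Array.from({ length: 100000 }, () => Math.random());
console.time("built-in sort (100k elements)");
big.sort((a, b) => a - b);
console.timeEnd("built-in sort (100k elements)");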

You might ask, “but in JavaScript I already have the .sort method, why should I care about this?”, and you’d be right. This is a “solved problem”; however, the underlying solution is still an actual algorithm, and browsers like Firefox actually use merge sort! Furthermore, there are some underlying considerations to keep in mind depending on which sorting algorithm you’re using, such as how the algorithm handles duplicates.
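For example, here is what that duplicate-handling point (stability) looks like in practice; since ES2019, Array.prototype.sort is required to be stable, so elements that compare equal keep their original relative order:

// Sorting by score only: a stable sort keeps players with equal scores
// in their original relative order.
const players = [
  { name: "Ana",  score: 10 },
  { name: "Ben",  score: 7 },
  { name: "Cleo", score: 10 },
];

players.sort((a, b) => b.score - a.score);
console.log(players.map(p => p.name)); // ["Ana", "Cleo", "Ben"] with a stable sort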

Finally, there are “algorithms” that take a while to run and really do just “generate” numbers. Something like Graham’s number is so massive that, again, computing it (or the function behind it, for larger inputs) could take so long that the universe would end first. Beyond the novelty, or getting drawn into where this number even comes from (it’s an estimated upper bound for another problem), there is no “optimization”: the number/function is the answer, or at least part of the answer.

Another fun example is TREE(3), which is so large that it makes Graham’s number look tiny in comparison. I believe this is the number where, if you somehow tried to hold all of its digits in your head, your head would collapse into a black hole (!).

3 Likes