# Shannon Entropy

What is your hint or solution suggestion?

Shannon entropy (the information-theoretic analogue of entropy in Maxwell–Boltzmann thermodynamics; see the Wikipedia page: https://en.wikipedia.org/wiki/Entropy_(information_theory) ) is defined for any given string in terms of a probability function,

P(x_i) = count(x_i) / N, where count(x_i) is the number of occurrences of character x_i and N = str.length.

Entropy can be expressed by the formula below, where x_i is the i-th distinct alphanumeric character in the string and n is the number of distinct characters. Any iterator loop that can pass characters (individual elements) into another array can be used (a forEach, while, or for loop). Using charAt(index) or a forEach element accessor, one can build an array of the unique characters in a given string (or any substring of it), count each character's occurrences, and then sum the per-character contributions to obtain H_base2.
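The counting step described above can be sketched as a single pass over the string; the function and variable names (`countChars`, `counts`) are illustrative, not part of the challenge:

```javascript
// Build a frequency map of each distinct character in one pass.
// Object.keys(counts).length gives n, the number of unique characters.
function countChars(s) {
  const counts = {};
  for (const ch of s) {
    counts[ch] = (counts[ch] || 0) + 1;
  }
  return counts;
}
```

For example, `countChars("hello")` yields a map with 4 distinct keys, with `"l"` counted twice.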

H_base2(X) = sum(i = 1 to n) of P(x_i) * [ log_e(1 / P(x_i)) / log_e(2) ], where n is the number of unique characters in string s (note n != N in general).

This follows from the definition of probability and the change-of-base formula applied to Math.log(), which is the natural logarithm. With count_i = count(x_i), one can write a probability function and a function that computes the counts, then combine them to solve for H_base2. After that, it is a matter of formatting and Promise-based ("promised") value passing. As an aside, a naive implementation that rescans the whole string once per distinct character runs in O(n^2) time; a single-pass frequency map brings it down to O(n).
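Putting the pieces together, a minimal sketch of the whole computation might look like the following, assuming the probability and entropy definitions above (the names `probability` and `shannonEntropy` are illustrative):

```javascript
// P(x_i) = count(x_i) / N
function probability(count, N) {
  return count / N;
}

// H_base2(X) = sum over distinct characters of P(x_i) * log2(1 / P(x_i))
function shannonEntropy(s) {
  const N = s.length;
  const counts = {};
  for (const ch of s) counts[ch] = (counts[ch] || 0) + 1;
  let H = 0;
  for (const ch of Object.keys(counts)) {
    const p = probability(counts[ch], N);
    // Change of base: log2(1/p) = log_e(1/p) / log_e(2)
    H += p * (Math.log(1 / p) / Math.log(2));
  }
  return H;
}
```

As a sanity check, `shannonEntropy("aabb")` is 1 bit (two equiprobable symbols), and `shannonEntropy("aaaa")` is 0 (a single symbol carries no information).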

Challenge: Entropy