Returns the base-2 logarithm of Euler's number, $\log_2(e)$.
LOG2E.CONST()
Example:
LOG2E.CONST()
returns 1.442695041
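A common use of this constant is converting a natural logarithm into a base-2 logarithm. Assuming the formula language also provides a natural-log function (called LN below; the name is an assumption, so check the available functions), a formula such as:
LN(8) * LOG2E.CONST()
returns 3, since $\log_2(8) = \ln(8) \cdot \log_2(e)$.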
Calculating Information Entropy
In information theory, the entropy of a discrete random variable is a measure of the uncertainty associated with the variable. The unit of entropy is bits when the base of the logarithm is 2. For a random variable $X$ with outcomes $x_1, x_2, \ldots, x_n$ and probabilities $p_1, p_2, \ldots, p_n$, the entropy $H(X)$ is:

$$H(X) = -\sum_{i=1}^{n} p_i \log_2(p_i)$$
In some statistical packages or mathematical libraries, the natural logarithm ($\ln$) is more readily available or computationally efficient. We can use the constant $\log_2(e)$ to convert the result from nats (the unit of entropy when using natural logarithms) to bits.
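This works because of the change-of-base identity: multiplying a natural logarithm by $\log_2(e)$ yields the base-2 logarithm, and the same factor carries through the entropy sum:

$$\log_2(x) = \log_2(e) \cdot \ln(x) \quad\Longrightarrow\quad H_{\text{bits}}(X) = \log_2(e) \cdot H_{\text{nats}}(X)$$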
Let's say we have a system with four possible outcomes and the following probabilities:
| Outcome | Probability ($p_i$) |
|---|---|
| A | 0.5 |
| B | 0.25 |
| C | 0.125 |
| D | 0.125 |
We want to calculate the entropy in bits.
Step 1: Calculate the term $p_i \ln(p_i)$ for each outcome.
We can use a table to organize our calculations:
| Outcome | Probability ($p_i$) | $\ln(p_i)$ | $p_i \ln(p_i)$ |
|---|---|---|---|
| A | 0.5 | $\ln(0.5) \approx -0.6931$ | $0.5 \cdot (-0.6931) \approx -0.3466$ |
| B | 0.25 | $\ln(0.25) \approx -1.3863$ | $0.25 \cdot (-1.3863) \approx -0.3466$ |
| C | 0.125 | $\ln(0.125) \approx -2.0794$ | $0.125 \cdot (-2.0794) \approx -0.2599$ |
| D | 0.125 | $\ln(0.125) \approx -2.0794$ | $0.125 \cdot (-2.0794) \approx -0.2599$ |
Step 2: Sum the values of $p_i \ln(p_i)$.

$$\sum_{i=1}^{4} p_i \ln(p_i) \approx -0.3466 - 0.3466 - 0.2599 - 0.2599 = -1.2130$$

The negative of this sum is the entropy in nats:

$$H(X) \approx 1.2130 \text{ nats}$$
Step 3: Convert the result from nats to bits using $\log_2(e) \approx 1.442695$.

$$H(X) \approx 1.2130 \times 1.442695 \approx 1.75 \text{ bits}$$
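For a cross-check outside the form, here is a minimal Python sketch of the same three steps (illustrative only; this is not the form's formula language):

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]

# Steps 1-2: entropy in nats, using the natural logarithm
h_nats = -sum(p * math.log(p) for p in probs)  # ~1.2130

# Step 3: convert nats to bits by multiplying by log2(e)
log2e = math.log2(math.e)                      # ~1.442695041
h_bits = h_nats * log2e                        # ~1.75

print(f"{h_nats:.4f} nats -> {h_bits:.4f} bits")
```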
For comparison, let's calculate the entropy directly using base-2 logarithms:
| Outcome | Probability ($p_i$) | $\log_2(p_i)$ | $p_i \log_2(p_i)$ |
|---|---|---|---|
| A | 0.5 | $\log_2(0.5) = -1$ | $0.5 \cdot (-1) = -0.5$ |
| B | 0.25 | $\log_2(0.25) = -2$ | $0.25 \cdot (-2) = -0.5$ |
| C | 0.125 | $\log_2(0.125) = -3$ | $0.125 \cdot (-3) = -0.375$ |
| D | 0.125 | $\log_2(0.125) = -3$ | $0.125 \cdot (-3) = -0.375$ |
Summing the last column:

$$\sum_{i=1}^{4} p_i \log_2(p_i) = -0.5 - 0.5 - 0.375 - 0.375 = -1.75$$

The entropy is the negative of this sum:

$$H(X) = 1.75 \text{ bits}$$
This confirms that multiplying the entropy in nats by the constant $\log_2(e)$ gives the correct entropy in bits. The constant acts as a conversion factor between the two logarithmic bases.
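As a final check, a self-contained Python sketch comparing both routes (again illustrative, not the form's formula language):

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]

# Route 1: natural logs, then convert nats to bits with log2(e)
h_converted = -sum(p * math.log(p) for p in probs) * math.log2(math.e)

# Route 2: base-2 logs directly
h_direct = -sum(p * math.log2(p) for p in probs)

assert abs(h_converted - h_direct) < 1e-12
print(h_direct)  # 1.75
```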