Shannon Entropy Calculator
Enter event probabilities to compute Shannon entropy in bits (or another log base) along with perplexity. Choose the logarithm base: 2 for bits, e for nats, or 10 for bans.
Example output (log base 2) for the distribution below:
Entropy (bits): 1.485475
Perplexity: 2.800094
| Event | Probability |
|---|---|
| A | 0.200000 |
| B | 0.300000 |
| C | 0.500000 |
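For reference, here is a minimal Python sketch of the same computation. The function name shannon_entropy and its structure are illustrative only, not the calculator's actual implementation:

```python
import math

def shannon_entropy(probs, base=2.0):
    """Entropy of a discrete distribution, in units of the given log base."""
    # Illustrative sketch; not the calculator's actual source.
    total = sum(probs)
    normalized = [p / total for p in probs]  # rescale so the values sum to 1
    # Zero-probability events are skipped, applying the 0 * log(0) = 0 convention.
    return -sum(p * math.log(p, base) for p in normalized if p > 0)

probs = [0.2, 0.3, 0.5]                  # events A, B, C from the table above
h = shannon_entropy(probs, base=2.0)
print(f"Entropy (bits): {h:.6f}")        # 1.485475
print(f"Perplexity: {2.0 ** h:.6f}")     # 2.800094
```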
How to Use This Calculator
- List events with their probabilities; they do not need to sum exactly to 1, since the calculator normalizes them (see the normalization sketch after this list).
- Choose a logarithm base to express entropy in bits, nats, or other units.
- Review entropy and perplexity to understand distribution uncertainty.
- Check the normalized probabilities to confirm the exact distribution used in the calculation.
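For instance, here is a minimal sketch of the normalization step, assuming the calculator simply divides each entry by the total (as stated in the FAQ below):

```python
def normalize(weights):
    """Divide each non-negative value by the total so the results sum to 1."""
    # Assumed behavior: plain division by the sum, per the FAQ on normalization.
    total = sum(weights)
    return [w / total for w in weights]

# Raw weights that do not sum to 1 are rescaled before entropy is computed.
print(normalize([2, 3, 5]))  # [0.2, 0.3, 0.5]
```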
Formula
H = −Σ p_i · log_b(p_i)
Perplexity = b^H
Normalization ensures Σ p_i = 1.
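Applied to the example distribution above (A = 0.2, B = 0.3, C = 0.5) with base b = 2:

H = −(0.2 · log_2 0.2 + 0.3 · log_2 0.3 + 0.5 · log_2 0.5)
  = 0.2 · 2.321928 + 0.3 · 1.736966 + 0.5 · 1.000000
  ≈ 1.485475 bits
Perplexity = 2^1.485475 ≈ 2.800094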
Full Description
Shannon entropy measures the average surprise or uncertainty of a discrete probability distribution. Higher entropy indicates more unpredictability, while lower entropy reflects more certainty.
Perplexity converts entropy into an effective number of equally likely outcomes, providing an intuitive interpretation.
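As a quick check of that interpretation, a uniform distribution over 4 outcomes has perplexity exactly 4, while the skewed example distribution above behaves like roughly 2.8 equally likely outcomes. The perplexity helper below is a hypothetical sketch, not the calculator's code:

```python
import math

def perplexity(probs, base=2.0):
    """base ** entropy: the effective number of equally likely outcomes."""
    # Hypothetical helper for illustration only.
    h = -sum(p * math.log(p, base) for p in probs if p > 0)
    return base ** h

print(f"{perplexity([0.25, 0.25, 0.25, 0.25]):.6f}")  # 4.000000, uniform over 4 outcomes
print(f"{perplexity([0.2, 0.3, 0.5]):.6f}")           # 2.800094, the example distribution above
```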
Frequently Asked Questions
Can probabilities exceed 1 after normalization?
No. Normalization divides each value by the sum, ensuring all probabilities are between 0 and 1 and sum to 1.
What happens if an event has probability zero?
Events with zero probability do not contribute to entropy (0 log 0 is treated as 0).
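In code, this convention typically amounts to skipping zero-probability terms, for example:

```python
import math

probs = [0.5, 0.5, 0.0]
# Skipping p == 0 applies the 0 * log(0) = 0 convention directly.
h = -sum(p * math.log2(p) for p in probs if p > 0)
print(h)  # 1.0, identical to the entropy of [0.5, 0.5]
```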
How do I interpret entropy units?
Bits (base 2) represent binary uncertainty, nats (base e) use natural logs, and bans (base 10) express decimal uncertainty.
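For example, the distribution from the table above yields the same uncertainty expressed on three scales; only the log base changes (a small sketch, not the calculator's source):

```python
import math

probs = [0.2, 0.3, 0.5]
for base, unit in [(2, "bits"), (math.e, "nats"), (10, "bans")]:
    h = -sum(p * math.log(p, base) for p in probs if p > 0)
    print(f"{h:.6f} {unit}")
# 1.485475 bits
# 1.029653 nats
# 0.447173 bans
```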
Is perplexity always ≥ 1?
Yes. Entropy is never negative, so perplexity = b^H ≥ b^0 = 1; it equals 1 for a certain outcome (zero entropy) and grows as uncertainty increases.