What is The Mathematics of Digital Density?
Mathematical Foundation
Laws & Principles
- The Absolute Compression Barrier: Shannon's Source Coding Theorem proves that the entropy rate ($H$) is a strict lower bound for lossless data compression. If a source has a Shannon entropy of exactly 4.0 bits per symbol, no lossless compression algorithm can encode it at an average of 3.9 bits per symbol without destroying data.
- Perfect Uniform Maximization: Entropy reaches its mathematical maximum when every possible outcome is equally likely (e.g., a fair coin, or uniformly random encryption keys). Any deviation toward predictability lowers the entropy.
- The Deterministic Zero-Point: If an outcome is 100% predictable (a rigged coin that lands heads every time, so $P = 1.0$), its contribution $-P \log_2(P) = -1 \cdot \log_2(1)$ is exactly $0$. A certain event carries no surprise, so it has zero information density.
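The two limiting cases above, maximum entropy for a uniform distribution and zero entropy for a certain outcome, can be checked numerically. This is a minimal sketch; the function name `shannon_entropy` is illustrative, not from the text:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Outcomes with p = 0 contribute nothing, so they are skipped
    (the limit of p * log2(p) as p -> 0 is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Perfect Uniform Maximization: a fair coin yields the maximum, 1 bit
print(shannon_entropy([0.5, 0.5]))

# The Deterministic Zero-Point: a certain outcome yields 0 bits
print(shannon_entropy([1.0]))
```

Any bias pulls the value below the uniform maximum: `shannon_entropy([0.9, 0.1])` is roughly 0.47 bits, well under the 1 bit of a fair coin.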
Step-by-Step Example Walkthrough
" A data engineer evaluates the information density of an extremely biased digital sensor that only outputs a '1' 90% of the time, and a '0' 10% of the time. "
- 1. Identify absolute probabilities: $P(1) = 0.90$ and $P(0) = 0.10$.
- 2. Calculate State 1: Multiply $0.90$ by $\log_2(0.90) \approx -0.1520$ to get $-0.1368$.
- 3. Calculate State 2: Multiply $0.10$ by $\log_2(0.10) \approx -3.3219$ to get $-0.3322$.
- 4. Sum the terms and negate, per the entropy formula: $H = -(-0.1368 - 0.3322) \approx 0.4690$ bits per symbol.
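The four steps above map directly onto a short calculation; this sketch mirrors the walkthrough term by term (variable names are illustrative):

```python
import math

# Step 1: identify the probabilities of the biased sensor
p1, p0 = 0.90, 0.10

# Steps 2 and 3: compute each p * log2(p) term
term1 = p1 * math.log2(p1)   # approx -0.1368
term0 = p0 * math.log2(p0)   # approx -0.3322

# Step 4: sum the terms and negate to get H in bits per symbol
entropy = -(term1 + term0)
print(round(entropy, 4))     # prints 0.469
```

Because the sensor is so predictable, each symbol carries only about 0.47 bits of information rather than the full 1 bit a fair binary source would deliver.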