Talk:Entropy: Difference between revisions
2) Normalized specific entropy: '''H<sub>n</sub> = H / log(n).'''
The division converts the logarithm base of H to n. The units are entropy/symbol, or equivalently 1/symbol. It ranges from 0 to 1. A value of 1 means every symbol occurred equally often, N/n times each. A value near 0 means that, in a very long file, every symbol except one occurred only once and the rest of the file was the remaining symbol. "Log" is in the same base as H.
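As an illustrative sketch (my own code, not from the article; function names are my choice, and base-2 logs are assumed so H comes out in bits/symbol), the specific and normalized specific entropy can be computed like this:

```python
import math
from collections import Counter

def specific_entropy(data):
    """Specific entropy H in bits per symbol (base-2 log)."""
    N = len(data)
    counts = Counter(data)
    return -sum((c / N) * math.log2(c / N) for c in counts.values())

def normalized_specific_entropy(data):
    """H / log2(n), where n counts the distinct symbols; ranges from 0 to 1."""
    n = len(set(data))
    if n < 2:
        return 0.0  # one distinct symbol: H is 0, nothing to normalize
    return specific_entropy(data) / math.log2(n)

print(normalized_specific_entropy("abab"))  # 1.0: both symbols equally frequent
```

Because the normalization divides by log(n) in the same base, the result is independent of which logarithm base was chosen for H.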
3) Total entropy '''S' = N * H.'''
4) Normalized total entropy '''S<sub>n</sub>' = N * H / log(n).''' See "gotcha" below in choosing n.
The unit is "entropy". It varies from 0 to N, reaching N when all n symbols occur equally often.
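A minimal sketch of items 3 and 4 (again my own code with base-2 logs assumed; the maximum case below shows why S<sub>n</sub>' tops out at N):

```python
import math
from collections import Counter

def total_entropies(data):
    """Return (S', S_n'): total entropy N*H and N*H/log2(n), base-2 logs."""
    N = len(data)
    counts = Counter(data)
    H = -sum((c / N) * math.log2(c / N) for c in counts.values())
    n = len(counts)
    S_total = N * H
    S_norm = S_total / math.log2(n) if n > 1 else 0.0
    return S_total, S_norm

# Four distinct symbols, all equally frequent: H = 2 bits/symbol,
# so S' = 4 * 2 = 8 and S_n' = 8 / log2(4) = 4 = N.
print(total_entropies("abcd"))  # (8.0, 4.0)
```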
5) Physical entropy S of a binary file when the data is stored perfectly efficiently (using Landauer's limit): '''S = S' * k<sub>B</sub> / log(e)''' |
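Item 5's conversion can be sketched as follows (my own code; base-2 logs assumed, so dividing by log<sub>2</sub>(e) is the same as multiplying by ln 2, giving the familiar Landauer factor k<sub>B</sub> ln 2 per bit):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def physical_entropy(S_total):
    """S = S' * k_B / log(e); with base-2 logs, 1/log2(e) equals ln(2)."""
    return S_total * K_B / math.log2(math.e)

# Example with my own numbers: one gigabyte (8e9 bits) of maximally
# random data stored perfectly efficiently.
print(physical_entropy(8e9))  # ~7.7e-14 J/K
```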