
Talk:Entropy: Difference between revisions

 
2) Normalized specific entropy: '''H<sub>n</sub> = H / log(n).'''
The division converts the logarithm base of H to n. Units are entropy/symbol or 1/symbol. It ranges from 0 to 1. A value of 1 means every distinct symbol occurred equally often, N/n times each. A value near 0 means that, in a very long file, every symbol but one occurred only once and the remaining symbol filled the rest of the file. "Log" is in the same base as H.
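For concreteness, a minimal Python sketch of H and H<sub>n</sub> (not from the original comment; it assumes H is the usual Shannon specific entropy computed from symbol counts, and the function names are illustrative):

<syntaxhighlight lang="python">
from collections import Counter
from math import log2

def specific_entropy(data):
    """H in bits/symbol: -sum(p * log2(p)) over the distinct symbols."""
    N = len(data)
    return -sum((c / N) * log2(c / N) for c in Counter(data).values())

def normalized_specific_entropy(data):
    """H_n = H / log2(n); dividing by log2(n) makes the result base-independent."""
    n = len(set(data))           # number of distinct symbols present
    if n < 2:
        return 0.0               # one repeated symbol carries no entropy
    return specific_entropy(data) / log2(n)

print(normalized_specific_entropy("abab"))           # 1.0: both symbols equally frequent
print(normalized_specific_entropy("a" * 999 + "b"))  # ~0.011: near 0
</syntaxhighlight>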
 
3) Total entropy '''S' = N * H.'''
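A quick worked example of S' under the same illustrative assumptions:

<syntaxhighlight lang="python">
# Total entropy S' = N * H scales the per-symbol entropy H by the file length N.
data = "abab" * 250    # N = 1000 symbols, two distinct symbols
N = len(data)
H = 1.0                # bits/symbol for this equal-frequency two-symbol sequence
S_prime = N * H        # S' = 1000 bits total
print(S_prime)
</syntaxhighlight>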
 
4) Normalized total entropy '''S<sub>n</sub>' = N * H / log(n).''' See the "gotcha" below about choosing n.
It has no units, in the same way a ratio has no units. Notice that the formula uses a ratio derived from the data itself rather than a logarithm base chosen by a person. This is the total entropy of the symbols. It varies from 0 to N.
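A worked check of S<sub>n</sub>' for the same hypothetical file, showing the maximum value N is reached when all symbols are equally frequent:

<syntaxhighlight lang="python">
from math import log2

# S_n' = N * H / log(n): H and log(n) use the same base, so the base cancels.
N, H, n = 1000, 1.0, 2          # illustrative values from the example above
S_n_prime = N * H / log2(n)     # = 1000.0, the maximum, since log2(2) = 1
print(S_n_prime)
</syntaxhighlight>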
 
5) Physical entropy S of a binary file when the data is stored perfectly efficiently (at Landauer's limit): '''S = S' * k<sub>B</sub> / log(e).''' Dividing by log(e) converts S' to nats, so with base-2 logarithms this is S = S' * k<sub>B</sub> * ln(2).
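A numeric sketch of the Landauer conversion, assuming S' is in bits (the file size is illustrative; k<sub>B</sub> is the exact SI value):

<syntaxhighlight lang="python">
from math import e, log2

# S = S' * k_B / log2(e) = S' * k_B * ln(2), in joules per kelvin.
k_B = 1.380649e-23    # Boltzmann constant, J/K
S_bits = 1000.0       # total entropy S' in bits, from the example above
S_physical = S_bits * k_B / log2(e)
print(S_physical)     # ~9.57e-21 J/K
</syntaxhighlight>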