Talk:Entropy

[[User:CRGreathouse|CRGreathouse]] ([[User talk:CRGreathouse|talk]]) 02:08, 17 April 2013 (UTC)
 
:I don't know. It's not simple. I was a bit puzzled by this task myself, as I struggled to understand what the entropy of a string is. Normally entropy is defined for a system whose internal state is known only probabilistically. A string is known exactly, so strictly speaking the entropy of any string is zero. As I understand it, the task defines the entropy of a string as the entropy of whatever system "generated" the string. The string is thus treated as a representative sample of the probability of each possible character, with the character seen as a random variable.
:In your example you say the entropy is 300 bits, but I'm not sure that makes much sense. 300 bits is the amount of information in the string if you consider it as the result of picking a random number from 0 to 10^300. The entropy is then zero, since your string is exactly defined.
:The entropy as currently defined in the task does not directly depend on the "size" of the string in memory, because it merely represents the probability table of a stochastic process.
:--[[User:Grondilu|Grondilu]] ([[User talk:Grondilu|talk]]) 15:42, 17 April 2013 (UTC)
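:For illustration, here is a minimal sketch (in Python, with a hypothetical helper name <code>shannon_entropy</code>) of the definition the task seems to use: the empirical character frequencies of the string are taken as the probability table of the generating source, and the Shannon entropy is computed from them in bits per character. Multiplying by the string length then gives the total information content under that model, which is the distinction discussed above.
<pre>
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Shannon entropy, in bits per character, of the empirical
    character distribution of the string s (hypothetical helper)."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Example string with character probabilities {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
s = "1223334444"
h = shannon_entropy(s)
print(h)           # ~1.8464 bits per character
print(h * len(s))  # ~18.46 bits of total information under that model
</pre>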