Talk:Entropy: Difference between revisions
For what it's worth.--[[User:Grondilu|Grondilu]] 18:14, 4 March 2013 (UTC)
== Description ==
I think the task is not described correctly. It calculates the average entropy ''per character'', not the entropy of the message as a whole.
For example, I generated this 100-digit string of digits 0 through 7 from random.org. It has 300 bits of entropy (more conservatively, 299 to 300 bits, in case the generation is slightly flawed). The task gives it just under 3:
<lang parigp>entropy("1652410507230105455225274644011734652475143261241037401534561367707447375435416503021223072520334062")
%1 = 2.9758111700170733830744745234131842224</lang>
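To make the distinction concrete, here is a small Python sketch (not part of the original discussion) of the empirical Shannon formula the task computes. It yields the average bits ''per character''; multiplying by the message length gives a figure for the whole string, which for this message is just under the ideal 300 bits (3 bits per octal digit × 100 digits):

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Average empirical Shannon entropy per character, in bits."""
    n = len(s)
    counts = Counter(s)
    return -sum(c / n * log2(c / n) for c in counts.values())

msg = ("1652410507230105455225274644011734652475"
       "1432612410374015345613677074473754354165"
       "03021223072520334062")

per_char = shannon_entropy(msg)
total = per_char * len(msg)

print(per_char)  # ~2.9758 bits per character, matching the PARI/GP result
print(total)     # ~297.6 bits for the whole 100-digit message
```

The per-character value falls slightly below 3 because the empirical digit frequencies in any finite sample deviate from perfectly uniform, even when the source itself is uniform.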
[[User:CRGreathouse|CRGreathouse]] ([[User talk:CRGreathouse|talk]]) 02:08, 17 April 2013 (UTC) |