Rosetta Code -> 3.0849625007211556
</pre>
 
=={{header|Lambdatalk}}==
<lang scheme>
{def entropy

 {def entropy.count
  {lambda {:s :c :i}
   {let { {:c {/ {A.get :i :c} {A.length :s}}}
   } {* :c {log2 :c}}}}}

 {def entropy.sum
  {lambda {:s :c}
   {- {+ {S.map {entropy.count :s :c}
                {S.serie 0 {- {A.length :c} 1}}}}}}}

 {lambda {:s}
  {entropy.sum {A.split :s} {cdr {W.frequency :s}}}}}
-> entropy
 
The W.frequency function is explained in rosettacode.org/wiki/Letter_frequency#Lambdatalk
 
{def txt 1223334444}
-> txt
{def F {W.frequency {txt}}}
-> F
characters: {car {F}} -> [1,2,3,4]
frequencies: {cdr {F}} -> [1,2,3,4]
{entropy {txt}}
-> 1.8464393446710154
 
{entropy 0}
-> 0
{entropy 00000000000000}
-> 0
{entropy 11111111111111}
-> 0
{entropy 01}
-> 1
{entropy Lambdatalk}
-> 2.8464393446710154
{entropy entropy}
-> 2.807354922057604
{entropy abcdefgh}
-> 3
{entropy Rosetta Code}
-> 3.084962500721156
{entropy Longtemps je me suis couché de bonne heure}
-> 3.8608288771249444
{entropy abcdefghijklmnopqrstuvwxyz}
-> 4.70043971814109
{entropy abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz}
-> 4.70043971814109
 
</lang>
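The entropy function builds the character frequency table with W.frequency, turns each count into a probability p, and returns the negated sum of the p·log2 p terms. As a hand check of the first result above: the string 1223334444 has ten characters, with 1, 2, 3 and 4 occurring 1, 2, 3 and 4 times respectively, so

<math>H = -\sum_i p_i \log_2 p_i = -\left(0.1\log_2 0.1 + 0.2\log_2 0.2 + 0.3\log_2 0.3 + 0.4\log_2 0.4\right) \approx 1.8464</math>

which agrees with the value 1.8464393446710154 returned by {entropy {txt}}.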
 
=={{header|Lang5}}==