=={{header|Picat}}==
<lang Picat>go =>
  ["1223334444",
   "Rosetta Code is the best site in the world!",
   "1234567890abcdefghijklmnopqrstuvwxyz",
   "Picat is fun"].map(entropy).println(),
  nl.
 
 
% probabilities of each element/character in L
entropy(L) = Entropy =>
  Len = L.length,
  % occurrence count of each distinct element, as Element=Count pairs
  Occ = [C=Count : C in L.sort_remove_dups(), Count = sum([1 : E in L, E == C])],
  Entropy = -sum([P2*log2(P2) : _C=P in Occ, P2 = P/Len]).</lang>
 
{{out}}
<pre>[1.846439344671016,3.646513010214172,5.169925001442309,3.251629167387823]</pre>
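
Each value is the Shannon entropy <math>H = -\sum_i p_i \log_2 p_i</math> (in bits per character) of the corresponding test string. As a check, taking the first test string to be the task's usual "1223334444" (one '1', two '2's, three '3's and four '4's in ten characters):

<math>H = -(0.1\log_2 0.1 + 0.2\log_2 0.2 + 0.3\log_2 0.3 + 0.4\log_2 0.4) \approx 1.846439</math>

which matches the first value above.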
 
 
=={{header|PicoLisp}}==