Computes the Shannon entropy of all grouped values.
entropy(xs:list, [normalize=bool]) -> float
Description
The entropy function calculates the Shannon entropy of the values in xs, which measures the amount of uncertainty or randomness in the data. Higher entropy values indicate more randomness, while lower values indicate more predictability.
The entropy is calculated as H(x) = -sum(p(x[i]) * log(p(x[i]))), where p(x[i]) is the probability of each unique value and log is the natural logarithm.
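The formula above can be sketched in Python. This is a minimal illustration of the definition, not the engine's actual implementation; the function name and signature here are for demonstration only.

```python
from collections import Counter
from math import log

def shannon_entropy(xs):
    """H(x) = -sum(p(x[i]) * log(p(x[i]))), using the natural log.

    p(x[i]) is the relative frequency of each unique value in xs.
    """
    counts = Counter(xs)
    n = len(xs)
    return -sum((c / n) * log(c / n) for c in counts.values())

# The data from the first example: two of four values are 1,
# so p = [0.5, 0.25, 0.25].
print(shannon_entropy([1, 1, 2, 3]))  # 1.0397207708399179
```

The natural log is inferred from the documented example outputs: -(0.5·ln 0.5 + 2·0.25·ln 0.25) = 1.0397…, which matches the first example below.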
xs: list
The values to evaluate.
normalize: bool (optional)
Optional parameter to normalize the entropy between 0 and 1. When true, the entropy is divided by log(number of unique values). Defaults to false.
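Normalization can be sketched the same way: divide the raw entropy by the natural log of the number of unique values, so a uniform distribution yields 1. This is an illustrative helper, not the engine's implementation.

```python
from collections import Counter
from math import log

def normalized_entropy(xs):
    """Entropy divided by log(number of unique values), as entropy(x, normalize=true) does."""
    counts = Counter(xs)
    n = len(xs)
    h = -sum((c / n) * log(c / n) for c in counts.values())
    # Note: with a single unique value, log(1) = 0 and this division is
    # undefined; how the engine handles that case is not documented here.
    return h / log(len(counts))

print(normalized_entropy([1, 1, 2, 3]))  # 0.946394630357186
```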
Examples
Compute the entropy of values
from {x: 1}, {x: 1}, {x: 2}, {x: 3}
summarize entropy_value=entropy(x)
{ entropy_value: 1.0397207708399179 }
Compute the normalized entropy
from {x: 1}, {x: 1}, {x: 2}, {x: 3}
summarize normalized_entropy=entropy(x, normalize=true)
{ normalized_entropy: 0.946394630357186 }