5.6.2.12. statsmodels.sandbox.infotheo.shannonentropy

statsmodels.sandbox.infotheo.shannonentropy(px, logbase=2)[source]

Compute Shannon's entropy of a distribution.

Parameters:

px : 1d or 2d array_like

Can be a discrete probability distribution, a 2d joint distribution, or a sequence of probabilities.

logbase : int or np.e

The base of the log.

Returns:

For log base 2 (bits) given a discrete distribution

H(p) = sum(px * log2(1/px)) = -sum(px * log2(px)) = E[log2(1/p(X))]
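The discrete-distribution formula above can be sketched in NumPy as follows. This is a minimal illustration of the formula, not the statsmodels implementation; the helper name `shannon_entropy` is hypothetical.

```python
import numpy as np

def shannon_entropy(px, logbase=2):
    # H(p) = -sum(px * log(px)), converted to the requested log base.
    # Terms with px == 0 contribute 0, following the 0 * log(0) = 0 convention.
    px = np.asarray(px, dtype=float)
    terms = np.zeros_like(px)
    mask = px > 0
    terms[mask] = px[mask] * np.log(px[mask])
    return -terms.sum() / np.log(logbase)

shannon_entropy([0.5, 0.5])        # fair coin: 1 bit
shannon_entropy([0.25] * 4)        # uniform over 4 outcomes: 2 bits
```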

For log base 2 (bits) given a joint distribution

H(px, py) = -sum_{k,j} w_{kj} * log2(w_{kj})
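For the joint case, the same sum simply runs over every cell of the 2d weight matrix. A hedged sketch (again not the statsmodels code; `joint_shannon_entropy` is a hypothetical name):

```python
import numpy as np

def joint_shannon_entropy(w, logbase=2):
    # H(px, py) = -sum_{k,j} w_{kj} * log(w_{kj}), with 0 * log(0) := 0.
    w = np.asarray(w, dtype=float).ravel()
    mask = w > 0
    return -(w[mask] * np.log(w[mask])).sum() / np.log(logbase)

# Two independent fair coins: joint entropy = 1 + 1 = 2 bits.
joint_shannon_entropy([[0.25, 0.25], [0.25, 0.25]])
```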

Notes

shannonentropy(0) is defined as 0, following the convention 0 * log(0) = 0.