5.6. statsmodels.sandbox.infotheo
Information Theoretic and Entropy Measures
5.6.1. References
- Golan, A. 2008. “Information and Entropy Econometrics – A Review and Synthesis.” Foundations and Trends in Econometrics 2(1-2), 1-145.
- Golan, A., Judge, G., and Miller, D. 1996. Maximum Entropy Econometrics. Wiley & Sons, Chichester.
5.6.2. Functions
| Function | Description |
| --- | --- |
| `bitstonats(X)` | Converts from bits to nats. |
| `condentropy(px, py[, pxpy, logbase])` | Returns the conditional entropy of X given Y. |
| `corrent(px, py, pxpy[, logbase])` | An information-theoretic correlation measure. |
| `covent(px, py, pxpy[, logbase])` | An information-theoretic covariance measure. |
| `discretize(X[, method, nbins])` | Discretizes X. |
| `gencrossentropy(px, py, pxpy[, alpha, ...])` | Generalized cross-entropy measures. |
| `logbasechange(a, b)` | The factor for the one-to-one transformation of an entropy value from one log base to another. |
| `logsumexp(a[, axis])` | Computes the log of the sum of exponentials, log(e^{a_1} + ... + e^{a_n}), of the elements of a. |
| `mutualinfo(px, py, pxpy[, logbase])` | Returns the mutual information between X and Y. |
| `natstobits(X)` | Converts from nats to bits. |
| `renyientropy(px[, alpha, logbase, measure])` | Rényi’s generalized entropy. |
| `shannonentropy(px[, logbase])` | Shannon’s entropy. |
| `shannoninfo(px[, logbase])` | Shannon’s information. |
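As a quick orientation, here is a minimal usage sketch of the entropy and mutual-information functions above. It assumes `px` and `py` are discrete probability vectors and that `pxpy` is their joint distribution as a 2-D array; independence (via `np.outer`) is used here only to construct a valid joint, not as a requirement of the API.

```python
import numpy as np
from statsmodels.sandbox.infotheo import (
    bitstonats, condentropy, mutualinfo, shannonentropy)

# Marginal distributions for two discrete variables X and Y
px = np.array([0.5, 0.5])
py = np.array([0.75, 0.25])

# A joint distribution as a 2-D array; independence is assumed here
# purely so that pxpy is a valid joint with these marginals
pxpy = np.outer(py, px)

h_x = shannonentropy(px)            # H(X); optional logbase left at its default
h_cond = condentropy(px, py, pxpy)  # H(X|Y)
mi = mutualinfo(px, py, pxpy)       # I(X;Y) = H(X) - H(X|Y)

print(h_x, h_cond, mi)
print(bitstonats(h_x))              # the same entropy expressed in nats
```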
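`discretize` pairs naturally with the entropy functions: bin a continuous sample, form an empirical probability vector from the bin counts, and score it. A sketch, under the assumption that `discretize` returns an array of bin labels and using its default binning method (`nbins` is the optional bin count from the signature above):

```python
import numpy as np
from statsmodels.sandbox.infotheo import discretize, shannonentropy

rng = np.random.default_rng(0)
x = rng.normal(size=1000)

# Bin the continuous sample into 10 bins (default binning method assumed)
bins = discretize(x, nbins=10)

# Turn the bin labels into an empirical probability vector
_, counts = np.unique(bins, return_counts=True)
p_hat = counts / counts.sum()

# Entropy of the binned sample, in bits
print(shannonentropy(p_hat))
```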
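Finally, a small sketch contrasting `renyientropy` with the Shannon case. `alpha` is the order parameter from the signature above; a standard property of Rényi entropy (assumed to hold for this implementation) is that it reduces to Shannon entropy at order 1:

```python
import numpy as np
from statsmodels.sandbox.infotheo import renyientropy, shannonentropy

px = np.array([0.2, 0.3, 0.5])

# Order-2 Renyi entropy (collision entropy)
print(renyientropy(px, alpha=2))

# At order 1, Renyi entropy coincides with Shannon entropy
print(renyientropy(px, alpha=1))
print(shannonentropy(px))
```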