for the probability distribution `qk` when the true distribution is
`pk`. It is not computed directly by `entropy`, but it can be computed
using two calls to the function (see Examples). See [2]_ for more
information.

References
----------
.. [1] Shannon, C.E. (1948), A Mathematical Theory of Communication.
       Bell System Technical Journal, 27: 379-423.
       https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
.. [2] Thomas M. Cover and Joy A. Thomas. 2006. Elements of Information
       Theory (Wiley Series in Telecommunications and Signal Processing).
       Wiley-Interscience, USA.

Examples
--------
The outcome of a fair coin is the most uncertain:

>>> import numpy as np
>>> from scipy.stats import entropy
>>> base = 2  # work in units of bits
>>> pk = np.array([1/2, 1/2])  # fair coin
>>> H = entropy(pk, base=base)
>>> H
1.0
>>> H == -np.sum(pk * np.log(pk)) / np.log(base)
True

The outcome of a biased coin is less uncertain:

>>> qk = np.array([9/10, 1/10])  # biased coin
>>> entropy(qk, base=base)
0.46899559358928117

The relative entropy between the fair coin and biased coin is calculated
as:

>>> D = entropy(pk, qk, base=base)
>>> D
0.7369655941662062
>>> np.isclose(D, np.sum(pk * np.log(pk/qk)) / np.log(base), rtol=4e-16, atol=0)
True

The cross entropy can be calculated as the sum of the entropy and
relative entropy:

>>> CE = entropy(pk, base=base) + entropy(pk, qk, base=base)
>>> CE
1.736965594166206
>>> CE == -np.sum(pk * np.log(qk)) / np.log(base)
True
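
Note that relative entropy is not symmetric in its arguments; swapping
the two distributions generally gives a different value. A quick check,
using only the distributions and calls shown above:

>>> entropy(pk, qk, base=base) == entropy(qk, pk, base=base)
False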