Divergence

The information-theoretic toolkit for Python.

Divergence computes statistical measures of entropy, divergence, and dependence from probability distributions and samples. It provides a unified API spanning Shannon measures, f-divergences, Rényi families, integral probability metrics, kNN estimators, score-based measures, optimal transport, and Bayesian MCMC diagnostics.
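
For orientation, the core definitions behind the Shannon and f-divergence families (stated here for discrete P and Q; the library also lists estimators for continuous samples) are the standard ones:

    H(P) = -\sum_x P(x) \log P(x)

    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}

    D_f(P \,\|\, Q) = \sum_x Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right), \qquad f(t) = t \log t \ \text{recovers KL}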

Why Divergence?

What you need, and what Divergence provides:

Compare two distributions: KL, Jensen-Shannon, Hellinger, total variation, energy distance, MMD, Wasserstein, Sinkhorn
Measure uncertainty: Shannon entropy, Rényi entropy, kNN entropy
Detect dependence: mutual information, total correlation, NMI, variation of information
Detect causality: transfer entropy
Test whether two samples share a distribution (H₀: P = Q): permutation tests with MMD, energy, and kNN statistics (sketched after this list)
Assess MCMC convergence: chain divergence, KSD, two-sample tests, mixing diagnostics
Bayesian diagnostics: information gain, surprise, uncertainty decomposition, prior sensitivity
Goodness-of-fit without the normalizing constant Z: kernel Stein discrepancy (RBF and IMQ kernels)
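
To make the permutation-test row concrete, here is a minimal from-scratch sketch of an energy-distance permutation test in plain NumPy. It illustrates the idea only and is not Divergence's implementation; the O(n²) pairwise distances make it practical only for small samples.

import numpy as np

def energy_distance(x, y):
    # Energy distance for 1-D samples: 2*E|X-Y| - E|X-X'| - E|Y-Y'|,
    # estimated from all pairwise absolute differences.
    xy = np.abs(x[:, None] - y[None, :]).mean()
    xx = np.abs(x[:, None] - x[None, :]).mean()
    yy = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * xy - xx - yy

def energy_permutation_test(x, y, n_permutations=200, seed=0):
    # Permutation test of H0: x and y are drawn from the same distribution.
    rng = np.random.default_rng(seed)
    observed = energy_distance(x, y)
    pooled = np.concatenate([x, y])
    exceed = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # random relabelling of the pooled sample under H0
        stat = energy_distance(pooled[:len(x)], pooled[len(x):])
        exceed += int(stat >= observed)
    return (exceed + 1) / (n_permutations + 1)  # add-one-smoothed p-value

rng = np.random.default_rng(0)
p_value = energy_permutation_test(rng.normal(0, 1, 300), rng.normal(0.5, 1.2, 300))
print(f"p-value: {p_value:.4f}")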

Quick Example

import numpy as np
from divergence import entropy, kl_divergence, two_sample_test

rng = np.random.default_rng(42)
p = rng.normal(0, 1, 5000)
q = rng.normal(0.5, 1.2, 5000)

# Entropy estimated from the sample p
h = entropy(p)

# KL divergence between the samples p and q
kl = kl_divergence(p, q)

# Permutation two-sample test of H0: p and q come from the same distribution
result = two_sample_test(p, q, method="energy", n_permutations=500)
print(f"p-value: {result.p_value:.4f}")

Installation

pip install divergence

For Bayesian diagnostics with ArviZ:

pip install "divergence[bayesian]"

Learn More

Start with the tutorials — nine interactive notebooks that build from Shannon's foundations through end-to-end Bayesian inference, applied case studies, and a goodness-of-fit pass with kernel Stein discrepancy.

For the complete function reference, see the API documentation.