| Function | Description |
| --- | --- |
| combinations | Combinations of input arguments |
| constDiagMatrix | Constant plus diagonal matrix |
| convergence_rate | Empirical convergence rate of a KL divergence estimator |
| is_two_sample | Detect if a one- or two-sample problem is specified |
| kld_ci_bootstrap | Uncertainty of KL divergence estimate using Efron's bootstrap |
| kld_ci_subsampling | Uncertainty of KL divergence estimate using Politis/Romano's subsampling bootstrap |
| kld_discrete | Analytical KL divergence for two discrete distributions |
| kld_est | Kullback-Leibler divergence estimator for discrete, continuous, or mixed data |
| kld_est_brnn | Bias-reduced generalized k-nearest-neighbour KL divergence estimation |
| kld_est_discrete | Plug-in KL divergence estimator for samples from discrete distributions |
| kld_est_kde | Kernel density-based Kullback-Leibler divergence estimation in any dimension |
| kld_est_kde1 | 1-D kernel density-based estimation of Kullback-Leibler divergence |
| kld_est_kde2 | 2-D kernel density-based estimation of Kullback-Leibler divergence |
| kld_est_nn | k-nearest-neighbour KL divergence estimator |
| kld_exponential | Analytical KL divergence for two univariate exponential distributions |
| kld_gaussian | Analytical KL divergence for two uni- or multivariate Gaussian distributions |
| kld_uniform | Analytical KL divergence for two uniform distributions |
| kld_uniform_gaussian | Analytical KL divergence between a uniform and a Gaussian distribution |
| mvdnorm | Probability density function of a multivariate Gaussian distribution |
| to_uniform_scale | Transform samples to uniform scale |
| tr | Matrix trace operator |
| trapz | Trapezoidal integration in 1 or 2 dimensions |
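
The sketch below illustrates how a few of the functions listed above might be combined: two Gaussian samples are drawn, the KL divergence is estimated nonparametrically with `kld_est_nn` and `kld_est_kde2`, and the result is compared against the exact value from `kld_gaussian`. The argument orders shown are assumptions made for illustration (two sample matrices for the estimators, the parameters of both Gaussians for the analytical helper); consult each function's help text for the exact interface. A confidence interval could then be attached to the estimate with `kld_ci_bootstrap` or `kld_ci_subsampling`.

```matlab
% Usage sketch (assumed interfaces, see each function's help text):
%   - kld_est_nn / kld_est_kde2 are assumed to take the two samples
%     X ~ P and Y ~ Q as n-by-d matrices,
%   - kld_gaussian is assumed to take the parameters of P followed by Q.

rng(1);                               % reproducible samples
n = 1000;

% P = N([0 0], I), Q = N([1 0], 2*I), both 2-dimensional
X = randn(n, 2);                      % sample from P
Y = [1 0] + sqrt(2) * randn(n, 2);    % sample from Q

% Nonparametric estimates from the two samples
kld_nn  = kld_est_nn(X, Y);           % k-nearest-neighbour estimator
kld_kde = kld_est_kde2(X, Y);         % 2-D kernel density estimator

% Analytical reference value for the two Gaussians
kld_true = kld_gaussian([0 0], eye(2), [1 0], 2*eye(2));

fprintf('kNN: %.3f   KDE: %.3f   exact: %.3f\n', kld_nn, kld_kde, kld_true);
```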