spkit.entropy_kld
- spkit.entropy_kld(x, y, base=2, is_discrete=False, bins='fd', verbose=False, pre_version=False, return_n_bins=False, esp=1e-10)
Cross Entropy Kullback–Leibler divergence \(H_{kl}(X,Y)\)
\[H_{kl}(X,Y) = \sum_{x} P_x \log\left(\frac{P_x}{P_y}\right)\]
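Conceptually, \(P_x\) and \(P_y\) are probability distributions estimated from x and y over a common binning. The following is a minimal, hypothetical sketch of the quantity using plain NumPy histograms (not spkit's internal bin-width selection), with a small epsilon analogous to the esp argument:

>>> import numpy as np
>>> x = np.random.randn(1000)
>>> y = 0.5*np.random.randn(1000) + 0.2
>>> edges = np.histogram_bin_edges(np.r_[x, y], bins=64)   # common bin edges for both signals
>>> Px = np.histogram(x, bins=edges)[0]/len(x)
>>> Py = np.histogram(y, bins=edges)[0]/len(y)
>>> eps = 1e-10                                            # guard against empty bins, like esp
>>> H_kl = np.sum(Px*np.log2((Px + eps)/(Py + eps)))       # in bits (base=2)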
- Parameters:
- x, y : 1d-arrays
- is_discrete: bool, default=False.
If True, the frequencies of unique values are used to estimate H_{kl}(x,y) (see the discrete example at the end of the Examples section)
- base: base of the logarithm, default=2
Determines the unit of entropy:
base=2 gives bits, base=e gives nats, and base=10 gives bans.
- bins: {str, int, [int, int]}, default='fd'
If a str, it selects the method used to compute the bin-width; bins='fd' is considered an optimal bin-width for a real-valued signal/sequence. Check bin_width for more methods.
If an integer, a fixed number of bins is used for both x and y.
If a list of two integers ([Nx, Ny]), Nx and Ny are the number of bins for x and y, respectively. See the example after the Returns section.
- return_n_bins: bool, default=False
if True, the number of bins is also returned.
- ignoreZero: bool, default=False
If True, probabilities with zero value are omitted before the computation.
It doesn't make much of a difference.
- Returns:
- H_xy: scalar, H_kl(x,y)
Cross entropy Kullback–Leibler divergence
- (N, N): tuple
Number of bins for x and y, enforced to the maximum of both (only if return_n_bins=True)
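As a rough illustration of the bins options and return_n_bins (a hypothetical sketch on random signals; values will vary, and the return structure with return_n_bins=True is as described above):

>>> import numpy as np
>>> import spkit as sp
>>> x = np.random.randn(1000)
>>> y = np.random.randn(1000)
>>> H1 = sp.entropy_kld(x, y)                       # bins='fd' (default bin-width method)
>>> H2 = sp.entropy_kld(x, y, bins=50)              # fixed 50 bins for both x and y
>>> H3 = sp.entropy_kld(x, y, bins=[50, 40])        # 50 bins for x, 40 bins for y
>>> out = sp.entropy_kld(x, y, return_n_bins=True)  # also returns the number of bins (N, N)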
See also
entropy_joint
Joint Entropy
entropy_cond
Conditional Entropy
mutual_info
Mutual Information
entropy_kld
KL-divergence Entropy
entropy_cross
Cross Entropy
Examples
>>> #sp.entropy_kld
>>> import numpy as np
>>> import spkit as sp
>>> np.random.seed(1)
>>> X, fs, ch_names = sp.data.eeg_sample_14ch()
>>> X = X - X.mean(1)[:, None]
>>> x,y1 = X[:,0],X[:,5]
>>> y2 = sp.add_noise(y1,snr_db=0)
>>> H_x = sp.entropy(x)
>>> H_xy1= sp.entropy_kld(x,y1)
>>> H_xy2= sp.entropy_kld(x,y2)
>>> print('Cross Entropy - KL')
>>> print(f'- H_kl(x,y1) = {H_xy1}')
>>> print(f'- H_kl(x,y2) = {H_xy2}')
>>> print(f'- H(x) = {H_x}')
>>> np.random.seed(None)
Cross Entropy - KL
- H_kl(x,y1) = 0.37227231154384194
- H_kl(x,y2) = 1.8806537173845745
- H(x) = 4.648381759654535
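For discrete-valued sequences, is_discrete=True estimates H_{kl}(x,y) from the frequencies of unique values instead of histogram binning. A minimal sketch on random integer sequences (values will vary):

>>> import numpy as np
>>> import spkit as sp
>>> x = np.random.randint(0, 5, 1000)
>>> y = np.random.randint(0, 5, 1000)
>>> H_d = sp.entropy_kld(x, y, is_discrete=True)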
Examples using spkit.entropy_kld
Entropy - EEG Signal - Multi-Channel