spkit.entropy_cross

spkit.entropy_cross(x, y, base=2, is_discrete=False, bins='fd', verbose=False, pre_version=False, return_n_bins=False, esp=1e-10)

Cross Entropy \(H_{xy}(X,Y)\)

\[H_{xy}(X,Y) = -\sum P_x \log(P_y)\]
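
Conceptually, both signals are binned to form the empirical distributions \(P_x\) and \(P_y\), and the sum above is evaluated. A minimal NumPy sketch of the idea (not spkit's exact implementation; the shared bin edges and the esp floor are assumptions about how it is computed):

import numpy as np

def cross_entropy_sketch(x, y, bins=64, base=2, esp=1e-10):
    # Bin both signals on a common support so P_x and P_y align bin-by-bin
    edges = np.histogram_bin_edges(np.r_[x, y], bins=bins)
    px, _ = np.histogram(x, bins=edges)
    py, _ = np.histogram(y, bins=edges)
    # Normalize counts to probabilities
    px = px / px.sum()
    py = py / py.sum()
    # Floor P_y to avoid log(0); this mirrors the role of the esp argument
    py = np.clip(py, esp, None)
    return -np.sum(px * np.log(py)) / np.log(base)
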
Parameters:
x, y : 1d-arrays
is_discrete : bool, default=False
  • If True, the frequencies of unique values are used to estimate \(H_{xy}(X,Y)\)

bins : {str, int, [int, int]}, default='fd'
  • If str, it selects the method of computing bin-width; bins='fd' (Freedman-Diaconis) is considered an optimal bin-width for a real-valued signal/sequence.

  • See bin_width for more methods.

  • If bins is an integer, the same fixed number of bins is used for both x and y.

  • If bins is a list of two integers ([Nx, Ny]), then Nx and Ny are the number of bins for x and y, respectively. (A sketch exercising these forms follows the parameter list.)

return_n_bins : bool, default=False
  • If True, the number of bins is also returned.

ignoreZero : bool, default=False
  • If True, probabilities with zero value are omitted before computation.

  • It doesn't make much of a difference.
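
A short doctest-style sketch exercising the documented bins forms (the random test signals are for illustration only):

>>> import numpy as np
>>> import spkit as sp
>>> np.random.seed(1)
>>> x = np.random.randn(1000)
>>> y = np.random.randn(1000)
>>> H1 = sp.entropy_cross(x, y, bins='fd')       # automatic bin-width (default)
>>> H2 = sp.entropy_cross(x, y, bins=50)         # same fixed number of bins for x and y
>>> H3 = sp.entropy_cross(x, y, bins=[50, 40])   # Nx=50 bins for x, Ny=40 for y
>>> # documented return with return_n_bins=True: H_xy and the bin counts,
>>> # both enforced to the maximum of the two
>>> H_xy, (Nx, Ny) = sp.entropy_cross(x, y, bins='fd', return_n_bins=True)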

Returns:
H_xy : float
  Cross Entropy \(H_{xy}(X,Y)\)
(N, N) : tuple of int
  Number of bins for x and y, enforced to the maximum of both (only if return_n_bins=True).

See also

entropy_joint

Joint Entropy

entropy_cond

Conditional Entropy

mutual_info

Mutual Information

entropy_kld

KL-divergence Entropy

Examples

>>> #sp.entropy_cross
>>> import numpy as np
>>> import spkit as sp
>>> np.random.seed(1)
>>> X, fs, ch_names = sp.data.eeg_sample_14ch()
>>> X = X - X.mean(1)[:, None]
>>> x, y1 = X[:, 0], X[:, 5]
>>> y2 = sp.add_noise(y1, snr_db=0)
>>> H_x = sp.entropy(x)
>>> H_xy1 = sp.entropy_cross(x, y1)
>>> H_xy2 = sp.entropy_cross(x, y2)
>>> print('Cross Entropy')
>>> print(f'- H_(x,y1) = {H_xy1}')
>>> print(f'- H_(x,y2) = {H_xy2}')
>>> print(f'- H(x) = {H_x}')
>>> np.random.seed(None)
Cross Entropy
- H_(x,y1) = 5.020654071198377
- H_(x,y2) = 6.529035477039111
- H(x) = 4.648381759654535
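
For discrete-valued sequences, is_discrete=True skips binning and uses the frequencies of unique values directly. A minimal sketch of that usage (the integer alphabet here is an arbitrary assumption):

>>> np.random.seed(1)
>>> xd = np.random.randint(0, 5, 1000)
>>> yd = np.random.randint(0, 5, 1000)
>>> H_d = sp.entropy_cross(xd, yd, is_discrete=True)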

Examples using spkit.entropy_cross

Entropy - EEG Signal - Multi-Channel