spkit.entropy_joint
- spkit.entropy_joint(x, y, base=2, is_discrete=False, bins='fd', return_n_bins=False, ignoreZero=False, esp=1e-10)
Joint Entropy \(H(X,Y)\)
\[H(X,Y) = -\sum_{x,y} P(x,y)\,\log\big(P(x,y)\big)\]
The joint probability P(x,y) is estimated with numpy's histogram2d (see the sketch after the parameter list below).
\[\max\{H(X),H(Y)\} \le H(X,Y) \le H(X) + H(Y)\]
- Parameters:
- x, y : 1d-arrays
- is_discrete: bool, default=False.
If True, the frequencies of unique values are used to estimate H(x,y)
- base: base of the log, default=2
decides the unit of entropy:
base=2 gives bits, base=e gives nats, and base=10 gives bans
- bins: {str, int, [int, int]}, default='fd'
a str selects the method used to compute the bin-width; bins='fd' (Freedman-Diaconis) is considered an optimal bin-width for a real-valued signal/sequence.
check help(spkit.bin_width) for more methods
if bins is an integer, that fixed number of bins is used for both x and y.
if bins is a list of two integers ([Nx, Ny]), then Nx and Ny are the number of bins for x and y, respectively (see the usage sketch under Examples below).
- return_n_bins: bool, default=False
if True, the number of bins is also returned.
- ignoreZero: bool, default=False
if True, zero-valued probabilities are omitted before computation.
It doesn't make much of a difference.
- Returns:
- Hxy : Joint Entropy H(x,y)
- (Nx, Ny) : tuple,
number of bins for x and y, respectively (only if return_n_bins=True)
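For intuition, the formula above can be reproduced directly with numpy's histogram2d. The following is a minimal sketch, not the library implementation: the helper name joint_entropy_hist is hypothetical, it assumes real-valued inputs with a fixed number of bins, and it omits the 'fd' bin-width rule and the esp handling from the signature above.

import numpy as np

def joint_entropy_hist(x, y, bins=64, base=2):
    # Estimate the joint probability P(x,y) from a 2D histogram
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p = counts / counts.sum()      # normalise counts to a joint probability mass
    p = p[p > 0]                   # drop zero-probability cells before taking the log
    return -np.sum(p * np.log(p)) / np.log(base)

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = 0.5 * x + 0.5 * rng.standard_normal(1000)
print(joint_entropy_hist(x, y))    # a single non-negative value, in bits for base=2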
See also
entropy_cond
Conditional Entropy
mutual_info
Mutual Information
entropy_kld
KL-divergence Entropy
entropy_cross
Cross Entropy
References
wikipedia
Examples
>>> #sp.entropy_joint
>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> import spkit as sp
>>> X, fs, ch_names = sp.data.eeg_sample_14ch()
>>> x, y1 = X[:,0], X[:,5]
>>> H_xy1 = sp.entropy_joint(x, y1)
>>> print('Joint Entropy')
>>> print(f'- H(x,y1) = {H_xy1}')
Joint Entropy
- H(x,y1) = 8.52651374518646
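A further usage sketch, assuming the bins, return_n_bins, and base parameters behave as documented above; outputs are omitted since the values depend on the data.

>>> import numpy as np
>>> import spkit as sp
>>> X, fs, ch_names = sp.data.eeg_sample_14ch()
>>> x, y1 = X[:,0], X[:,5]
>>> # same fixed number of bins for x and y, also returning the bin counts
>>> H_xy, (Nx, Ny) = sp.entropy_joint(x, y1, bins=64, return_n_bins=True)
>>> # different number of bins for x and y
>>> H_xy2 = sp.entropy_joint(x, y1, bins=[64, 32])
>>> # natural-log units (nats) instead of bits
>>> H_xy_nats = sp.entropy_joint(x, y1, base=np.e)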
Examples using spkit.entropy_joint
Entropy - EEG Signal - Multi-Channel