spkit.entropy_cond
- spkit.entropy_cond(x, y, base=2, is_discrete=False, bins='fd', return_n_bins=False, verbose=False, ignoreZero=False)
Conditional Entropy \(H(X|Y)\)
\[H(X|Y) = H(X,Y) - H(Y)\]
\[0 \le H(X|Y) \le H(X)\]
- Parameters:
- x, y: 1d-arrays
- is_discrete: bool, default=False.
If True, frequencies of unique values are used to estimate H(X|Y)
- base: base of log, default=2
decides the unit of entropy:
if base=2 the unit of entropy is bits, base=e gives nats, base=10 gives bans
- bins: {str, int, [int, int]}, default='fd'
if str, it selects the method to compute the bin-width; bins='fd' (Freedman-Diaconis rule) is considered an optimal bin-width for a real-valued signal/sequence.
check help(spkit.bin_width) for more methods
if bins is an integer, that fixed number of bins is used for both x and y.
if bins is a list of 2 integers ([Nx, Ny]), then Nx and Ny are the numbers of bins for x and y, respectively.
The different forms are illustrated in the short sketch after the Returns list below.
- return_n_bins: bool, default=False
if True, the numbers of bins used for x and y are also returned.
- ignoreZero: bool, default=False
if True, probabilities with zero value are omitted before computation.
It does not make much difference.
- Returns:
- Hx1y: scalar
Conditional Entropy H(x|y)
- (Nx, Ny): tuple
number of bins for x and y, respectively (only if return_n_bins=True)
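The different forms of the bins argument can be used as below. This is a minimal sketch, not part of the original documentation, and assumes only the signature shown above; the signals and bin counts are arbitrary.

>>> import numpy as np
>>> import spkit as sp
>>> np.random.seed(0)
>>> x = np.random.randn(1000)              # real-valued signal
>>> y = x + 0.5*np.random.randn(1000)      # correlated signal
>>> H1 = sp.entropy_cond(x, y)             # bins='fd', optimal bin-width
>>> H2 = sp.entropy_cond(x, y, bins=64)    # 64 bins for both x and y
>>> H3, (Nx, Ny) = sp.entropy_cond(x, y, bins=[64, 32], return_n_bins=True)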
See also
entropy_joint
Joint Entropy
mutual_info
Mutual Information
entropy_kld
KL-divergence Entropy
entropy_cross
Cross Entropy
Examples
>>> #sp.entropy_cond
>>> import numpy as np
>>> import spkit as sp
>>> X, fs, ch_names = sp.data.eeg_sample_14ch()
>>> X = X - X.mean(1)[:, None]
>>> x, y1 = X[:,0], X[:,5]
>>> y2 = sp.add_noise(y1, snr_db=0)
>>> H_x = sp.entropy(x)
>>> H_x1y1 = sp.entropy_cond(x, y1)
>>> H_x1y2 = sp.entropy_cond(x, y2)
>>> print('Conditional Entropy')
>>> print(f'- H(x|y1) = {H_x1y1}')
>>> print(f'- H(x|y2) = {H_x1y2}')
>>> print(f'- H(x) = {H_x}')
Conditional Entropy
- H(x|y1) = 4.096371831484375
- H(x|y2) = 4.260323284620403
- H(x) = 4.648381759654535
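The identity H(X|Y) = H(X,Y) - H(Y) can be checked numerically with entropy_joint and entropy (see See also). This is a rough sketch, not from the original documentation; it assumes all three estimators are called with their default binning, so small discrepancies from binning differences are expected.

>>> import numpy as np
>>> import spkit as sp
>>> np.random.seed(0)
>>> x = np.random.randn(2000)
>>> y = x + np.random.randn(2000)
>>> H_xy = sp.entropy_joint(x, y)       # H(X,Y)
>>> H_y = sp.entropy(y)                 # H(Y)
>>> H_x1y = sp.entropy_cond(x, y)       # H(X|Y)
>>> print(H_x1y, H_xy - H_y)            # the two values should be close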
Examples using spkit.entropy_cond
Entropy - EEG Signal - Multi-Channel