spkit.mutual_info_diff
- spkit.mutual_info_diff(X, Y, present_first=True)
Mutual Information \(I_{\partial}(X_{i+1}; X_i, Y_i)\)

Predictability of \(X_{i+1}\) given \(X_i\) and \(Y_i\):
\[I_{\partial}(X_{i+1}; X_i, Y_i) = H_{\partial}(X_{i+1}) - H_{\partial}(X_{i+1} | X_i, Y_i)\]

\[H_{\partial}(X_{i+1} | X_i, Y_i) = H_{\partial}(X_{i+1}, X_i, Y_i) - H_{\partial}(X_i, Y_i)\]
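Combining the two identities, the quantity reduces to three joint differential entropies: \(I_{\partial} = H_{\partial}(X_{i+1}) + H_{\partial}(X_i, Y_i) - H_{\partial}(X_{i+1}, X_i, Y_i)\). A minimal numpy sketch of that combination, assuming a Gaussian (covariance-based) entropy estimator and the present_first=True column layout; spkit's internal estimator and its exact treatment of Y's first column may differ:

import numpy as np

def h_gauss(Z):
    # Differential entropy of Z (n_samples, d) under a Gaussian assumption:
    # H = 0.5 * log((2*pi*e)**d * det(Cov(Z)))
    Z = np.atleast_2d(Z)
    d = Z.shape[1]
    cov = np.cov(Z, rowvar=False).reshape(d, d)
    return 0.5 * np.log(((2 * np.pi * np.e) ** d) * np.linalg.det(cov))

def mi_diff_sketch(X, Y):
    # present_first=True: column 0 is the present sample, the rest are the past
    x_next = X[:, [0]]                       # X_{i+1}
    past = np.hstack([X[:, 1:], Y[:, 1:]])   # (X_i, Y_i); assumed layout
    # I(X_{i+1}; X_i, Y_i) = H(X_{i+1}) + H(X_i, Y_i) - H(X_{i+1}, X_i, Y_i)
    return h_gauss(x_next) + h_gauss(past) - h_gauss(np.hstack([x_next, past]))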
- Parameters:
- X: 2d-array,
multi-dimensional signal space, where each column (axis=1) is a delayed copy of the signal
- Y: 2d-array,
multi-dimensional signal space, where each column (axis=1) is a delayed copy of the signal
- present_first: bool, default=True
if True, X[:,0] is the present and X[:,1:] is the past, ordered from most recent to oldest (see the layout sketch after the Returns block)
if False, X[:,-1] is the present and X[:,:-1] is the past
- Returns:
- I_x1y: scalar
Mutual Information \(I_{\partial}(X_{i+1}; X_i, Y_i)\)
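A hypothetical numpy illustration of the present_first=True layout; in practice spkit.signal_delayed_space (used in the Examples below) builds such an embedding for you, though its boundary handling may differ:

import numpy as np

x = np.arange(10.0)  # x[i] = i, so column contents are easy to read off
emb_dim, delay = 3, 2
n = len(x) - (emb_dim - 1) * delay
# column k holds x shifted k*delay steps into the past; present in column 0
X = np.column_stack([x[(emb_dim - 1 - k) * delay : (emb_dim - 1 - k) * delay + n]
                     for k in range(emb_dim)])
print(X[:3])
# [[4. 2. 0.]
#  [5. 3. 1.]
#  [6. 4. 2.]]   X[:,0] is the present; X[:,1:] gets progressively older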
See also
mutual_info_diff_self
Self-Mutual Information
entropy_diff_joint
Joint-Entropy
Examples
#sp.mutual_info_diff
import numpy as np
import spkit as sp

X, fs, ch_names = sp.data.eeg_sample_14ch()
X = X - X.mean(1)[:, None]   # remove channel-wise mean

# Example 1
X1 = sp.signal_delayed_space(X[:, 0].copy(), emb_dim=5, delay=2)
Y1 = sp.signal_delayed_space(X[:, 2].copy(), emb_dim=5, delay=2)
Y2 = sp.add_noise(Y1, snr_db=0)

I_xy1 = sp.mutual_info_diff(X1, Y1)
I_xy2 = sp.mutual_info_diff(X1, Y2)

print('Mutual-Information')
print(f'- I(X1,Y1) = {I_xy1}')
print(f'- I(X1,Y2) = {I_xy2}')
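Since Y2 is Y1 corrupted with additive noise at 0 dB SNR, the second estimate should come out noticeably lower: adding independent noise can only destroy information shared with X1, so I(X1,Y2) < I(X1,Y1) is the expected ordering.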