spkit.mutual_info_diff_self

spkit.mutual_info_diff_self(X, present_first=True)

Self Mutual Information \(I_{\partial}(X_{i+1}; X_i)\)

Self Mutual Information

Predictability of \(X_{i+1}\) given \(X_i\)

\[I_{\partial}(X_{i+1}; X_i) = H(X_{i+1}) - H(X_{i+1} | X_i)\]
Parameters:
X: 2d-array,
  • multi-dimensional signal space, where each column (axis=1) is a delayed copy of the signal

present_first: bool, default=True
  • if True, X[:,0] is the present and X[:,1:] is the past, in increasing order of delay

  • if False, X[:,-1] is the present and X[:,:-1] is the past

Returns:
I_x1x: scalar
  • Self-Mutual Information

See also

entropy_diff_joint

Joint-Entropy


References

  • Wikipedia: Mutual information

Examples

#sp.mutual_info_diff_self
import numpy as np
import spkit as sp
X, fs, ch_names = sp.data.eeg_sample_14ch()
X = X - X.mean(1)[:, None]
# Example 1
X1 = sp.signal_delayed_space(X[:,0].copy(),emb_dim=5,delay=2)
Y1 = sp.add_noise(X1,snr_db=0)
I_x1x = sp.mutual_info_diff_self(X1)
I_y1y = sp.mutual_info_diff_self(Y1)
print('Self-Mutual Information')
print(f'- I(X(i+1); X(i)) = {I_x1x}')
print(f'- I(Y(i+1); Y(i)) = {I_y1y}')
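The identity \(I_{\partial}(X_{i+1}; X_i) = H(X_{i+1}) - H(X_{i+1} | X_i)\) can also be checked numerically without spkit. The sketch below (an independent sanity check, not the spkit implementation) uses a Gaussian AR(1) process, for which the mutual information between consecutive samples has the closed form \(-\tfrac{1}{2}\log(1 - a^2)\) for coefficient \(a\):

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) process: x[t] = 0.9*x[t-1] + noise, so consecutive samples share information
n = 100_000
a = 0.9
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()

# Pair present and past samples: columns (X(i+1), X(i)), i.e. present_first layout
pairs = np.column_stack([x[1:], x[:-1]])

# Gaussian closed form: I = 0.5 * log( var(a)*var(b) / det(cov) )
cov = np.cov(pairs, rowvar=False)
I_est = 0.5 * np.log(cov[0, 0] * cov[1, 1] / np.linalg.det(cov))

# Theoretical value for an AR(1) process with coefficient a
I_true = -0.5 * np.log(1 - a**2)
print(f'estimated I = {I_est:.3f}, theoretical I = {I_true:.3f}')
```

With a strong AR coefficient the self-mutual information is large; adding noise (as in the example above with `sp.add_noise`) dilutes the dependence between consecutive samples and lowers it.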