spkit.show_farmulas

spkit.show_farmulas()

Useful Formulas





Differential Entropy of a Multivariate Normally Distributed X

For \(x \sim \mathcal{N}(\mu, \Sigma)\), the differential entropy in nats is

\[H(x) = \frac{1}{2}\ln|\Sigma| + \frac{n}{2} + \frac{n}{2}\ln(2\pi)\]

The terms \(\frac{n}{2} + \frac{n}{2}\ln(2\pi)\) are constant for a fixed dimension \(n\).

code:

H_x = entropy_differential(x, is_multidim=True, emb_dim=1, delay=1)
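As a sanity check, the closed-form expression above can be evaluated directly with NumPy on simulated data. This is a minimal sketch independent of spkit; `gaussian_diff_entropy` is an illustrative name, not a library function.

```python
import numpy as np

def gaussian_diff_entropy(X):
    # H(x) = (1/2) ln|Sigma| + n/2 + (n/2) ln(2*pi), in nats,
    # with Sigma estimated as the sample covariance of the rows of X.
    X = np.atleast_2d(X)
    n = X.shape[1]                                   # dimension
    Sigma = np.atleast_2d(np.cov(X, rowvar=False))   # n x n sample covariance
    _, logdet = np.linalg.slogdet(Sigma)             # numerically stable log-determinant
    return 0.5 * logdet + 0.5 * n + 0.5 * n * np.log(2 * np.pi)

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 3))       # x ~ N(0, I_3)
# For Sigma = I_3, |Sigma| = 1, so H = (3/2)(1 + ln(2*pi)) ≈ 4.2568 nats
print(gaussian_diff_entropy(X))
```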

Self-Conditional Entropy

Information of \(X_{i+1}\) given \(X_i\)

\(H(X_{i+1}|X_i) = H(X_{i+1}, X_i) - H(X_i)\)

using: \(H(X|Y) = H(X, Y) - H(Y)\)

code:

H_x1x = entropy_diff_cond_self(X, present_first=True)
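Under a Gaussian assumption, the chain-rule identity above can be sketched in plain NumPy by pairing each sample with its predecessor. `gaussian_H` and `self_cond_entropy` are illustrative names, not spkit functions.

```python
import numpy as np

def gaussian_H(*cols):
    # Joint differential entropy (nats) of a Gaussian fit to the stacked columns.
    Z = np.column_stack(cols)
    n = Z.shape[1]
    Sigma = np.atleast_2d(np.cov(Z, rowvar=False))
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * logdet + 0.5 * n * (1 + np.log(2 * np.pi))

def self_cond_entropy(x):
    # H(X_{i+1} | X_i) = H(X_{i+1}, X_i) - H(X_i)
    return gaussian_H(x[1:], x[:-1]) - gaussian_H(x[:-1])

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)   # white noise: the past carries no information,
# so H(X_{i+1}|X_i) ≈ H(X_{i+1}) = (1/2) ln(2*pi*e) ≈ 1.4189 nats
print(self_cond_entropy(x))
```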

Conditional Entropy

Information of \(X_{i+1}\) given \(X_i\) and \(Y_i\)

\(H(X_{i+1}|X_i,Y_i) = H(X_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

code:

H_x1xy = entropy_diff_cond(X,Y,present_first=True)

Joint Entropy

\(H(X,Y)\)

code:

H_xy = entropy_diff_joint(X,Y)

Joint-Conditional Entropy

\(H(X_{i+1},Y_{i+1}|X_i,Y_i) = H(X_{i+1},Y_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

code:

H_xy1xy = entropy_diff_joint_cond(X,Y,present_first=True)

Self Mutual Information

Predictability of \(X_{i+1}\) given \(X_i\)

\(I(X_{i+1}; X_i) = H(X_{i+1}) - H(X_{i+1} | X_i)\)

code:

I_xx = mutual_info_diff_self(X,present_first=True)
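The identity above also equals \(H(X_{i+1}) + H(X_i) - H(X_{i+1}, X_i)\), which is easy to check numerically on an AR(1) process, where the lag-1 correlation is the AR coefficient \(a\) and the bivariate-Gaussian mutual information is \(-\frac{1}{2}\ln(1-a^2)\). This is an illustrative NumPy sketch; `gaussian_H` and `self_mutual_info` are not spkit functions.

```python
import numpy as np

def gaussian_H(*cols):
    # Joint differential entropy (nats) of a Gaussian fit to the stacked columns.
    Z = np.column_stack(cols)
    n = Z.shape[1]
    Sigma = np.atleast_2d(np.cov(Z, rowvar=False))
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * logdet + 0.5 * n * (1 + np.log(2 * np.pi))

def self_mutual_info(x):
    # I(X_{i+1}; X_i) = H(X_{i+1}) + H(X_i) - H(X_{i+1}, X_i)
    return gaussian_H(x[1:]) + gaussian_H(x[:-1]) - gaussian_H(x[1:], x[:-1])

# AR(1): x_{i+1} = a*x_i + noise, scaled so the process has unit variance.
rng = np.random.default_rng(3)
a, n = 0.9, 20000
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = a * x[i] + np.sqrt(1 - a**2) * rng.standard_normal()
print(self_mutual_info(x))   # theory: -0.5*ln(1 - 0.81) ≈ 0.830 nats
```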

Mutual Information

Predictability of \(X_{i+1}\) given \(X_i\) and \(Y_i\)

\(I(X_{i+1}; X_i, Y_i) = H(X_{i+1}) - H(X_{i+1} | X_i, Y_i)\)

\(H(X_{i+1}|X_i,Y_i) = H(X_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

code:

I_xy = mutual_info_diff(X,Y,present_first=True)

Transfer Entropy

\(TE_{X \to Y} = I(Y_{i+1}; X_i | Y_i)\)

\(TE_{X \to Y} = H(Y_{i+1} | Y_i) - H(Y_{i+1} | X_i, Y_i)\) [Eq1]

\(TE_{X \to Y} = H(Y_{i+1}, Y_i) - H(Y_i) - H(Y_{i+1},X_i,Y_i) + H(X_i,Y_i)\)

\(TE_{X \to Y} = H(X_i,Y_i) + H(Y_{i+1}, Y_i) - H(Y_{i+1},X_i,Y_i) - H(Y_i)\) [Eq2]

Using:

\(H(X_{i+1}|X_i) = H(X_{i+1}, X_i) - H(X_i)\)  |  entropy_diff_cond_self(X)

\(H(X_{i+1}|X_i,Y_i) = H(X_{i+1},X_i,Y_i) - H(X_i,Y_i)\)  |  entropy_diff_cond(X,Y)

code:

TE_x2y = transfer_entropy(X,Y,present_first=True)
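Eq2 expresses transfer entropy purely in terms of joint entropies, so under a Gaussian assumption it can be sketched directly in NumPy. `gaussian_H` and `te_x_to_y` are illustrative names, not spkit functions; the simulated system drives Y from the past of X, so \(TE_{X \to Y}\) should be large and \(TE_{Y \to X}\) near zero.

```python
import numpy as np

def gaussian_H(*cols):
    # Joint differential entropy (nats) of a Gaussian fit to the stacked columns.
    Z = np.column_stack(cols)
    n = Z.shape[1]
    Sigma = np.atleast_2d(np.cov(Z, rowvar=False))
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * logdet + 0.5 * n * (1 + np.log(2 * np.pi))

def te_x_to_y(x, y):
    # Eq2: TE_{X->Y} = H(X_i,Y_i) + H(Y_{i+1},Y_i) - H(Y_{i+1},X_i,Y_i) - H(Y_i)
    x_i, y_i, y_next = x[:-1], y[:-1], y[1:]
    return (gaussian_H(x_i, y_i) + gaussian_H(y_next, y_i)
            - gaussian_H(y_next, x_i, y_i) - gaussian_H(y_i))

rng = np.random.default_rng(2)
n = 20000
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = 0.0
y[1:] = 0.8 * x[:-1] + 0.2 * rng.standard_normal(n - 1)   # y driven by past x
print(te_x_to_y(x, y))   # large: X's past predicts Y
print(te_x_to_y(y, x))   # near zero: Y's past does not predict X
```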

Partial Transfer Entropy or Conditional Transfer Entropy

\(TE_{X \to Y | Z} = I(Y_{i+1}; X_i | Y_i, Z_i)\)

\(TE_{X \to Y | Z} = H(X_i,Y_i, Z_i) + H(Y_{i+1}, Y_i, Z_i) - H(Y_{i+1},X_i,Y_i, Z_i) - H(Y_i, Z_i)\)

code:

TE_x2y1z = partial_transfer_entropy(X,Y,Z,present_first=True,verbose=False)

Granger Causality based on Differential Entropy

  1. GC_XY (X->Y): \(H(Y_{i+1}|Y_i) - H(Y_{i+1}|X_i,Y_i)\)

  2. GC_YX (Y->X): \(H(X_{i+1}|X_i) - H(X_{i+1}|X_i,Y_i)\)

  3. GC_XdY (X.Y): \(H(Y_{i+1}|X_i,Y_i) + H(X_{i+1}|X_i,Y_i) - H(X_{i+1},Y_{i+1}|X_i,Y_i)\)

If normalize=True:

\(GC_{XY} = GC_{XY}/(I(Y_{i+1}; Y_i) + GC_{XY})\)

\(GC_{YX} = GC_{YX}/(I(X_{i+1}; X_i) + GC_{YX})\)

Using:

\(H(Y_{i+1}|Y_i) = H(Y_{i+1}, Y_i) - H(Y_i)\)

\(H(X_{i+1}|X_i) = H(X_{i+1}, X_i) - H(X_i)\)

\(H(Y_{i+1}|X_i,Y_i) = H(Y_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

\(H(X_{i+1}|X_i,Y_i) = H(X_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

\(H(X_{i+1},Y_{i+1}|X_i,Y_i) = H(X_{i+1},Y_{i+1},X_i,Y_i) - H(X_i,Y_i)\)

\(I(X_{i+1}; X_i) = H(X_{i+1}) - H(X_{i+1} | X_i)\)

\(I(Y_{i+1}; Y_i) = H(Y_{i+1}) - H(Y_{i+1} | Y_i)\)

code:

gc_xy, gc_yx, gc_xdy = entropy_granger_causality(X, Y, present_first=True, normalize=False)