hoi.metrics.TransferEntropy#

class hoi.metrics.TransferEntropy(x, multiplets=None, verbose=None)[source]#

Pairwise transfer entropy in a dynamical system.

For each pair of variables, compute the transfer entropy:

\[T_{X_i \rightarrow X_j} = I(X_i(t-\tau); X_j(t) | X_j(t-\tau))\]
Parameters:
x : array_like

Standard NumPy arrays of shape (n_samples, n_features) or (n_samples, n_features, n_variables)

multiplets : list | None

List of multiplets to compute. Should be a list of pairs, for example [(0, 1), (2, 7)]. By default, all pairs are computed.
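As a minimal usage sketch (assuming only the constructor signature above; the data are random and purely illustrative), the model can be built from a NumPy array of time series:

    import numpy as np
    from hoi.metrics import TransferEntropy

    # simulated data: 600 time samples and 5 features
    x = np.random.rand(600, 5)

    # multiplets=None (default) means all pairs of features are considered
    model = TransferEntropy(x, multiplets=None)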

Attributes:
entropies

Entropies of shape (n_mult,)

multiplets

Indices of the multiplets of shape (n_mult, maxsize).

order

Order of each multiplet of shape (n_mult,).

undersampling

Under-sampling threshold.

Methods

compute_entropies([method, minsize, ...])

Compute entropies for all multiplets.

fit([minsize, tau, direction_axis, maxsize, ...])

Compute pairwise transfer entropy.

get_combinations(minsize[, maxsize, astype])

Get combinations of features.

__iter__()#

Iteration over orders.

compute_entropies(method='gc', minsize=1, maxsize=None, samples=None, **kwargs)#

Compute entropies for all multiplets.

Parameters:
method : {‘gc’, ‘binning’, ‘knn’, ‘kernel’}

Name of the method to compute entropy. See fit() for the list of available estimators.

samples : np.ndarray

List of samples to use to compute HOI. If None, all samples are used.

minsize : int, optional

Minimum size of the multiplets. Default is 1.

maxsize : int, optional

Maximum size of the multiplets. Default is None.

kwargs : dict, optional

Additional arguments to pass to the entropy function.

Returns:
h_x : array_like

Entropies of shape (n_mult, n_variables)

h_idx : array_like

Indices of the multiplets of shape (n_mult, maxsize)

order : array_like

Order of each multiplet of shape (n_mult,)
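A short sketch of how compute_entropies might be called, assuming the signature above; the returned shapes follow the Returns section:

    import numpy as np
    from hoi.metrics import TransferEntropy

    model = TransferEntropy(np.random.rand(600, 5))

    # entropies of all multiplets of size 1 and 2, with the Gaussian copula estimator
    h_x, h_idx, order = model.compute_entropies(method='gc', minsize=1, maxsize=2)
    # h_x: (n_mult, n_variables), h_idx: (n_mult, maxsize), order: (n_mult,)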

property entropies#

Entropies of shape (n_mult,)

fit(minsize=2, tau=1, direction_axis=0, maxsize=2, method='gc', samples=None, matrix=False, **kwargs)[source]#

Compute pairwise transfer entropy.

Parameters:
minsize, maxsize : int | 2, 2

Minimum and maximum size of the multiplets. Must be 2.

method : {‘gc’, ‘binning’, ‘knn’, ‘kernel’, callable}

Name of the method to compute entropy. Use either:

  • ‘gc’: Gaussian copula entropy [default]. See hoi.core.entropy_gc()

  • ‘gauss’: Gaussian entropy. See hoi.core.entropy_gauss()

  • ‘binning’: binning-based estimator of entropy. Note that to use this estimator, the data have to be discretized. See hoi.core.entropy_bin()

  • ‘knn’: k-nearest neighbor estimator. See hoi.core.entropy_knn()

  • ‘kernel’: kernel-based estimator of entropy. See hoi.core.entropy_kernel()

  • A custom entropy estimator can be provided. It should be a callable function written with JAX, taking three 2D inputs of shape (n_features, n_samples) and returning a float.

samples : np.ndarray

List of samples to use to compute HOI. If None, all samples are used.

tau : int | 1

The length of the delay to use to compute the transfer entropy. Default is 1.

direction_axis : {0, 2}

The axis along which to consider the evolution: 0 for the samples axis, 2 for the variables axis. Default is 0.

matrix : bool | False

If True, return a (n_features, n_features, n_variables) matrix.

kwargs : dict | {}

Additional arguments passed to each conditional mutual information (CMI) function.

Returns:
hoi : array_like

The NumPy array containing pairwise transfer entropy of shape (n_pairs, n_variables), or the full matrix if matrix=True.
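A sketch of the typical call, assuming the fit() signature documented above (data and shapes are illustrative):

    import numpy as np
    from hoi.metrics import TransferEntropy

    x = np.random.rand(600, 5)          # (n_samples, n_features)
    model = TransferEntropy(x)

    # pairwise transfer entropy with a delay of one sample, along the samples axis
    te = model.fit(tau=1, direction_axis=0, method='gc')

    # same estimate, reshaped into a (n_features, n_features, n_variables) matrix
    te_mat = model.fit(tau=1, method='gc', matrix=True)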

get_combinations(minsize, maxsize=None, astype='jax')#

Get combinations of features.

Parameters:
minsize : int

Minimum size of the multiplets

maxsize : int | None

Maximum size of the multiplets. If None, minsize is used.

astype : {‘jax’, ‘numpy’, ‘iterator’}

Specify the output type. Use either ‘jax’ to get the data as a JAX array [default], ‘numpy’ for a NumPy array, or ‘iterator’ for an iterator.

Returns:
combinations : array_like

Combinations of features.
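For instance (a sketch assuming the signature above; the exact output shape is not guaranteed here), the pairs used for the pairwise transfer entropy can be listed as:

    import numpy as np
    from hoi.metrics import TransferEntropy

    model = TransferEntropy(np.random.rand(600, 5))

    # all pairs of features, returned as a NumPy array (one row per pair,
    # e.g. (0, 1), (0, 2), ..., (3, 4) for 5 features)
    pairs = model.get_combinations(2, astype='numpy')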

property multiplets#

Indices of the multiplets of shape (n_mult, maxsize).

By convention, -1 is used to indicate that a feature has been ignored.

property order#

Order of each multiplet of shape (n_mult,).

property undersampling#

Under-sampling threshold.