hoi.metrics.RedundancyphiID#

class hoi.metrics.RedundancyphiID(x, multiplets=None, verbose=None)[source]#

Redundancy (phiID).

Estimated using the Minimum Mutual Information (MMI) as follows:

\[Red(X,Y) = \min \{ I(X_{t-\tau}; X_t), I(X_{t-\tau}; Y_t), I(Y_{t-\tau}; X_t), I(Y_{t-\tau}; Y_t) \}\]
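The MMI formula above can be illustrated with a minimal NumPy sketch. The helper names (`gaussian_mi`, `redundancy_mmi`) are hypothetical and use a simple Gaussian-assumption MI estimator, not hoi's own estimators:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mi(a, b):
    """Hypothetical helper: MI between two 1D series under a Gaussian
    assumption, I(A;B) = -0.5 * log(1 - rho^2) with rho the correlation."""
    rho = np.corrcoef(a, b)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

def redundancy_mmi(x, y, tau=1):
    """Red(X,Y): minimum over the four lagged mutual informations."""
    past_x, past_y = x[:-tau], y[:-tau]
    pres_x, pres_y = x[tau:], y[tau:]
    return min(
        gaussian_mi(past_x, pres_x),  # I(X_{t-tau}; X_t)
        gaussian_mi(past_x, pres_y),  # I(X_{t-tau}; Y_t)
        gaussian_mi(past_y, pres_x),  # I(Y_{t-tau}; X_t)
        gaussian_mi(past_y, pres_y),  # I(Y_{t-tau}; Y_t)
    )

# two coupled AR(1) processes: X drives Y with one sample of delay
n = 10_000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t - 1] + rng.standard_normal()

red = redundancy_mmi(x, y, tau=1)
```

By construction, the redundancy can never exceed any of the four individual lagged mutual informations, which is the defining property of the MMI bound.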
Parameters:
x : array_like

Standard NumPy array of shape (n_samples, n_features) or (n_samples, n_features, n_variables)

multiplets : list | None

List of multiplets to compute. Should be a list of multiplets, for example [(0, 1), (2, 7)]. By default, all multiplets are computed.

Attributes:
entropies

Entropies of shape (n_mult,)

multiplets

Indices of the multiplets of shape (n_mult, maxsize).

order

Order of each multiplet of shape (n_mult,).

undersampling

Under-sampling threshold.

Methods

compute_entropies([method, minsize, ...])

Compute entropies for all multiplets.

fit([minsize, tau, direction_axis, maxsize, ...])

Redundancy (phiID).

get_combinations(minsize[, maxsize, astype])

Get combinations of features.

References

Pedro AM Mediano et al, 2021 [19]

__iter__()#

Iteration over orders.

compute_entropies(method='gc', minsize=1, maxsize=None, samples=None, **kwargs)#

Compute entropies for all multiplets.

Parameters:
method : {‘gc’, ‘binning’, ‘knn’, ‘kernel’}

Name of the method to compute entropy. See fit() for a description of each estimator.

samples : np.ndarray

List of samples to use to compute HOI. If None, all samples are used.

minsize : int, optional

Minimum size of the multiplets. Default is 1.

maxsize : int, optional

Maximum size of the multiplets. Default is None.

kwargs : dict, optional

Additional arguments to pass to the entropy function.

Returns:
h_x : array_like

Entropies of shape (n_mult, n_variables)

h_idx : array_like

Indices of the multiplets of shape (n_mult, maxsize)

order : array_like

Order of each multiplet of shape (n_mult,)

property entropies#

Entropies of shape (n_mult,)

fit(minsize=2, tau=1, direction_axis=0, maxsize=None, method='gc', samples=None, **kwargs)[source]#

Redundancy (phiID).

Parameters:
minsize, maxsize : int | 2, None

Minimum and maximum size of the multiplets

method : {‘gc’, ‘binning’, ‘knn’, ‘kernel’, callable}

Name of the method to compute entropy. Use either:

  • ‘gc’: gaussian copula entropy [default]. See hoi.core.entropy_gc()

  • ‘gauss’: gaussian entropy. See hoi.core.entropy_gauss()

  • ‘binning’: binning-based estimator of entropy. Note that to use this estimator, the data have to be discretized. See hoi.core.entropy_bin()

  • ‘knn’: k-nearest neighbor estimator. See hoi.core.entropy_knn()

  • ‘kernel’: kernel-based estimator of entropy. See hoi.core.entropy_kernel()

  • A custom entropy estimator can be provided. It should be a callable function written with Jax, taking a single 2D input of shape (n_features, n_samples) and returning a float.

samples : np.ndarray

List of samples to use to compute HOI. If None, all samples are used.

tau : int | 1

The length of the delay to use to compute the redundancy as defined in the phiID. Default is 1.

direction_axis : {0, 2}

The axis along which to consider the evolution: 0 for the samples axis, 2 for the variables axis. Default is 0.

kwargs : dict | {}

Additional arguments passed to each MI function.

Returns:
hoi : array_like

The NumPy array containing values of higher-order interactions of shape (n_multiplets, n_variables)
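The custom-estimator contract described above (a Jax callable taking a 2D array of shape (n_features, n_samples) and returning a float) can be sketched as follows. The name `entropy_gauss_custom` is hypothetical and this is not part of the hoi API; it computes the entropy of a multivariate Gaussian from the sample covariance:

```python
import jax
import jax.numpy as jnp

def entropy_gauss_custom(x):
    """Hypothetical custom estimator: multivariate Gaussian entropy (nats)
    from the sample covariance of x, where x has shape
    (n_features, n_samples). Returns a scalar."""
    n_feat = x.shape[0]
    cov = jnp.atleast_2d(jnp.cov(x))      # (n_features, n_features)
    _, logdet = jnp.linalg.slogdet(cov)   # log-determinant of covariance
    # H = 0.5 * (d * log(2*pi*e) + log|Sigma|)
    return 0.5 * (n_feat * jnp.log(2.0 * jnp.pi * jnp.e) + logdet)

# sanity check on standard-normal data; the analytic value for an
# identity covariance is 0.5 * d * log(2*pi*e)
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (2, 20_000))
h = float(entropy_gauss_custom(x))
```

Such a callable could then, if it follows the contract stated in the docs, be passed as the `method` argument of fit() in place of one of the named estimators.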

get_combinations(minsize, maxsize=None, astype='jax')#

Get combinations of features.

Parameters:
minsize : int

Minimum size of the multiplets

maxsize : int | None

Maximum size of the multiplets. If None, minsize is used.

astype : {‘jax’, ‘numpy’, ‘iterator’}

Specify the output type. Use either ‘jax’ to get the data as a jax array [default], ‘numpy’ for a NumPy array, or ‘iterator’.

Returns:
combinations : array_like

Combinations of features.

property multiplets#

Indices of the multiplets of shape (n_mult, maxsize).

By convention, -1 indicates that a feature has been ignored.

property order#

Order of each multiplet of shape (n_mult,).

property undersampling#

Under-sampling threshold.

Examples using hoi.metrics.RedundancyphiID#

Integrated Information Decomposition