Indices based on information theory

Indices based on information theory, such as the entropy and the mutual information, can easily be computed. To this end, the ecological network is transformed into a bivariate distribution by normalizing the adjacency or incidence matrix into a joint probability matrix. The information-theoretic indices are computed either from this matrix or directly from the ecological network. Note that when an array is used as input, the functions do not check whether the matrix is normalized and nonnegative. When the input is an ecological network, the functions automatically convert the network to a normalized probability matrix.

One can compute individual indices or use the function information_decomposition, which performs the entire decomposition at once. For a given network, this decomposition yields the deviation of the marginal distributions of the species from the uniform distribution (quantifying the evenness), the mutual information (quantifying the specialisation) and the variation of information (quantifying the freedom and stability of the interactions). These indices satisfy the following balance equations for the bottom ($B$) and top ($T$) trophic levels:

\[\log(nm) = D(B,T) + 2 I(B;T) + V(B;T)\]

\[\log(n) = D(B) + I(B;T) + H(B|T)\]

\[\log(m) = D(T) + I(B;T) + H(T|B)\]

Here, $n$ and $m$ are the number of bottom and top species, respectively.

Indices can be calculated for the joint distribution, as well as for the marginal distributions of the two trophic levels (if applicable), by setting the optional dims argument of the function (dims=1 for the rows, dims=2 for the columns).
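As an illustration, the balance equation for the whole network can be verified from the individual indices. The following is a minimal sketch; the 3 × 4 incidence matrix and its normalization are made up for the example, only the index functions come from the package.

```julia
using EcologicalNetworks

# hypothetical 3 × 4 incidence matrix (3 bottom and 4 top species)
A = [1 1 0 0;
     0 1 1 0;
     0 0 1 1]
P = A ./ sum(A)                # normalize to a joint probability matrix

D = diff_entropy_uniform(P)    # deviation from uniform marginals (evenness)
MI = mutual_information(P)     # specialisation
VI = variation_information(P)  # freedom and stability of the interactions

D + 2MI + VI ≈ log2(3 * 4)     # balance equation for the whole network; true

entropy(P; dims=1)             # marginal entropy of the rows
entropy(P; dims=2)             # marginal entropy of the columns
```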

Network conversion

EcologicalNetworks.make_joint_distribution - Function
make_joint_distribution(N::NT) where {NT<:AbstractEcologicalNetwork}

Returns a probability matrix computed from the adjacency or incidence matrix. Raises an error if the matrix contains negative values.

source
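For example, assuming a BipartiteNetwork built from a Boolean incidence matrix (the constructor and matrix below are only for illustration):

```julia
using EcologicalNetworks

A = Bool[1 1 0; 0 1 1]           # hypothetical 2 × 3 incidence matrix
N = BipartiteNetwork(A)
P = make_joint_distribution(N)   # joint probability matrix
sum(P) ≈ 1.0                     # entries sum to one
```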

Indices

EcologicalNetworks.entropy - Function
entropy(P::AbstractArray; [dims])

Computes the joint entropy of a probability matrix. Does not check whether the matrix is normalized. Output in bits.

If the dims keyword argument is provided, the marginal entropy of the matrix is computed. dims indicates whether to compute the entropy for the rows (dims=1) or columns (dims=2).

source
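A minimal example on a hand-made probability matrix:

```julia
using EcologicalNetworks

P = fill(0.25, 2, 2)   # uniform joint distribution over four interactions
entropy(P)             # 2.0 bits: joint entropy
entropy(P; dims=1)     # 1.0 bit: marginal entropy of the rows
```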
entropy(N::AbstractEcologicalNetwork; [dims])

Computes the joint entropy of an ecological network. If dims is specified, the marginal entropy of the ecological network is computed. dims indicates whether to compute the entropy for the rows (dims=1) or columns (dims=2). Output in bits.

source
EcologicalNetworks.conditional_entropy - Function
conditional_entropy(P::AbstractArray, given::I)

Computes the conditional entropy of a probability matrix. If given = 1, it is the entropy of the columns, and vice versa when given = 2. Output in bits.

Does not check whether P is a valid probability matrix.

source
conditional_entropy(N::AbstractEcologicalNetwork, given::I)

Computes the conditional entropy of an ecological network. If given = 1, it is the entropy of the columns, and vice versa when given = 2.

source
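For a uniform 2 × 2 probability matrix, for instance, both conditional entropies equal one bit:

```julia
using EcologicalNetworks

P = fill(0.25, 2, 2)
conditional_entropy(P, 1)   # 1.0 bit: entropy of the columns
conditional_entropy(P, 2)   # 1.0 bit: entropy of the rows
```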
EcologicalNetworks.mutual_information - Function
mutual_information(P::AbstractArray)

Computes the mutual information of a probability matrix. Output in bits.

source
mutual_information(N::NT) where {NT<:AbstractEcologicalNetwork}

Computes the mutual information of an ecological network. Output in bits.

source
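As a sketch, two extreme probability matrices (made up for the example; zero entries follow the usual 0 log 0 = 0 convention):

```julia
using EcologicalNetworks

mutual_information(fill(0.25, 2, 2))     # 0.0 bits: rows and columns are independent
mutual_information([0.5 0.0; 0.0 0.5])   # 1.0 bit: perfectly specialised interactions
```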
EcologicalNetworks.variation_information - Function
variation_information(P::AbstractArray)

Computes the variation of information of a probability matrix. Output in bits.

source
variation_information(N::AbstractEcologicalNetwork)

Computes the variation of information of an ecological network. Output in bits.

source
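From the balance equations above, the variation of information is the sum of the two conditional entropies, V(B;T) = H(B|T) + H(T|B), which can be checked on a small example matrix:

```julia
using EcologicalNetworks

P = [0.4 0.1; 0.1 0.4]
variation_information(P)                                # ≈ 1.44 bits
conditional_entropy(P, 1) + conditional_entropy(P, 2)   # same value
```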
EcologicalNetworks.potential_information - Function
potential_information(N::NT; [dims])

Computes the maximal potential information of a network, corresponding to every species interacting with every other species. The result for the marginals can be computed using the optional dims parameter. Output in bits.

source
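For a network with n bottom and m top species, the potential information of the whole network is log(nm) in bits. A sketch, again assuming a hypothetical BipartiteNetwork built from a Boolean matrix:

```julia
using EcologicalNetworks

A = Bool[1 1 0 0;
         0 1 1 0;
         0 0 1 1]          # hypothetical 3 × 4 network
N = BipartiteNetwork(A)
potential_information(N)   # log2(3 * 4) ≈ 3.585 bits
```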
EcologicalNetworks.diff_entropy_uniform - Function
diff_entropy_uniform(P::AbstractArray; [dims])

Computes the difference in entropy of the marginals compared to the entropy of a uniform distribution. The parameter dims indicates which marginals are used; both are used if no value is provided. Output in bits.

source
diff_entropy_uniform(N::AbstractEcologicalNetwork, dims::I=nothing)

Computes the difference in entropy of the marginals compared to the entropy of a uniform distribution. The parameter dims indicates which marginals are used; both are used if no value is provided. Output in bits.

source
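For instance, a probability matrix whose row and column marginals are both uniform shows no deviation:

```julia
using EcologicalNetworks

P = [0.4 0.1; 0.1 0.4]            # both marginals are (0.5, 0.5)
diff_entropy_uniform(P)           # 0.0 bits: marginals coincide with the uniform distribution
diff_entropy_uniform(P; dims=1)   # 0.0 bits when using only the row marginals
```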

Decomposition

EcologicalNetworks.information_decomposition - Function
information_decomposition(N::AbstractEcologicalNetwork; norm::Bool=false, dims=nothing)

Performs an information theory decomposition of a given ecological network, i.e. the information content in the normalized adjacency matrix is split into:

  • :D : difference in entropy of the marginals compared to a uniform distribution
  • :I : mutual information
  • :V : variation of information / conditional entropy

If norm=true, the components are normalized such that their sum is equal to 1. One can optionally give the dimension, indicating whether to compute the indices for the rows (dims=1), the columns (dims=2) or the whole matrix (default).

The result is returned as a Dict. Output in bits.

Stock, M.; Hoebeke, L.; De Baets, B. "Disentangling the Information in Species Interaction Networks". Entropy 2021, 23, 703. https://doi.org/10.3390/e23060703

source
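A minimal usage sketch (the BipartiteNetwork construction is hypothetical; keys and keyword arguments follow the docstring above):

```julia
using EcologicalNetworks

A = Bool[1 1 0 0;
         0 1 1 0;
         0 0 1 1]
N = BipartiteNetwork(A)

dec = information_decomposition(N)                  # Dict with keys :D, :I and :V, in bits
dec_norm = information_decomposition(N; norm=true)
dec_norm[:D] + dec_norm[:I] + dec_norm[:V] ≈ 1.0    # normalized components sum to one

information_decomposition(N; dims=1)                # decomposition for the row marginals
```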

Effective interactions

EcologicalNetworks.convert2effective - Function
convert2effective(indice::Real)

Converts an information theory index into an effective number (i.e. the number of corresponding interactions). Assumes an input in bits (i.e. log with base 2 is used).

source
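Assuming the standard convention that an index of x bits corresponds to an effective number of 2^x interactions:

```julia
using EcologicalNetworks

convert2effective(1.0)   # 2.0: one bit corresponds to two effective interactions
convert2effective(3.0)   # 8.0
```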

References

Stock, M.; Hoebeke, L.; De Baets, B. "Disentangling the Information in Species Interaction Networks". Entropy 2021, 23, 703. https://doi.org/10.3390/e23060703