JavaMI
Class MutualInformation

java.lang.Object
  extended by JavaMI.MutualInformation

public abstract class MutualInformation
extends java.lang.Object

Implements common discrete Mutual Information functions. Provides: Mutual Information I(X;Y), Conditional Mutual Information I(X;Y|Z). Defaults to log_2, and so the mutual information is calculated in bits.


Method Summary
static double calculateConditionalMutualInformation(double[] firstVector, double[] secondVector, double[] conditionVector)
          Calculates the conditional Mutual Information I(X;Y|Z) between two random variables, conditioned on a third.
static double calculateMutualInformation(double[] firstVector, double[] secondVector)
          Calculates the Mutual Information I(X;Y) between two random variables.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Method Detail

calculateMutualInformation

public static double calculateMutualInformation(double[] firstVector,
                                                double[] secondVector)
Calculates the Mutual Information I(X;Y) between two random variables. Uses histograms to estimate the probability distributions, and thus the information. The mutual information is bounded 0 ≤ I(X;Y) ≤ min(H(X),H(Y)). It is also symmetric, so I(X;Y) = I(Y;X).

Parameters:
firstVector - Input vector (X). It is discretised to the floor of each value before calculation.
secondVector - Input vector (Y). It is discretised to the floor of each value before calculation.
Returns:
The Mutual Information I(X;Y).
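
A minimal usage sketch (the class and method names follow the signatures above; the sample data and expected value are illustrative assumptions, not taken from the library):

    import JavaMI.MutualInformation;

    public class MIExample {
        public static void main(String[] args) {
            // Illustrative data: Y is an exact copy of X, so I(X;Y) = H(X).
            // X takes the three values {0,1,2} with equal frequency, so
            // H(X) = log_2(3), roughly 1.585 bits.
            double[] x = {0.0, 0.0, 1.0, 1.0, 2.0, 2.0};
            double[] y = {0.0, 0.0, 1.0, 1.0, 2.0, 2.0};

            double mi = MutualInformation.calculateMutualInformation(x, y);
            System.out.println("I(X;Y) = " + mi + " bits"); // expected: about 1.585
        }
    }

Because the inputs are floored before the histograms are built, passing continuous-valued data (e.g. 1.2, 1.7) bins both to 1.0; pre-discretise the data yourself if another binning is wanted.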

calculateConditionalMutualInformation

public static double calculateConditionalMutualInformation(double[] firstVector,
                                                           double[] secondVector,
                                                           double[] conditionVector)
Calculates the conditional Mutual Information I(X;Y|Z) between two random variables, conditioned on a third. Uses histograms to estimate the probability distributions, and thus the information. The conditional mutual information is bounded 0 ≤ I(X;Y|Z) ≤ min(H(X|Z),H(Y|Z)). It is also symmetric, so I(X;Y|Z) = I(Y;X|Z).

Parameters:
firstVector - Input vector (X). It is discretised to the floor of each value before calculation.
secondVector - Input vector (Y). It is discretised to the floor of each value before calculation.
conditionVector - Input vector (Z). It is discretised to the floor of each value before calculation.
Returns:
The conditional Mutual Information I(X;Y|Z).
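
A minimal usage sketch (class and method names as documented above; the XOR construction is an illustrative assumption, chosen because it makes I(X;Y|Z) differ sharply from I(X;Y)):

    import JavaMI.MutualInformation;

    public class CMIExample {
        public static void main(String[] args) {
            // Illustrative data: X and Y are independent fair bits, and
            // Z = X XOR Y. Unconditionally I(X;Y) = 0, but once Z is known
            // each of X and Y determines the other, so I(X;Y|Z) = 1 bit.
            double[] x = {0.0, 0.0, 1.0, 1.0};
            double[] y = {0.0, 1.0, 0.0, 1.0};
            double[] z = {0.0, 1.0, 1.0, 0.0}; // z[i] = x[i] XOR y[i]

            double cmi = MutualInformation.calculateConditionalMutualInformation(x, y, z);
            System.out.println("I(X;Y|Z) = " + cmi + " bits"); // expected: 1.0

            double mi = MutualInformation.calculateMutualInformation(x, y);
            System.out.println("I(X;Y)   = " + mi + " bits"); // expected: 0.0
        }
    }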