JavaMI
Class Entropy

java.lang.Object
  extended by JavaMI.Entropy

public abstract class Entropy
extends java.lang.Object

Implements common discrete Shannon entropy functions. Provides the univariate entropy H(X), the conditional entropy H(X|Y), and the joint entropy H(X,Y). The logarithm defaults to base 2, so all entropies are reported in bits.
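As a quick orientation, here is a minimal sketch that exercises all three functions on one small sample and checks the chain rule H(X,Y) = H(Y) + H(X|Y), which the histogram estimates should satisfy up to floating-point rounding. It assumes the JavaMI jar is on the classpath; the data values are arbitrary illustrative choices.

    import JavaMI.Entropy;

    public class EntropyOverviewExample {
        public static void main(String[] args) {
            // Two small discrete variables; values are floored before counting.
            double[] x = {0, 0, 1, 1, 2, 2, 0, 1};
            double[] y = {0, 1, 0, 1, 0, 1, 0, 1};

            double hX   = Entropy.calculateEntropy(x);                // H(X) in bits
            double hY   = Entropy.calculateEntropy(y);                // H(Y) in bits
            double hXY  = Entropy.calculateJointEntropy(x, y);        // H(X,Y)
            double hXgY = Entropy.calculateConditionalEntropy(x, y);  // H(X|Y)

            System.out.println("H(X)   = " + hX);
            System.out.println("H(X,Y) = " + hXY);
            // Chain rule for the plug-in (histogram) estimates:
            // H(X,Y) = H(Y) + H(X|Y), up to floating-point rounding.
            System.out.println("chain rule gap = " + Math.abs(hXY - (hY + hXgY)));
        }
    }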


Field Summary
static double LOG_BASE
          Controls the base of the logarithm used in the entropy calculations; by default the logarithm is base 2, so entropies are measured in bits.
 
Method Summary
static double calculateConditionalEntropy(double[] dataVector, double[] conditionVector)
          Calculates the conditional entropy H(X|Y) from two vectors.
static double calculateEntropy(double[] dataVector)
          Calculates the univariate entropy H(X) from a vector.
static double calculateJointEntropy(double[] firstVector, double[] secondVector)
          Calculates the joint entropy H(X,Y) from two vectors.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

LOG_BASE

public static double LOG_BASE
Controls the base of the logarithm used in the entropy calculations. By default the logarithm is base 2, so all entropies are measured in bits.

Method Detail

calculateEntropy

public static double calculateEntropy(double[] dataVector)
Calculates the univariate entropy H(X) from a vector. Uses histograms to estimate the probability distributions, and thus the entropy. The entropy is bounded 0 ≤ H(X) ≤ log |X|, where log |X| is the log of the number of states in the random variable X.

Parameters:
dataVector - Input vector (X). Each value is discretised to its floor before the calculation.
Returns:
The entropy H(X).
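
A minimal usage sketch, assuming the JavaMI jar is on the classpath; the data values are illustrative only. The first vector has four states, each appearing twice, so the histogram estimate should be close to log2(4) = 2 bits.

    import JavaMI.Entropy;

    public class UnivariateEntropyExample {
        public static void main(String[] args) {
            // Four states {0,1,2,3}, each occurring twice -> uniform histogram, ~2 bits.
            double[] x = {0, 1, 2, 3, 0, 1, 2, 3};
            System.out.println("H(X) = " + Entropy.calculateEntropy(x));

            // Values are floored first: {0.3, 1.2, 1.9, 0.8} becomes {0, 1, 1, 0},
            // i.e. two equally likely states, so the estimate should be ~1 bit.
            double[] fractional = {0.3, 1.2, 1.9, 0.8};
            System.out.println("H(floor(X)) = " + Entropy.calculateEntropy(fractional));
        }
    }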

calculateConditionalEntropy

public static double calculateConditionalEntropy(double[] dataVector,
                                                 double[] conditionVector)
Calculates the conditional entropy H(X|Y) from two vectors. X = dataVector, Y = conditionVector. Uses histograms to estimate the probability distributions, and thus the entropy. The conditional entropy is bounded 0 ≤ H(X|Y) ≤ H(X).

Parameters:
dataVector - Input vector (X). Each value is discretised to its floor before the calculation.
conditionVector - Input vector (Y). Each value is discretised to its floor before the calculation.
Returns:
The conditional entropy H(X|Y).
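
A short sketch of the two boundary cases, under the same classpath assumption: when Y determines X the estimate should be near 0, and when Y is constant it should be near H(X).

    import JavaMI.Entropy;

    public class ConditionalEntropyExample {
        public static void main(String[] args) {
            double[] x = {0, 0, 1, 1, 0, 0, 1, 1};

            // Y identical to X: knowing Y removes all uncertainty, so H(X|Y) ~ 0.
            double[] yCopy = {0, 0, 1, 1, 0, 0, 1, 1};
            System.out.println("H(X|Y=X)     = " + Entropy.calculateConditionalEntropy(x, yCopy));

            // Constant Y carries no information, so H(X|Y) ~ H(X).
            double[] yConst = {0, 0, 0, 0, 0, 0, 0, 0};
            System.out.println("H(X|const Y) = " + Entropy.calculateConditionalEntropy(x, yConst));
            System.out.println("H(X)         = " + Entropy.calculateEntropy(x));
        }
    }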

calculateJointEntropy

public static double calculateJointEntropy(double[] firstVector,
                                           double[] secondVector)
Calculates the joint entropy H(X,Y) from two vectors. The order of the input vectors is irrelevant. Uses histograms to estimate the probability distributions, and thus the entropy. The joint entropy is bounded 0 ≤ H(X,Y) ≤ log |XY|, where log |XY| is the log of the number of states in the joint random variable XY.

Parameters:
firstVector - Input vector. Each value is discretised to its floor before the calculation.
secondVector - Input vector. Each value is discretised to its floor before the calculation.
Returns:
The joint entropy H(X,Y).
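
A short sketch of the joint entropy and its symmetry in the argument order, again assuming the JavaMI jar is on the classpath. With the vectors below, the four joint states (x,y) each occur once, so the estimate should be close to 2 bits.

    import JavaMI.Entropy;

    public class JointEntropyExample {
        public static void main(String[] args) {
            double[] x = {0, 0, 1, 1};
            double[] y = {0, 1, 0, 1};

            // Four distinct joint states, each with probability 1/4 -> ~2 bits.
            System.out.println("H(X,Y) = " + Entropy.calculateJointEntropy(x, y));

            // The argument order is irrelevant: H(X,Y) = H(Y,X).
            System.out.println("H(Y,X) = " + Entropy.calculateJointEntropy(y, x));
        }
    }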