Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'X' given random variable 'Y', i.
Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'X' given random variable 'Y', i.e, P(X = x_i|Y = y_j).
the joint probability matrix
Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'Y' given random variable 'X', i.
Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'Y' given random variable 'X', i.e, P(Y = y_j|X = x_i).
the joint probability matrix
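As a rough illustration of both computations, here is a minimal sketch using plain Scala arrays rather than the library's matrix types (rows index x_i, columns index y_j); the names condProbX_Y and condProbY_X are illustrative, not necessarily the library's own.

    // P(X = x_i|Y = y_j) = pxy(i, j) / P(Y = y_j): divide each column by its sum
    def condProbX_Y (pxy: Array [Array [Double]]): Array [Array [Double]] = {
        val py = Array.tabulate (pxy(0).length) (j => pxy.map (_(j)).sum)      // column sums = marginal of Y
        Array.tabulate (pxy.length, pxy(0).length) ((i, j) => pxy(i)(j) / py(j))
    }

    // P(Y = y_j|X = x_i) = pxy(i, j) / P(X = x_i): divide each row by its sum
    def condProbY_X (pxy: Array [Array [Double]]): Array [Array [Double]] = {
        val px = pxy.map (_.sum)                                               // row sums = marginal of X
        Array.tabulate (pxy.length, pxy(0).length) ((i, j) => pxy(i)(j) / px(i))
    }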
Given a joint probability matrix 'pxy' and a conditional probability matrix 'py_x', compute the "conditional entropy" of random variable 'X' given random variable 'Y'.
the joint probability matrix
the conditional probability matrix
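A minimal sketch of the calculation, assuming the conditional matrix holds P(X = x_i|Y = y_j) and using plain Scala arrays; the name condEntropySketch is made up for illustration.

    import scala.math.log

    // H(X|Y) = - sum_{i,j} pxy(i, j) * log2 (px_y(i, j)), skipping zero cells
    def condEntropySketch (pxy: Array [Array [Double]], px_y: Array [Array [Double]]): Double = {
        val lg2 = log (2.0)
        var sum = 0.0
        for (i <- pxy.indices; j <- pxy(i).indices if pxy(i)(j) > 0.0) {
            sum -= pxy(i)(j) * log (px_y(i)(j)) / lg2
        }
        sum
    }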
Given a joint probability matrix 'pxy', compute the "joint entropy" of random variables 'X' and 'Y'.
Given a joint probability matrix 'pxy', compute the "joint entropy" of random variables 'X' and 'Y'.
the joint probability matrix
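A minimal sketch of the formula H(X, Y) = - sum_{i,j} pxy(i, j) * log2 (pxy(i, j)), with the convention 0 log 0 = 0; the name is illustrative.

    import scala.math.log

    def jointEntropySketch (pxy: Array [Array [Double]]): Double = {
        val lg2 = log (2.0)
        val h   = pxy.flatten.filter (_ > 0.0).map (p => p * log (p) / lg2).sum
        -h
    }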
Given a probability vector 'px', compute the "entropy" of random variable 'X'.
Given a probability vector 'px', compute the "entropy" of random variable 'X'.
the probability vector
http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
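A minimal sketch of the formula H(X) = - sum_i px(i) * log2 (px(i)), again with 0 log 0 = 0; the name entropySketch is illustrative.

    import scala.math.log

    def entropySketch (px: Array [Double]): Double = {
        val lg2 = log (2.0)
        val h   = px.filter (_ > 0.0).map (p => p * log (p) / lg2).sum
        -h
    }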
Given a probability vector 'px', compute the "base-k entropy" of random variable 'X'.
Given a probability vector 'px', compute the "base-k entropy" of random variable 'X'.
the probability vector
http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
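A minimal sketch using the change of base log_k (p) = ln (p) / ln (k); the name and parameter 'k' are illustrative assumptions.

    import scala.math.log

    // base-k entropy: H_k(X) = - sum_i px(i) * log_k (px(i))
    def entropyKSketch (px: Array [Double], k: Int): Double = {
        val lgk = log (k.toDouble)
        val h   = px.filter (_ > 0.0).map (p => p * log (p) / lgk).sum
        -h
    }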
Show the flaw by printing the error message.
the method where the error occurred
the error message
Determine whether the matrix 'pxy' is a legitimate joint "probability matrix". The elements of the matrix must be non-negative and add to one.
the probability matrix
Determine whether the vector 'px' is a legitimate "probability vector". The elements of the vector must be non-negative and add to one.
the probability vector
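Both checks amount to verifying non-negativity and a sum of one, up to round-off. A minimal sketch, where the names and the tolerance value are illustrative assumptions:

    val TOL = 1E-9                                         // tolerance for round-off (illustrative)

    def isProbVector (px: Array [Double]): Boolean =
        px.forall (_ >= 0.0) && math.abs (px.sum - 1.0) < TOL

    def isProbMatrix (pxy: Array [Array [Double]]): Boolean =
        pxy.forall (_.forall (_ >= 0.0)) && math.abs (pxy.map (_.sum).sum - 1.0) < TOL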
Given two independent random variables 'X' and 'Y', compute their "joint probability", which is the outer product of their probability vectors 'px' and 'py', i.
Given two independent random variables 'X' and 'Y', compute their "joint probability", which is the outer product of their probability vectors 'px' and 'py', i.e., P(X = x_i, Y = y_j).
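A minimal sketch of the outer product, pxy(i, j) = px(i) * py(j), which is valid only when X and Y are independent; the name is illustrative.

    def jointProbSketch (px: Array [Double], py: Array [Double]): Array [Array [Double]] =
        Array.tabulate (px.length, py.length) ((i, j) => px(i) * py(j))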
Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'X', i.
Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'X', i.e, P(X = x_i).
the probability matrix
Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'Y', i.
Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'Y', i.e, P(Y = y_j).
the probability matrix
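Both marginals just sum out the other variable: px(i) = sum_j pxy(i, j) and py(j) = sum_i pxy(i, j). A minimal sketch, with illustrative names:

    def margProbXSketch (pxy: Array [Array [Double]]): Array [Double] =
        pxy.map (_.sum)                                        // row sums: P(X = x_i)

    def margProbYSketch (pxy: Array [Array [Double]]): Array [Double] =
        Array.tabulate (pxy(0).length) (j => pxy.map (_(j)).sum)   // column sums: P(Y = y_j)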
Given a joint probability matrix 'pxy', compute the mutual information for random variables 'X' and 'Y'.
the probability matrix
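A minimal sketch of the formula I(X; Y) = sum_{i,j} pxy(i, j) * log2 (pxy(i, j) / (px(i) * py(j))), recomputing the marginals from the joint matrix and skipping zero cells; the name is illustrative.

    import scala.math.log

    def mutInfoSketch (pxy: Array [Array [Double]]): Double = {
        val lg2 = log (2.0)
        val px  = pxy.map (_.sum)                                            // marginal of X (row sums)
        val py  = Array.tabulate (pxy(0).length) (j => pxy.map (_(j)).sum)   // marginal of Y (column sums)
        var sum = 0.0
        for (i <- pxy.indices; j <- pxy(i).indices if pxy(i)(j) > 0.0) {
            sum += pxy(i)(j) * log (pxy(i)(j) / (px(i) * py(j))) / lg2
        }
        sum
    }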
The 'Probability' object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass function (pmf), stored either as a "probability vector" for a univariate distribution or a "probability matrix" for a bivariate distribution.
    joint probability matrix:        pxy(i, j)  = P(X = x_i, Y = y_j)
    marginal probability vector:     px(i)      = P(X = x_i)
    conditional probability matrix:  px_y(i, j) = P(X = x_i|Y = y_j)
In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness: if there is little randomness, the entropy will be close to 0, while if randomness is high, it will be close to its maximum, e.g., log2 (px.dim). Mutual information provides a robust measure of the dependency between random variables (contrast with correlation).
scalation.stat.StatVector
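A small end-to-end sketch, assuming the illustrative helpers defined above are in scope: it builds the joint pmf for two independent fair coins, then derives entropy and mutual information from it.

    @main def probDemo (): Unit = {
        val px  = Array (0.5, 0.5)                         // pmf of a fair coin X
        val py  = Array (0.5, 0.5)                         // pmf of a fair coin Y
        val pxy = jointProbSketch (px, py)                 // joint pmf, assuming independence
        println ("H(X)    = " + entropySketch (px))        // expect 1 bit
        println ("H(X, Y) = " + jointEntropySketch (pxy))  // expect 2 bits
        println ("I(X; Y) = " + mutInfoSketch (pxy))       // expect ~0, since X and Y are independent
    }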