Packages

scalation.analytics

Probability

object Probability extends Error

The Probability object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass function (pmf), stored either as a "probability vector" for a univariate distribution or as a "probability matrix" for a bivariate distribution.

joint probability matrix:       pxy(i, j)  = P(X = x_i, Y = y_j)
marginal probability vector:    px(i)      = P(X = x_i)
conditional probability matrix: px_y(i, j) = P(X = x_i | Y = y_j)

In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness: if there is little randomness, entropy will be close to 0, while when randomness is high, entropy will be close to its maximum, e.g., 'log2 (px.dim)' for a uniform distribution over 'px.dim' outcomes. Mutual information provides a robust measure of the dependency between random variables (contrast with correlation, which only captures linear dependence).

See also

scalation.stat.StatVector
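
The following rough usage sketch combines the methods documented below. The concrete matrix class 'MatrixD' and its dimensioned constructor are assumed to come from 'scalation.linalgebra'; they are not described on this page, so constructor details may differ by version.

    import scalation.analytics.Probability._
    import scalation.linalgebra.MatrixD

    // hypothetical joint pmf for two binary random variables X and Y
    val pxy = new MatrixD ((2, 2), 0.1, 0.3,
                                   0.2, 0.4)

    if (isProbability (pxy)) {
        val px   = margProbX (pxy)                    // marginal pmf of X
        val py   = margProbY (pxy)                    // marginal pmf of Y
        val px_y = condProbX_Y (pxy)                  // P(X = x_i | Y = y_j)
        println ("H(X)    = " + entropy (px))         // entropy of X
        println ("H(X, Y) = " + entropy (pxy))        // joint entropy of X and Y
        println ("H(X|Y)  = " + entropy (pxy, px_y))  // conditional entropy of X given Y
        println ("I(X; Y) = " + muInfo (pxy))         // mutual information of X and Y
    } // if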

Linear Supertypes
Error, AnyRef, Any

Value Members

  1. def condProbX_Y(pxy: MatriD): MatriD

    Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'X' given random variable 'Y', i.e., P(X = x_i | Y = y_j).

    pxy

    the joint probability matrix
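
    In terms of the definitions above, each entry of the result divides the joint probability by the marginal probability of 'Y' for that column: px_y(i, j) = pxy(i, j) / py(j). A minimal plain-Scala sketch of this computation over 'Array's, for illustration only (not the library's implementation):

        // px_y(i)(j) = pxy(i)(j) / py(j), where py(j) is the j-th column sum of pxy
        def condProbX_Y (pxy: Array [Array [Double]]): Array [Array [Double]] = {
          val py = pxy.transpose.map (_.sum)                     // marginal pmf of Y
          pxy.map (row => row.zip (py).map { case (p, q) => p / q })
        } // condProbX_Y

        condProbX_Y (Array (Array (0.1, 0.3),
                            Array (0.2, 0.4)))                   // each column now sums to one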

  2. def condProbY_X(pxy: MatriD): MatriD

    Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'Y' given random variable 'X', i.e., P(Y = y_j | X = x_i).

    pxy

    the joint probability matrix

  3. def entropy(pxy: MatriD, px_y: MatriD): Double

    Given a joint probability matrix 'pxy' and a conditional probability matrix 'px_y', compute the "conditional entropy" of random variable 'X' given random variable 'Y'.

    pxy

    the joint probability matrix

    px_y

    the conditional probability matrix
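
    The quantity computed is H(X | Y) = -Σ_i Σ_j pxy(i, j) log2 px_y(i, j). A plain-Scala sketch of the formula (skipping zero cells to avoid log 0, a common convention; this is an illustration, not the library code):

        import scala.math.log
        def log2 (x: Double): Double = log (x) / log (2.0)

        // H(X|Y) = - sum over all cells of pxy(i)(j) * log2 (px_y(i)(j))
        def condEntropy (pxy: Array [Array [Double]], px_y: Array [Array [Double]]): Double = {
          var sum = 0.0
          for (i <- pxy.indices; j <- pxy(i).indices if pxy(i)(j) > 0.0) {
            sum -= pxy(i)(j) * log2 (px_y(i)(j))
          } // for
          sum
        } // condEntropy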

  4. def entropy(pxy: MatriD): Double

    Given a joint probability matrix 'pxy', compute the "joint entropy" of random variables 'X' and 'Y'.

    pxy

    the joint probability matrix

  5. def entropy(px: VectoD): Double

    Given a probability vector 'px', compute the "entropy" of random variable 'X'.

    px

    the probability vector

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
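
    The underlying formula is Shannon entropy, H(X) = -Σ_i px(i) log2 px(i). A self-contained sketch over a plain 'Array [Double]', for illustration only:

        import scala.math.log
        def log2 (x: Double): Double = log (x) / log (2.0)

        // Shannon entropy in bits; zero-probability outcomes contribute nothing
        def entropy (px: Array [Double]): Double = {
          var sum = 0.0
          for (p <- px if p > 0.0) sum -= p * log2 (p)
          sum
        } // entropy

        entropy (Array (0.5, 0.5))   // 1.0: maximum for two outcomes
        entropy (Array (1.0, 0.0))   // 0.0: no randomness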

  6. def entropy_k(px: VectoD): Double

    Given a probability vector 'px', compute the "base-k entropy" of random variable 'X'.

    px

    the probability vector

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
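
    Base-k entropy replaces log2 with log base k in the formula above. Assuming 'k' is taken to be 'px.dim' (an assumption, not stated here), this normalizes the maximum entropy to 1. A sketch:

        import scala.math.log

        // entropy in base k, where log_k (x) = log (x) / log (k)
        def entropy_k (px: Array [Double]): Double = {
          val k = px.length                            // assumed base: the number of outcomes
          var sum = 0.0
          for (p <- px if p > 0.0) sum -= p * log (p) / log (k)
          sum
        } // entropy_k

        entropy_k (Array (0.25, 0.25, 0.25, 0.25))     // 1.0 for a uniform pmf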

  7. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  8. def isProbability(pxy: MatriD): Boolean

    Determine whether the matrix 'pxy' is a legitimate joint "probability matrix". The elements of the matrix must be non-negative and add to one.

    pxy

    the probability matrix

  9. def isProbability(px: VectoD): Boolean

    Determine whether the vector 'px' is a legitimate "probability vector". The elements of the vector must be non-negative and add to one.

    px

    the probability vector
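
    A sketch of the two checks described above, non-negative elements that sum to one (a small tolerance for floating-point round-off is assumed here; the library may handle this differently):

        // legitimate pmf: every element non-negative and the total (close to) one
        def isProbability (px: Array [Double], tol: Double = 1e-9): Boolean =
          px.forall (_ >= 0.0) && math.abs (px.sum - 1.0) < tol

        isProbability (Array (0.2, 0.3, 0.5))    // true
        isProbability (Array (0.6, 0.6, -0.2))   // false: negative element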

  10. def jointProbXY(px: VectoD, py: VectoD): MatriD

    Given two independent random variables 'X' and 'Y', compute their "joint probability", which is the outer product of their probability vectors 'px' and 'py', i.e., P(X = x_i, Y = y_j).

    px

    the probability vector for 'X'

    py

    the probability vector for 'Y'
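
    Under independence the joint pmf factors as pxy(i, j) = px(i) py(j), i.e., the outer product of the two probability vectors. A plain-Scala sketch, for illustration only:

        // outer product: pxy(i)(j) = px(i) * py(j); valid only when X and Y are independent
        def jointProbXY (px: Array [Double], py: Array [Double]): Array [Array [Double]] =
          px.map (p => py.map (p * _))

        jointProbXY (Array (0.5, 0.5), Array (0.3, 0.7))
        // Array (Array (0.15, 0.35), Array (0.15, 0.35))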

  11. def margProbX(pxy: MatriD): VectoD

    Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'X', i.e., P(X = x_i).

    pxy

    the probability matrix
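
    Marginalizing out 'Y' amounts to summing each row of the joint matrix: px(i) = Σ_j pxy(i, j) ('margProbY' below is the analogous column sum). A sketch over plain arrays, for illustration only:

        // px(i) = sum over j of pxy(i)(j): the row sums of the joint probability matrix
        def margProbX (pxy: Array [Array [Double]]): Array [Double] = pxy.map (_.sum)

        margProbX (Array (Array (0.1, 0.3),
                          Array (0.2, 0.4)))   // Array (0.4, 0.6)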

  12. def margProbY(pxy: MatriD): VectoD

    Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'Y', i.e., P(Y = y_j).

    pxy

    the probability matrix

  13. def muInfo(pxy: MatriD): Double

    Given a joint probability matrix 'pxy', compute the mutual information for random variables 'X' and 'Y'.

    pxy

    the probability matrix
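
    The quantity computed is I(X; Y) = Σ_i Σ_j pxy(i, j) log2 ( pxy(i, j) / (px(i) py(j)) ), which is 0 when 'X' and 'Y' are independent and, in general, equals H(X) + H(Y) - H(X, Y). A plain-Scala sketch of the formula, for illustration only:

        import scala.math.log
        def log2 (x: Double): Double = log (x) / log (2.0)

        // I(X;Y) = sum of pxy(i)(j) * log2 (pxy(i)(j) / (px(i) * py(j))), skipping zero cells
        def muInfo (pxy: Array [Array [Double]]): Double = {
          val px = pxy.map (_.sum)                     // marginal pmf of X (row sums)
          val py = pxy.transpose.map (_.sum)           // marginal pmf of Y (column sums)
          var sum = 0.0
          for (i <- pxy.indices; j <- pxy(i).indices if pxy(i)(j) > 0.0) {
            sum += pxy(i)(j) * log2 (pxy(i)(j) / (px(i) * py(j)))
          } // for
          sum
        } // muInfo

        muInfo (Array (Array (0.25, 0.25),
                       Array (0.25, 0.25)))            // 0.0: X and Y independent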