scalation.stat

Probability

object Probability extends Error

The Probability object provides methods for operating on univariate and bivariate probability distributions of discrete random variables 'X' and 'Y'. A probability distribution is specified by its probability mass functions (pmf) stored either as a "probability vector" for a univariate distribution or a "probability matrix" for a bivariate distribution.

joint probability matrix:        pxy(i, j)  = P(X = x_i, Y = y_j)
marginal probability vector:     px(i)      = P(X = x_i)
conditional probability matrix:  px_y(i, j) = P(X = x_i | Y = y_j)

In addition to computing joint, marginal and conditional probabilities, methods for computing entropy and mutual information are also provided. Entropy provides a measure of disorder or randomness. If there is little randomness, entropy will be close to 0, while when randomness is high, entropy will be close to its maximum, e.g., 'log2 (px.dim)' for a uniform distribution. Mutual information provides a robust measure of dependency between random variables (contrast with correlation, which only captures linear dependency).
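As an illustrative sketch of the entropy extremes described above, here is the base-2 entropy computed in plain Scala (a 'Seq [Double]' stands in for ScalaTion's 'VectoD'; the helper names are hypothetical, not the library's source):

```scala
// Shannon entropy (base 2) of a discrete probability vector -- a plain-Scala sketch
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

def entropy (px: Seq [Double]): Double =
  px.filter (_ > 0.0).map (p => -p * log2 (p)).sum       // skip zero-probability terms

val certain = Seq (1.0, 0.0, 0.0, 0.0)                    // no randomness
val uniform = Seq (0.25, 0.25, 0.25, 0.25)                // maximum randomness

println (entropy (certain))                               // 0.0
println (entropy (uniform))                               // 2.0 = log2 (4)
```

Note that entropy ranges from 0 for a deterministic outcome up to log2 of the dimension for the uniform distribution.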

See also

scalation.stat.StatVector

Linear Supertypes
Error, AnyRef, Any

Value Members

  1. def centropy(px: VectoD, qx: VectoD): Double

Given probability vectors 'px' and 'qx', compute the "cross entropy".

    px

    the first probability vector

    qx

    the second probability vector (requires qx.dim >= px.dim)
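Cross entropy is H(px, qx) = -Σ_i px(i) · log2 qx(i). A minimal plain-Scala sketch (a 'Seq [Double]' stands in for 'VectoD'; this is illustrative, not ScalaTion's implementation):

```scala
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

// cross entropy of px relative to qx: -sum over i of px(i) * log2 (qx(i))
def centropy (px: Seq [Double], qx: Seq [Double]): Double =
  (px zip qx).collect { case (p, q) if p > 0.0 => -p * log2 (q) }.sum

val px = Seq (0.5, 0.5)
println (centropy (px, px))                   // 1.0: equals entropy (px) when qx = px
```

When 'qx' equals 'px', cross entropy reduces to ordinary entropy; otherwise it is strictly larger.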

  2. def condProbX_Y(pxy: MatriD, py_: VectoD = null): MatriD

Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'X' given random variable 'Y', i.e., P(X = x_i | Y = y_j).

    pxy

    the joint probability matrix

    py_

    the marginal probability vector for Y
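Each column of the result is the corresponding column of 'pxy' divided by the marginal P(Y = y_j). A plain-Scala sketch (a 'Seq [Seq [Double]]' stands in for 'MatriD'; illustrative only):

```scala
// conditional probability P(X = x_i | Y = y_j) = pxy(i, j) / py(j)
def condProbX_Y (pxy: Seq [Seq [Double]]): Seq [Seq [Double]] = {
  val py = pxy.transpose.map (_.sum)                     // marginal for Y (column sums)
  pxy.map (row => (row zip py).map { case (p, m) => p / m })
}

val pxy  = Seq (Seq (0.1, 0.2), Seq (0.3, 0.4))
val px_y = condProbX_Y (pxy)                             // each column now sums to 1
```

In the library version the marginal 'py_' may be passed in to avoid recomputing it.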

  3. def condProbY_X(pxy: MatriD, px_: VectoD = null): MatriD

Given a joint probability matrix 'pxy', compute the "conditional probability" for random variable 'Y' given random variable 'X', i.e., P(Y = y_j | X = x_i).

    pxy

    the joint probability matrix

    px_

    the marginal probability vector for X

  4. def count(x: VectoD, vl: Int, idx_: IndexedSeq[Int], cont: Boolean, thres: Double): Int

Count the total number of occurrences in vector 'x' of value 'vl', e.g., if 'x' is column 2 (Humidity) and 'vl' is 1 (High), it matches 7 rows. This method works for vectors with integer or continuous values.

    x

    the feature/column vector (e.g., column j of matrix)

    vl

    one of the possible branch values for feature x (e.g., 1 (High))

    idx_

    the index positions within x (if null, use all index positions)

    cont

    whether feature/variable x is to be treated as continuous

    thres

    the splitting threshold for features/variables treated as continuous

  5. def entropy(pxy: MatriD, px_y: MatriD): Double

Given a joint probability matrix 'pxy' and a conditional probability matrix 'px_y', compute the "conditional entropy" of random variable 'X' given random variable 'Y'.

    pxy

    the joint probability matrix

    px_y

    the conditional probability matrix

  6. def entropy(pxy: MatriD): Double

Given a joint probability matrix 'pxy', compute the "joint entropy" of random variables 'X' and 'Y'.

    pxy

    the joint probability matrix
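Joint entropy sums over all cells of the matrix: H(X, Y) = -Σ_i Σ_j pxy(i, j) · log2 pxy(i, j). A plain-Scala sketch ('Seq [Seq [Double]]' standing in for 'MatriD'):

```scala
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

// joint entropy over all cells of the joint probability matrix
def entropy (pxy: Seq [Seq [Double]]): Double =
  pxy.flatten.filter (_ > 0.0).map (p => -p * log2 (p)).sum

val pxy = Seq (Seq (0.25, 0.25), Seq (0.25, 0.25))        // uniform 2-by-2 joint
println (entropy (pxy))                                   // 2.0 bits
```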

  7. def entropy(px: VectoD, b: Int): Double

Given a probability vector 'px', compute the "base-'b' entropy" of random variable 'X'.

    px

    the probability vector

    b

    the base for the logarithm

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

  8. def entropy(nu: VectoI): Double

Given a frequency vector 'nu', compute the "entropy" of random variable 'X'.

    nu

    the frequency vector

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

  9. def entropy(px: VectoD): Double

Given a probability vector 'px', compute the "entropy" of random variable 'X'.

    px

    the probability vector

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

  10. final def flaw(method: String, message: String): Unit
    Definition Classes
    Error
  11. def frequency(x: VectoD, y: VectoI, k: Int, vl: Int, idx_: IndexedSeq[Int], cont: Boolean, thres: Double): (Double, VectoI)

Count the frequency of occurrence in vector 'x' of value 'vl' for each of 'y's classification values, e.g., 'x' is column 2 (Humidity), 'vl' is 1 (High) and 'y' can be 0 (no) or 1 (yes). Also, determine the fraction of training cases where the feature has this value (e.g., fraction where Humidity is High = 7/14). This method works for vectors with integer or continuous values.

    x

    the feature/column vector (e.g., column j of matrix)

    y

    the response/classification vector

    k

the number of distinct classification values for y, i.e., the maximum value of y plus one

    vl

    one of the possible branch values for feature x (e.g., 1 (High))

    idx_

    the index positions within x (if null, use all index positions)

    cont

    whether feature/variable x is to be treated as continuous

    thres

    the splitting threshold for features/variables treated as continuous

  12. def frequency(x: VectoI, y: VectoI, k: Int, vl: Int, idx_: IndexedSeq[Int]): (Double, VectoI)

Count the frequency of occurrence in vector 'x' of value 'vl' for each of 'y's classification values, e.g., 'x' is column 2 (Humidity), 'vl' is 1 (High) and 'y' can be 0 (no) or 1 (yes). Also, determine the fraction of training cases where the feature has this value (e.g., fraction where Humidity is High = 7/14).

    x

    the feature/column vector (e.g., column j of matrix)

    y

    the response/classification vector

    k

the number of distinct classification values for y, i.e., the maximum value of y plus one

    vl

    one of the possible branch values for feature x (e.g., 1 (High))

    idx_

    the index positions within x (if null, use all index positions)

  13. def frequency(y: VectoI, k: Int, idx_: IndexedSeq[Int] = null): VectoI

Count the frequency of occurrence of each distinct value within integer vector 'y' (e.g., result 'nu' = (5, 9) means didn't play 5 times, played 9 times). Restriction: 'y' may not contain negative integer values.

    y

the feature/column vector of integer values whose frequency counts are sought

    k

the number of distinct values in y, i.e., the maximum value of y plus one

    idx_

    the index positions within y (if null, use all index positions)
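A plain-Scala sketch of this counting (an 'Array' stands in for 'VectoI'; names are illustrative):

```scala
// count occurrences of each value in 0 until k within integer vector y
def frequency (y: Seq [Int], k: Int): List [Int] = {
  val nu = Array.fill (k)(0)
  for (v <- y) nu(v) += 1                     // requires 0 <= v < k (no negatives)
  nu.toList
}

val y = Seq (0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 1)   // 14 play/no-play outcomes
println (frequency (y, 2))                    // List(5, 9): didn't play 5, played 9
```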

  14. def isProbability(pxy: MatriD): Boolean

Determine whether the matrix 'pxy' is a legitimate joint "probability matrix". The elements of the matrix must be non-negative and add to one.

    pxy

    the probability matrix

  15. def isProbability(px: VectoD): Boolean

Determine whether the vector 'px' is a legitimate "probability vector". The elements of the vector must be non-negative and add to one.

    px

    the probability vector
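A minimal sketch of the check (the tolerance value here is an assumption; ScalaTion uses its own epsilon constant):

```scala
// legitimate probability vector: non-negative elements summing to one
def isProbability (px: Seq [Double], tol: Double = 1e-9): Boolean =
  px.forall (_ >= 0.0) && math.abs (px.sum - 1.0) < tol

println (isProbability (Seq (0.2, 0.3, 0.5)))   // true
println (isProbability (Seq (0.6, 0.6)))        // false: sums to 1.2
```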

  16. def jointProbXY(px: VectoD, py: VectoD): MatriD

    Given two independent random variables 'X' and 'Y', compute their "joint probability", which is the outer product of their probability vectors 'px' and 'py', i.e., P(X = x_i, Y = y_j).
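For independent X and Y the joint factors as pxy(i, j) = px(i) · py(j), i.e., the outer product of the two vectors. A plain-Scala sketch:

```scala
// outer product of two probability vectors gives the joint for independent X, Y
def jointProbXY (px: Seq [Double], py: Seq [Double]): Seq [Seq [Double]] =
  px.map (p => py.map (p * _))

val pxy = jointProbXY (Seq (0.5, 0.5), Seq (0.25, 0.75))
println (pxy)          // List(List(0.125, 0.375), List(0.125, 0.375))
```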

  17. def logProb(px: VectoD): VectoD

    Given a probability vector 'px', compute the "log-probability".

    Given a probability vector 'px', compute the "log-probability". Requires each probability to be non-zero.

    px

    the probability vector

  18. def margProbX(pxy: MatriD): VectoD

Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'X', i.e., P(X = x_i).

    pxy

    the probability matrix

  19. def margProbY(pxy: MatriD): VectoD

Given a joint probability matrix 'pxy', compute the "marginal probability" for random variable 'Y', i.e., P(Y = y_j).

    pxy

    the probability matrix
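The marginals are just the row sums (for X) and column sums (for Y) of the joint matrix. A plain-Scala sketch covering both 'margProbX' and 'margProbY' ('Seq [Seq [Double]]' standing in for 'MatriD'):

```scala
def margProbX (pxy: Seq [Seq [Double]]): Seq [Double] = pxy.map (_.sum)            // row sums
def margProbY (pxy: Seq [Seq [Double]]): Seq [Double] = pxy.transpose.map (_.sum)  // column sums

val pxy = Seq (Seq (0.125, 0.375), Seq (0.25, 0.25))
println (margProbX (pxy))          // List(0.5, 0.5)
println (margProbY (pxy))          // List(0.375, 0.625)
```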

  20. def muInfo(pxy: MatriD): Double

Given a joint probability matrix 'pxy', compute the mutual information for random variables 'X' and 'Y'.

    pxy

    the probability matrix

  21. def muInfo(pxy: MatriD, px: VectoD, py: VectoD): Double

Given a joint probability matrix 'pxy', compute the mutual information for random variables 'X' and 'Y'.

    pxy

    the probability matrix

    px

    the marginal probability vector for X

    py

    the marginal probability vector for Y
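Mutual information is I(X; Y) = Σ_i Σ_j pxy(i, j) · log2 (pxy(i, j) / (px(i) · py(j))); it is 0 exactly when X and Y are independent. A plain-Scala sketch:

```scala
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

def muInfo (pxy: Seq [Seq [Double]], px: Seq [Double], py: Seq [Double]): Double =
  (for (i <- pxy.indices; j <- pxy(i).indices if pxy(i)(j) > 0.0)
     yield pxy(i)(j) * log2 (pxy(i)(j) / (px(i) * py(j)))).sum

// independent joint (outer product) => zero mutual information
val px  = Seq (0.5, 0.5)
val py  = Seq (0.25, 0.75)
val ind = px.map (p => py.map (p * _))
println (muInfo (ind, px, py))                            // 0.0

// fully dependent joint (X determines Y) => 1 full bit of shared information
val dep = Seq (Seq (0.5, 0.0), Seq (0.0, 0.5))
println (muInfo (dep, Seq (0.5, 0.5), Seq (0.5, 0.5)))    // 1.0
```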

  22. def nentropy(px: VectoD): Double

Given a probability vector 'px', compute the "normalized entropy" of random variable 'X'.

    px

    the probability vector

    See also

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
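Normalized entropy rescales entropy by its maximum, log2 of the dimension, so the result lies in [0, 1]. A plain-Scala sketch:

```scala
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

def entropy (px: Seq [Double]): Double =
  px.filter (_ > 0.0).map (p => -p * log2 (p)).sum

// entropy divided by its maximum possible value for this dimension
def nentropy (px: Seq [Double]): Double = entropy (px) / log2 (px.size)

println (nentropy (Seq (0.25, 0.25, 0.25, 0.25)))   // 1.0: maximum randomness
```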

  23. def rentropy(px: VectoD, qx: VectoD): Double

Given probability vectors 'px' and 'qx', compute the "relative entropy".

    px

    the first probability vector

    qx

    the second probability vector (requires qx.dim >= px.dim)
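Relative entropy is the Kullback-Leibler divergence, Σ_i px(i) · log2 (px(i) / qx(i)); it is non-negative and zero exactly when the two distributions coincide. A plain-Scala sketch:

```scala
def log2 (x: Double): Double = math.log (x) / math.log (2.0)

// Kullback-Leibler divergence of px from qx (base 2)
def rentropy (px: Seq [Double], qx: Seq [Double]): Double =
  (px zip qx).collect { case (p, q) if p > 0.0 => p * log2 (p / q) }.sum

val px = Seq (0.5, 0.5)
println (rentropy (px, px))                   // 0.0: identical distributions
println (rentropy (px, Seq (0.25, 0.75)))     // > 0: distributions differ
```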

  24. def toProbability(nu: MatriI, n: Int): MatriD

Given a frequency matrix, convert it to a probability matrix.

    nu

    the frequency matrix

    n

    the total number of instances/trials collected

  25. def toProbability(nu: MatriI): MatriD

Given a frequency matrix, convert it to a probability matrix.

    nu

    the frequency matrix

  26. def toProbability(nu: VectoI, n: Int): VectoD

Given a frequency vector, convert it to a probability vector.

    nu

    the frequency vector

    n

    the total number of instances/trials collected

  27. def toProbability(nu: VectoI): VectoD

Given a frequency vector, convert it to a probability vector.

    nu

    the frequency vector
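The conversion just divides each frequency by the total count 'n'; when 'n' is not given, the vector's own sum is used. A plain-Scala sketch of the two overloads (names illustrative, 'Seq' standing in for 'VectoI'/'VectoD'):

```scala
// convert a frequency vector to a probability vector
def toProbability (nu: Seq [Int], n: Int): Seq [Double] = nu.map (_ / n.toDouble)
def toProbability (nu: Seq [Int]): Seq [Double] = toProbability (nu, nu.sum)

println (toProbability (Seq (5, 9), 14))      // frequencies become 5/14 and 9/14
println (toProbability (Seq (2, 2)))          // List(0.5, 0.5)
```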