ActivationFun

scalation.modeling.ActivationFun
object ActivationFun

The ActivationFun object contains common activation functions and provides both scalar and vector versions.

Attributes

Supertypes
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

def eLU(t: Double): Double

Compute the value of the Exponential Linear Unit eLU function at scalar t.

Value parameters

t

the eLU function argument

Attributes

def eLUD(yp: VectorD): VectorD

Compute the derivative vector for the eLU function at vector yp, where yp is pre-computed by yp = eLU_ (t).

Value parameters

yp

the derivative function vector argument

Attributes
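A minimal sketch of eLU and its derivative, using Array [Double] as a stand-in for VectorD and assuming the alpha parameter a2 = 1.0 (in the library it is set via setA2; the actual default may differ):

```scala
import scala.math.exp

val a2 = 1.0                                          // eLU alpha parameter (assumed default)

def eLU (t: Double): Double =
    if t > 0.0 then t else a2 * (exp (t) - 1.0)

// Recover the derivative from the pre-computed output yp = eLU (t):
// for t > 0 the slope is 1; for t <= 0 it is a2 * exp (t) = yp + a2
def eLUD (yp: Array [Double]): Array [Double] =
    yp.map (y => if y > 0.0 then 1.0 else y + a2)
```

Passing yp rather than t avoids recomputing the exponential during back-propagation, since the slope on the negative side is just yp + a2.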

def gaussian(t: Double): Double

Compute the value of the Gaussian function at scalar t.

Value parameters

t

the Gaussian function argument

Attributes

Compute the derivative vector for the Gaussian function at vector yp, where yp is pre-computed by yp = gaussian_ (t).

Value parameters

t

the domain value for the function

yp

the derivative function vector argument

Attributes
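A minimal sketch of the Gaussian activation and its derivative (Array [Double] standing in for VectorD), assuming the form e^(-t^2); the library's exact scaling may differ:

```scala
import scala.math.exp

def gaussian (t: Double): Double = exp (-t * t)

// The derivative -2 t e^(-t^2) needs both the domain value t and the
// pre-computed output yp = gaussian (t), which is why both are parameters
def gaussianD (yp: Array [Double], t: Array [Double]): Array [Double] =
    Array.tabulate (yp.length) (i => -2.0 * t(i) * yp(i))
```

Unlike sigmoid or tanh, the Gaussian's derivative cannot be recovered from yp alone (yp loses the sign of t), so the domain value t must also be supplied.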

def geLU(t: Double): Double

Approximately compute the value of the geLU function at t.

Value parameters

t

the geLU function argument

Attributes

def geLUd(t: Double): Double

Compute the derivative of the geLU function at scalar t.

Value parameters

t

the domain value for the function

Attributes
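A sketch of the widely used tanh approximation to geLU, 0.5 t (1 + tanh (sqrt (2/pi) (t + 0.044715 t^3))), and its derivative; the library's approximation may use slightly different constants:

```scala
import scala.math.{tanh, sqrt, Pi}

val c = sqrt (2.0 / Pi)                               // constant in the tanh approximation

def geLU (t: Double): Double =
    0.5 * t * (1.0 + tanh (c * (t + 0.044715 * t * t * t)))

def geLUd (t: Double): Double =
    val u  = c * (t + 0.044715 * t * t * t)
    val th = tanh (u)
    val du = c * (1.0 + 3.0 * 0.044715 * t * t)       // du/dt
    0.5 * (1.0 + th) + 0.5 * t * (1.0 - th * th) * du // product + chain rule
```

For large positive t the approximation approaches t itself, and at t = 0 the slope is 0.5.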

def id(t: Double): Double

Compute the value of the Identity id function at scalar t.

Value parameters

t

the id function argument

Attributes

def idD(yp: VectorD): VectorD

Compute the derivative vector for the id function at vector yp, where yp is pre-computed by yp = id (t).

Value parameters

yp

the derivative function vector argument

Attributes

def id_(t: VectorD): VectorD
def logistic(t: Double, a: Double, b: Double, c: Double): Double

Compute the value of the Logistic function at scalar t. With the default settings, it is identical to sigmoid. Note that it is not typically used as an activation function.

Value parameters

a

the shift parameter (1 => mid at 0, < 1 => mid shifts left, > 1 => mid shifts right)

b

the spread parameter (1 => sigmoid rate, < 1 => slower, > 1 => faster); although typically positive, a negative b will cause the function to decrease

c

the scale parameter (range is 0 to c)

t

the logistic function argument

Attributes

def logistic_(t: VectorD, a: Double, b: Double, c: Double): VectorD
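A sketch of one plausible parameterization consistent with the parameter descriptions above, c / (1 + a e^(-b t)); the library's exact formula may differ:

```scala
import scala.math.exp

// logistic function: with a = b = c = 1 this reduces exactly to sigmoid
def logistic (t: Double, a: Double = 1.0, b: Double = 1.0, c: Double = 1.0): Double =
    c / (1.0 + a * exp (-b * t))
```

With the defaults the midpoint value 0.5 occurs at t = 0; the scale parameter c stretches the range to (0, c).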
def logit(p: Double): Double

Compute the log of the odds (Logit) of an event occurring (e.g., success, 1). The inverse of the logit function is the standard logistic function (sigmoid function). Note that it is not typically used as an activation function.

Value parameters

p

the probability, a number between 0 and 1.

Attributes
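A sketch of logit (log-odds) together with a check that sigmoid inverts it:

```scala
import scala.math.{log, exp}

def logit (p: Double): Double = log (p / (1.0 - p))   // log-odds of probability p

def sigmoid (t: Double): Double = 1.0 / (1.0 + exp (-t))
```

For any p in (0, 1), sigmoid (logit (p)) recovers p (up to rounding), which is the inverse relationship noted above.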

def lreLU(t: Double): Double

Compute the value of the Leaky Rectified Linear Unit lreLU function at scalar t.

Value parameters

t

the lreLU function argument

Attributes

def lreLUD(yp: VectorD): VectorD

Compute the derivative vector for the lreLU function at vector yp, where yp is pre-computed by yp = lreLU_ (t).

Value parameters

yp

the derivative function vector argument

Attributes
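A minimal sketch of lreLU and its derivative (Array [Double] standing in for VectorD), assuming a leakage parameter a = 0.01, a common choice; in the library it is set via setA:

```scala
val a = 0.01                                          // leakage parameter (assumed value)

def lreLU (t: Double): Double = if t >= 0.0 then t else a * t

// For a > 0, yp = lreLU (t) is negative exactly when t is, so the slope can be
// recovered from yp alone: 1 on the non-negative side, a on the negative side
def lreLUD (yp: Array [Double]): Array [Double] =
    yp.map (y => if y >= 0.0 then 1.0 else a)
```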

def reLU(t: Double): Double

Compute the value of the Rectified Linear Unit reLU function at scalar t.

Value parameters

t

the reLU function argument

Attributes

def reLUD(yp: VectorD): VectorD

Compute the derivative vector for the reLU function at vector yp, where yp is pre-computed by yp = reLU_ (t).

Value parameters

yp

the derivative function vector argument

Attributes
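A minimal sketch of reLU and its derivative (Array [Double] standing in for VectorD):

```scala
def reLU (t: Double): Double = math.max (0.0, t)

// derivative from the pre-computed output yp = reLU (t): 1 where positive, 0 elsewhere
def reLUD (yp: Array [Double]): Array [Double] =
    yp.map (y => if y > 0.0 then 1.0 else 0.0)
```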

def rescaleX(x: MatrixD, f: AFF): MatrixD

Rescale the input/data matrix x to the arange (active range) of the "first" activation function f; otherwise normalize. Return the rescaled matrix.

Value parameters

f

the activation function family (first)

x

the input/data matrix

Attributes

Rescale the output/response vector y to the bounds of the "last" activation function f; otherwise normalize. Return the rescaled vector and the rescaling inverse function.

Value parameters

f

the activation function family (last)

y

the output/response vector

Attributes

Rescale the output/response matrix y to the bounds of the "last" activation function f; otherwise normalize. Return the rescaled matrix and the rescaling inverse function.

Value parameters

f

the activation function family (last layer)

y

the output/response matrix

Attributes
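The core idea behind these rescaling helpers can be sketched as a min-max rescale of one column into a target interval (lo, hi), such as the active range (0, 1) of sigmoid. The function name and signature here are illustrative, not the library's API:

```scala
// min-max rescale the values in x into the interval (lo, hi),
// e.g. the bounds of a saturating activation function
def rescale (x: Array [Double], lo: Double, hi: Double): Array [Double] =
    val mn = x.min
    val mx = x.max
    x.map (v => lo + (hi - lo) * (v - mn) / (mx - mn))
```

For unbounded activation functions such as id or reLU there is no finite range to target, which is why the library falls back to normalization in that case.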

def setA(a_: Double): Unit

Set the lreLU a (alpha) parameter for the Leaky Rectified Linear Unit functions.

Value parameters

a_

the lreLU alpha parameter in (0, 1] indicating how leaky the function is

Attributes

def setA2(a_: Double): Unit

Set the eLU a2 (alpha) parameter for the Exponential Linear Unit functions.

Value parameters

a_

the eLU alpha parameter (0, infinity) indicating how leaky the function is

Attributes

def sigmoid(t: Double): Double

Compute the value of the Sigmoid function at t. This is a special case of the logistic function with its default parameter settings. It is also referred to as the standard logistic function, and it is the inverse of the logit function.

Value parameters

t

the sigmoid function argument

Attributes

Compute the derivative vector for the sigmoid function at vector yp, where yp is pre-computed by yp = sigmoid_ (t).

Value parameters

yp

the derivative function vector argument

Attributes
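A minimal sketch of sigmoid and its derivative (Array [Double] standing in for VectorD):

```scala
import scala.math.exp

def sigmoid (t: Double): Double = 1.0 / (1.0 + exp (-t))

// The derivative needs only the pre-computed output yp = sigmoid (t),
// since sigmoid' (t) = sigmoid (t) * (1 - sigmoid (t)) = yp * (1 - yp)
def sigmoidD (yp: Array [Double]): Array [Double] =
    yp.map (y => y * (1.0 - y))
```

This identity is why back-propagation implementations cache the activation output rather than the pre-activation input.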

Compute the derivative vector for the Softmax function at vector yp, where yp is pre-computed by yp = softmax_ (t).

Value parameters

yp

the derivative function vector argument

Attributes

Compute the derivative matrix (Jacobian) for the Softmax function at vector yp, where yp is pre-computed by yp = softmax_ (t).

Value parameters

yp

the derivative function vector argument

Attributes


Compute the vector of values of the Softmax function applied to vector t.

Value parameters

t

the softmax function vector argument

Attributes

See also

https://en.wikipedia.org/wiki/Softmax_function

Note: a scalar version of the softmax function is not needed.
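A minimal sketch of softmax with the usual max-subtraction for numerical stability, plus its Jacobian (arrays standing in for VectorD/MatrixD; the Jacobian method name here is illustrative):

```scala
import scala.math.exp

def softmax_ (t: Array [Double]): Array [Double] =
    val mx = t.max                                    // subtract max to avoid overflow in exp
    val e  = t.map (v => exp (v - mx))
    val s  = e.sum
    e.map (_ / s)

// Jacobian at yp = softmax_ (t): J(i)(j) = yp(i) * (delta_ij - yp(j))
def softmaxJacobian (yp: Array [Double]): Array [Array [Double]] =
    Array.tabulate (yp.length, yp.length) ((i, j) =>
        yp(i) * ((if i == j then 1.0 else 0.0) - yp(j)))
```

Since the outputs sum to 1, each row of the Jacobian sums to zero, a useful sanity check.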

def tanhD(yp: VectorD): VectorD

Compute the derivative vector for the tanh function at vector yp, where yp is pre-computed by yp = tanh_ (t).

Value parameters

yp

the derivative function vector argument

Attributes

def tanh_(t: VectorD): VectorD

Compute the vector of values of the tanh function applied to vector t.

Value parameters

t

the tanh function vector argument

Attributes
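A minimal sketch of tanh_ and tanhD (Array [Double] standing in for VectorD):

```scala
import scala.math.tanh

def tanh_ (t: Array [Double]): Array [Double] = t.map (tanh)

// derivative from the pre-computed output yp = tanh_ (t), using
// tanh' (t) = 1 - tanh (t)^2 = 1 - yp^2
def tanhD (yp: Array [Double]): Array [Array [Double]] = null  // placeholder, see below
```

Corrected, the derivative simply maps each element:

```scala
def tanhD (yp: Array [Double]): Array [Double] =
    yp.map (y => 1.0 - y * y)
```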