LayerNorm

scalation.modeling.forecasting.neuralforecasting.LayerNorm
case class LayerNorm(atransform: Boolean, eps: Double)

The LayerNorm class computes its output by normalizing the input: subtracting the mean and dividing by the standard deviation, optionally followed by an affine transformation.

Value parameters

atransform

whether to apply an affine transformation (scale and shift) after the standard normalization

eps

the small positive value added to prevent division by zero when the standard deviation is near zero

Attributes

See also

pytorch.org/docs/stable/generated/torch.nn.LayerNorm.html#torch.nn.LayerNorm
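The normalization described above can be sketched in plain Scala for a single row (a hypothetical standalone version; the actual class operates on ScalaTion's MatrixD, and where exactly eps is added is an assumption here):

```scala
// Hypothetical sketch of layer normalization over one row/vector;
// default w = 1, b = 0 gives an identity affine transformation.
def layerNorm (x: Array [Double], w: Double = 1.0, b: Double = 0.0,
               atransform: Boolean = true, eps: Double = 1e-5): Array [Double] =
    val n    = x.length
    val mean = x.sum / n                                         // row mean
    val vari = x.map (v => (v - mean) * (v - mean)).sum / n      // row variance
    val sd   = math.sqrt (vari + eps)                            // eps avoids division by zero
    val z    = x.map (v => (v - mean) / sd)                      // standard normalization
    if atransform then z.map (v => w * v + b) else z             // optional affine transform
```

For example, `layerNorm (Array (1.0, 2.0, 3.0))` yields a row with mean approximately 0 and standard deviation approximately 1.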

Supertypes
trait Serializable
trait Product
trait Equals
class Object
trait Matchable
class Any

Members list

Value members

Concrete methods

def apply(x: MatrixD): MatrixD

Forward pass: calculate the output of this layer.

Value parameters

x

the m by nx input matrix (full or batch)

Attributes

def reset(w_: Double, b_: Double): Unit

Reset the weight and bias.

Value parameters

b_

the new bias

w_

the new weight

Attributes
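The two concrete methods can be exercised together with a minimal stand-in class mimicking the documented API (hypothetical; the real class lives in scalation.modeling.forecasting.neuralforecasting and uses MatrixD rather than nested arrays):

```scala
// Minimal stand-in for the documented LayerNorm API:
// apply normalizes each of the m rows of an m-by-nx matrix independently,
// and reset replaces the scalar weight and bias of the affine transform.
case class LayerNormSketch (atransform: Boolean, eps: Double):
    private var w = 1.0                                   // weight (scale)
    private var b = 0.0                                   // bias (shift)

    def reset (w_ : Double, b_ : Double): Unit = { w = w_; b = b_ }

    def apply (x: Array [Array [Double]]): Array [Array [Double]] =
        x.map { row =>
            val mean = row.sum / row.length
            val sd   = math.sqrt (row.map (v => (v - mean) * (v - mean)).sum
                                     / row.length + eps)
            val z    = row.map (v => (v - mean) / sd)     // standard normalization
            if atransform then z.map (v => w * v + b) else z
        }
```

After `reset (2.0, 1.0)`, each normalized row is scaled by 2 and shifted by 1, so a row of length n sums to approximately n.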

Inherited methods

def productElementNames: Iterator[String]

Attributes

Inherited from:
Product
def productIterator: Iterator[Any]

Attributes

Inherited from:
Product