scalation.analytics

RecurrentNeuralNetLayer

class RecurrentNeuralNetLayer extends AnyRef

The RecurrentNeuralNetLayer class represents one unit of a 3-layer recurrent neural network, where 'x' denotes the input, 'y' denotes the output and 's' is the intermediate/hidden state. We have 's(t) = activate (U dot x(t) + W dot s(t-1))' and 'y(t) = softmax (V dot s(t))'.
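
For concreteness, the following is a minimal, self-contained sketch of this forward computation in plain Scala, using Array [Double] in place of ScalaTion's VectoD and MatriD and assuming tanh as the activation function (the class only says 'activate'); it illustrates the equations above rather than the actual implementation.

  object RNNForwardSketch {
    type Vec = Array [Double]
    type Mat = Array [Array [Double]]

    def matVec (m: Mat, x: Vec): Vec = m.map (row => row.zip (x).map { case (a, b) => a * b }.sum)
    def addVec (a: Vec, b: Vec): Vec = a.zip (b).map { case (p, q) => p + q }

    def softmax (z: Vec): Vec = {
      val zmax = z.max                              // subtract max for numerical stability
      val e    = z.map (v => math.exp (v - zmax))
      val tot  = e.sum
      e.map (_ / tot)
    }

    // s(t) = activate (U dot x(t) + W dot s(t-1));  y(t) = softmax (V dot s(t))
    def forward (x: Vec, prevS: Vec, u: Mat, w: Mat, v: Mat): (Vec, Vec) = {
      val mulu = matVec (u, x)                      // U dot x(t)    (cf. the layer's 'mulu')
      val mulw = matVec (w, prevS)                  // W dot s(t-1)  (cf. the layer's 'mulw')
      val s    = addVec (mulu, mulw).map (math.tanh)  // hidden state s(t); tanh is an assumption
      val y    = softmax (matVec (v, s))            // output y(t)
      (s, y)
    }
  }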

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new RecurrentNeuralNetLayer()

Value Members

  1. var add: VectoD
  2. def backward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD, diff_s: VectoD, dmulv: VectoD): (VectoD, MatriD, MatriD, MatriD)

    Calculate the derivatives with respect to prev_s, U, W and V by backpropagating through the unit (see the sketch after this member list).

    x        the input data
    prev_s   the previous hidden layer value
    u        the parameter matrix for the input x
    w        the parameter matrix for the hidden state s
    v        the parameter matrix for the output
    diff_s   ds(t+1)/ds(t), the gradient flowing back from the next time step
    dmulv    dl/dmulv, where l is the loss and mulv = V dot s

  3. def forward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD): Unit

    Forward propagate the input x through the RecurrentNeuralNet layer, computing s(t) = activate (U dot x(t) + W dot s(t-1)) and y(t) = softmax (V dot s(t)) (see the sketch after the class description).

    x        the input data
    prev_s   the previous hidden layer value
    u        the parameter matrix for the input x
    w        the parameter matrix for the hidden state s
    v        the parameter matrix for the output

  4. var mulu: VectoD
  5. var mulv: VectoD
  6. var mulw: VectoD
  7. var s: VectoD
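
As a companion to the backward documentation above, here is a matching plain-Scala sketch of the backward step for a single unit. It reuses the helpers from the forward sketch, again assumes tanh activation (so the local derivative at s is 1 - s^2), and takes the stored hidden state s explicitly; treat it as an illustration of the gradient flow under those assumptions, not the ScalaTion code itself.

  object RNNBackwardSketch {
    import RNNForwardSketch.{Vec, Mat, matVec, addVec}

    def outer (a: Vec, b: Vec): Mat   = a.map (ai => b.map (bi => ai * bi))
    def matTVec (m: Mat, x: Vec): Vec = matVec (m.transpose, x)

    // return (dprev_s, dU, dW, dV) given dmulv = dl/d(V dot s) and diff_s = ds(t+1)/ds(t);
    // u mirrors the documented signature but is not needed for these gradients
    def backward (x: Vec, prevS: Vec, s: Vec, u: Mat, w: Mat, v: Mat,
                  diffS: Vec, dmulv: Vec): (Vec, Mat, Mat, Mat) = {
      val dV    = outer (dmulv, s)                  // dl/dV = dmulv outer s
      val dsOut = matTVec (v, dmulv)                // gradient reaching s(t) via the output path
      val ds    = addVec (dsOut, diffS)             // total gradient at s(t)
      val dadd  = s.zip (ds).map { case (si, g) => (1.0 - si * si) * g }  // through tanh
      val dU    = outer (dadd, x)                   // dl/dU
      val dW    = outer (dadd, prevS)               // dl/dW
      val dprev = matTVec (w, dadd)                 // dl/ds(t-1), i.e., w.r.t. prev_s
      (dprev, dU, dW, dV)
    }
  }

Calling forward and then backward at each time step, accumulating dU, dW and dV across the sequence, yields (truncated) backpropagation through time.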