class RecurrentNeuralNetLayer extends AnyRef
The RecurrentNeuralNetLayer is a 3-layer network where 'x' denotes the input,
'y' denotes the output and 's' is the intermediate/hidden value.
We have 's(t) = activate (U dot x(t) + W dot s(t-1))' and
'y(t) = softmax (V dot s(t))'.
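For concreteness, below is a minimal self-contained Scala sketch of these two equations, assuming tanh as the activation (the activation ScalaTion actually uses is not specified here). The helper names matVec, softmax and step are hypothetical, and plain arrays stand in for VectoD/MatriD.

```scala
object RNNForwardSketch {
  // y = M x for a dense matrix stored row-by-row
  def matVec(m: Array[Array[Double]], x: Array[Double]): Array[Double] =
    m.map(row => row.zip(x).map { case (a, b) => a * b }.sum)

  // numerically stable softmax
  def softmax(v: Array[Double]): Array[Double] = {
    val e = v.map(x => math.exp(x - v.max))
    e.map(_ / e.sum)
  }

  // one time step: s(t) = tanh(U x(t) + W s(t-1)); y(t) = softmax(V s(t))
  def step(x: Array[Double], prevS: Array[Double],
           u: Array[Array[Double]], w: Array[Array[Double]],
           v: Array[Array[Double]]): (Array[Double], Array[Double]) = {
    val s = matVec(u, x).zip(matVec(w, prevS))
                        .map { case (a, b) => math.tanh(a + b) }
    (s, softmax(matVec(v, s)))
  }
}
```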
Instance Constructors
- new RecurrentNeuralNetLayer()
Value Members
- var add: VectoD
presumably the pre-activation sum 'U dot x(t) + W dot s(t-1)' (inferred from the forward equation)
- def backward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD, diff_s: VectoD, dmulv: VectoD): (VectoD, MatriD, MatriD, MatriD)
Calculate the derivatives with respect to prev_s, U, W and V via the backward pass of each unit (a worked sketch follows the parameter list).
- x
the input data
- prev_s
the previous hidden layer value
- u
parameter for input x
- w
parameter for the hidden layer s
- v
parameter for output
- diff_s
diff_s = ds(t+1)/ds(t), the gradient carried back from the next time step
- dmulv
dl/dmulv where l is the loss, mulv = V dot s
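The return tuple is plausibly (d_prev_s, dU, dW, dV). Below is a minimal sketch of these per-unit gradients, again assuming a tanh activation; all names are hypothetical, plain arrays stand in for VectoD/MatriD, and the actual ScalaTion implementation may differ in activation and return order.

```scala
object RNNBackwardSketch {
  // outer product: (a outer b)(i)(j) = a(i) * b(j)
  def outer(a: Array[Double], b: Array[Double]): Array[Array[Double]] =
    a.map(ai => b.map(ai * _))

  // y = M^T x
  def matTVec(m: Array[Array[Double]], x: Array[Double]): Array[Double] =
    m.transpose.map(row => row.zip(x).map { case (a, b) => a * b }.sum)

  def backward(x: Array[Double], prevS: Array[Double], s: Array[Double],
               u: Array[Array[Double]], w: Array[Array[Double]],
               v: Array[Array[Double]], diffS: Array[Double],
               dmulv: Array[Double])
      : (Array[Double], Array[Array[Double]],
         Array[Array[Double]], Array[Array[Double]]) = {
    val dV     = outer(dmulv, s)                      // dl/dV = dmulv outer s
    val ds     = matTVec(v, dmulv)                    // V^T dmulv ...
                   .zip(diffS).map { case (a, b) => a + b }  // ... + diff_s
    val dAdd   = ds.zip(s).map { case (d, si) => d * (1.0 - si * si) } // tanh'
    val dU     = outer(dAdd, x)                       // dl/dU
    val dW     = outer(dAdd, prevS)                   // dl/dW
    val dPrevS = matTVec(w, dAdd)                     // gradient into s(t-1)
    (dPrevS, dU, dW, dV)
  }
}
```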
- def forward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD): Unit
Forward the input x through the RecurrentNeuralNet layer. We have 's(t) = activate (U dot x(t) + W dot s(t-1))' and 'y(t) = softmax (V dot s(t))' (a usage sketch follows the parameter list).
- x
the input data
- prev_s
the previous hidden layer value
- u
parameter for input x
- w
parameter for hidden layer z
- v
parameter for output
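A hypothetical unrolling over a short sequence, reusing the step helper from the forward sketch above and threading the hidden state through time; u, w, v are randomly initialized stand-in weights, not values from this class.

```scala
import scala.util.Random

object RNNUnrollSketch extends App {
  val rnd = new Random(0)
  val (nIn, nHid, nOut) = (3, 4, 2)
  def randMat(r: Int, c: Int): Array[Array[Double]] =
    Array.fill(r, c)(rnd.nextGaussian() * 0.1)
  val (u, w, v) = (randMat(nHid, nIn), randMat(nHid, nHid), randMat(nOut, nHid))

  val sequence = Seq(Array(1.0, 0.0, 0.0), Array(0.0, 1.0, 0.0))
  var prevS = Array.fill(nHid)(0.0)                 // s(0) = zero vector
  for (x <- sequence) {
    val (s, y) = RNNForwardSketch.step(x, prevS, u, w, v)
    println(y.mkString("y(t) = [", ", ", "]"))      // per-step output distribution
    prevS = s                                       // s(t) becomes s(t-1) next step
  }
}
```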
- var mulu: VectoD
presumably mulu = U dot x(t) (inferred from the forward equation)
- var mulv: VectoD
mulv = V dot s, per the backward parameter documentation above
- var mulw: VectoD
presumably mulw = W dot s(t-1) (inferred from the forward equation)
- var s: VectoD
the intermediate/hidden state value