Packages


scalation.analytics

RecurrentNeuralNetLayer

class RecurrentNeuralNetLayer extends AnyRef

The RecurrentNeuralNetLayer class represents a single layer of a recurrent neural network, where 'x' denotes the input, 'y' denotes the output and 's' denotes the intermediate/hidden state. We have 's(t) = activate(U dot x(t) + W dot s(t-1))' and 'y(t) = softmax(V dot s(t))'.
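The two equations above can be sketched as one recurrent step. This is an illustrative sketch using plain Scala arrays rather than the ScalaTion VectoD/MatriD API; tanh is assumed as the activation function.

```scala
// Minimal sketch of one recurrent step (NOT the ScalaTion API):
//   s(t) = tanh(U x(t) + W s(t-1)),  y(t) = softmax(V s(t))
object RNNStepSketch {
  def matVec(m: Array[Array[Double]], v: Array[Double]): Array[Double] =
    m.map(row => row.zip(v).map { case (a, b) => a * b }.sum)

  def addVec(a: Array[Double], b: Array[Double]): Array[Double] =
    a.zip(b).map { case (p, q) => p + q }

  def softmax(v: Array[Double]): Array[Double] = {
    val e = v.map(xi => math.exp(xi - v.max))   // shift by max for stability
    val z = e.sum
    e.map(_ / z)
  }

  /** One forward step: returns the new hidden state s(t) and output y(t). */
  def step(x: Array[Double], prevS: Array[Double],
           u: Array[Array[Double]], w: Array[Array[Double]],
           v: Array[Array[Double]]): (Array[Double], Array[Double]) = {
    val s = addVec(matVec(u, x), matVec(w, prevS)).map(math.tanh)
    val y = softmax(matVec(v, s))
    (s, y)
  }
}
```

Because the hidden state s(t-1) feeds back into s(t), the same parameter matrices U, W and V are shared across all time steps.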

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new RecurrentNeuralNetLayer()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. var add: VectoD
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def backward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD, diff_s: VectoD, dmulv: VectoD): (VectoD, MatriD, MatriD, MatriD)

    Calculate the derivatives with respect to prev_s, U, W and V via the backward pass of each unit.

    x

    the input data

    prev_s

    record the previous hidden layer value

    u

    parameter for input x

    w

    parameter for the hidden state s

    v

    parameter for output

    diff_s

    diff_s = ds(t+1)/ds(t)

    dmulv

    dl/dmulv, where l is the loss and mulv = V dot s
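The gradients this method returns can be sketched as follows. This is a hedged, self-contained illustration in plain Scala (not the ScalaTion API), assuming a tanh activation so that ds/d(add) = 1 - s²; the real method works on VectoD/MatriD values.

```scala
// Sketch of the backward pass of one unit: given dL/d(mulv) and the
// incoming ds(t+1)/ds(t) term, compute gradients for prev_s, U, W and V.
object RNNBackwardSketch {
  def outer(a: Array[Double], b: Array[Double]): Array[Array[Double]] =
    a.map(ai => b.map(bi => ai * bi))

  // multiply the transpose of m by v
  def matTVec(m: Array[Array[Double]], v: Array[Double]): Array[Double] =
    m.head.indices.toArray.map(j => m.indices.map(i => m(i)(j) * v(i)).sum)

  def backward(x: Array[Double], prevS: Array[Double], s: Array[Double],
               u: Array[Array[Double]],   // kept for signature parity; unused here
               w: Array[Array[Double]], v: Array[Array[Double]],
               diffS: Array[Double], dmulv: Array[Double])
      : (Array[Double], Array[Array[Double]], Array[Array[Double]], Array[Array[Double]]) = {
    val dV = outer(dmulv, s)                            // dL/dV = dmulv outer s
    val ds = matTVec(v, dmulv).zip(diffS).map { case (a, b) => a + b }
    val dAdd = ds.zip(s).map { case (d, si) => d * (1.0 - si * si) }  // tanh'
    val dU = outer(dAdd, x)                             // dL/dU = dAdd outer x
    val dW = outer(dAdd, prevS)                         // dL/dW = dAdd outer prev_s
    val dPrevS = matTVec(w, dAdd)                       // passed to the previous step
    (dPrevS, dU, dW, dV)
  }
}
```

The returned dPrevS is what becomes the diff_s argument of the previous time step when backpropagating through time.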

  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  10. def forward(x: VectoD, prev_s: VectoD, u: MatriD, w: MatriD, v: MatriD): Unit

    Forward the input x through the RecurrentNeuralNet layer. We have 's(t) = activate(U dot x(t) + W dot s(t-1))' and 'y(t) = softmax(V dot s(t))'.

    x

    the input data

    prev_s

    record the previous hidden layer value

    u

    parameter for input x

    w

    parameter for the hidden state s

    v

    parameter for output
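In use, the forward step is applied once per time step, threading each hidden state into the next. A minimal scalar sketch of that unrolling in plain Scala (the helper names and scalar parameters are illustrative, not the ScalaTion API):

```scala
// Unrolling the recurrence over a sequence: s(t) = tanh(u*x(t) + w*s(t-1)),
// with s(0) initialised to zero.
object RNNUnrollSketch {
  def step(x: Double, prevS: Double, u: Double, w: Double): Double =
    math.tanh(u * x + w * prevS)

  /** Hidden state at every time step for the input sequence xs. */
  def unroll(xs: Seq[Double], u: Double, w: Double): Seq[Double] =
    xs.scanLeft(0.0)((s, x) => step(x, s, u, w)).tail
}
```

`scanLeft` makes the state-threading explicit: each output depends on the whole prefix of the input sequence, which is what distinguishes a recurrent layer from a feed-forward one.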

  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. var mulu: VectoD
  15. var mulv: VectoD
  16. var mulw: VectoD
  17. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  18. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  19. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  20. var s: VectoD
  21. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  22. def toString(): String
    Definition Classes
    AnyRef → Any
  23. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  25. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
