scalation.modeling.forecasting.neuralforecasting
Members list
Type members
Classlikes
The Attention
trait provides methods for computing context vectors, single-head attention matrices and multi-head attention matrices.
Value parameters
- heads
-
the number of attention heads
- n_mod
-
the size of the output (dimensionality of the model, d_model)
- n_v
-
the size of the value vectors
- n_var
-
the size of the input vector x_t (number of variables)
Attributes
- Companion
- object
- Supertypes
-
class Object, trait Matchable, class Any
- Known subtypes
-
class TrEncoderLayer
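To make the computation concrete, here is a minimal standalone sketch of single-head scaled dot-product attention on plain Scala arrays; matMul, softmaxRow and the weight matrices wQ, wK, wV are illustrative names for this sketch, not members of the trait.

// Standalone sketch: single-head scaled dot-product attention, softmax (Q Kᵀ / √d_k) V.
type Mat = Array[Array[Double]]

def matMul (a: Mat, b: Mat): Mat =
    Array.tabulate (a.length, b(0).length) ((i, j) =>
        (0 until b.length).map (k => a(i)(k) * b(k)(j)).sum)

def transpose (a: Mat): Mat = Array.tabulate (a(0).length, a.length) ((i, j) => a(j)(i))

def softmaxRow (row: Array[Double]): Array[Double] =
    val mx = row.max
    val ex = row.map (v => math.exp (v - mx))               // subtract max for numerical stability
    val s  = ex.sum
    ex.map (_ / s)

def attention (x: Mat, wQ: Mat, wK: Mat, wV: Mat): Mat =
    val q  = matMul (x, wQ)                                 // query matrix
    val k  = matMul (x, wK)                                 // key matrix
    val v  = matMul (x, wV)                                 // value matrix
    val dk = wK(0).length.toDouble                          // key dimensionality
    val scores = matMul (q, transpose (k)).map (_.map (_ / math.sqrt (dk)))
    matMul (scores.map (softmaxRow), v)                     // context vectors, one row per time point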
The Attention
object contains a sample input matrix from the self-attention tutorial referenced below.
Attributes
- See also
-
https://sebastianraschka.com/blog/2023/self-attention-from-scratch.html The example uses 6 words with a 16-dimensional encoding.
- Companion
- trait
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Attention.type
The DenseLayer
class applies an (optionally activated) linear transformation to the input matrix X. Yp = f(X W + b) When f is null, it acts as a Linear Layer.
Value parameters
- f
-
the activation function family for layers 1->2 (input to output)
- n_x
-
the second dimension of the input matrix (m by n_x)
- n_y
-
the second dimension of the output matrix (m by n_y)
Attributes
- See also
-
pytorch.org/docs/stable/generated/torch.nn.Linear.html#torch.nn.Linear
- Supertypes
-
trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
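A minimal standalone sketch of the forward computation Yp = f(X W + b) on plain Scala arrays; denseForward is an illustrative name for this sketch, not the class's API. With the identity activation it reduces to a plain linear layer.

// Standalone sketch of a dense (fully connected) layer forward pass: Yp = f (X W + b).
def denseForward (x: Array[Array[Double]], w: Array[Array[Double]], b: Array[Double],
                  f: Double => Double = identity): Array[Array[Double]] =
    x.map { row =>
        Array.tabulate (w(0).length) { j =>
            val z = (0 until w.length).map (k => row(k) * w(k)(j)).sum + b(j)
            f (z)                                           // identity activation => linear layer
        }
    }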
The DropoutLayer
class will, in computing the output, set each element to zero with probability p; otherwise, multiply it by a scale factor.
Value parameters
- p
-
the probability of setting an element to zero
Attributes
- See also
-
pytorch.org/docs/stable/generated/torch.nn.Dropout.html#torch.nn.Dropout
- Supertypes
-
trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
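A minimal standalone sketch of inverted dropout consistent with the description above: each element is zeroed with probability p, otherwise scaled by 1/(1-p); the function name and the use of scala.util.Random are assumptions for illustration (training mode only).

// Standalone sketch of (inverted) dropout: zero each element with probability p,
// otherwise scale by 1 / (1 - p) so the expected value of the output is unchanged.
import scala.util.Random

def dropout (x: Array[Array[Double]], p: Double, rng: Random = new Random ()): Array[Array[Double]] =
    val scale = 1.0 / (1.0 - p)
    x.map (_.map (v => if rng.nextDouble () < p then 0.0 else v * scale))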
The GRU
class implements Gated Recurrent Unit (GRU) via Back Propagation Through Time (BPTT). At each time point x_t, there is a vector representing several variables or the encoding of a word. Intended to work for guessing the next word in a sentence or for multi-horizon forecasting. Time series: (x_t: t = 0, 1, ..., n_seq-1) where n_seq is the number of time points/words.
Value parameters
- fname
-
the feature/variable names
- n_mem
-
the size for hidden state (h) (dimensionality of memory)
- x
-
the input sequence/time series
- y
-
the output sequence/time series
Attributes
- Companion
- object
- Supertypes
-
class Object, trait Matchable, class Any
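As a rough guide to what one forward step of such a model computes, here is a standalone sketch of the standard GRU cell recurrence (update gate, reset gate, candidate state); all weight, bias and function names are illustrative, not the class's members.

// Standalone sketch of one GRU time step in the standard formulation.
// u* multiply the input x_t, w* multiply the previous hidden state h; b* are biases.
def sigmoid (v: Double): Double = 1.0 / (1.0 + math.exp (-v))

def gruStep (x: Array[Double], h: Array[Double],
             uz: Array[Array[Double]], wz: Array[Array[Double]], bz: Array[Double],
             ur: Array[Array[Double]], wr: Array[Array[Double]], br: Array[Double],
             uh: Array[Array[Double]], wh: Array[Array[Double]], bh: Array[Double]): Array[Double] =
    def lin (u: Array[Array[Double]], w: Array[Array[Double]], hh: Array[Double], b: Array[Double]) =
        Array.tabulate (h.length) (i =>
            (0 until x.length).map (k => u(i)(k) * x(k)).sum +
            (0 until hh.length).map (k => w(i)(k) * hh(k)).sum + b(i))
    val z  = lin (uz, wz, h, bz).map (sigmoid)                            // update gate
    val r  = lin (ur, wr, h, br).map (sigmoid)                            // reset gate
    val rh = h.zip (r).map { case (hi, ri) => hi * ri }                   // reset-scaled state
    val hc = lin (uh, wh, rh, bh).map (math.tanh)                         // candidate state
    Array.tabulate (h.length) (i => (1.0 - z(i)) * h(i) + z(i) * hc(i))   // blend old and new state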
The Gate
case class holds information on the gate's value and its partial derivatives.
Value parameters
- n_mem
-
the size for hidden state (h) (dimensionality of memory)
- n_seq
-
the length of the time series
- n_var
-
the number of variables
Attributes
- Supertypes
-
trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
The LSTM
class implements Long Short-Term Memory (LSTM) via Back Propagation Through Time (BPTT). At each time point x_t, there is a vector representing several variables or the encoding of a word. Intended to work for guessing the next word in a sentence or for multi-horizon forecasting. Time series: (x_t: t = 0, 1, ..., n_seq-1) where n_seq is the number of time points/words.
Value parameters
- fname
-
the feature/variable names
- n_mem
-
the size for hidden state (h) (dimensionality of memory)
- x
-
the input sequence/time series
- y
-
the output sequence/time series
Attributes
- Companion
- object
- Supertypes
-
class Object, trait Matchable, class Any
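For comparison with the GRU sketch above, a standalone sketch of one LSTM time step in its standard formulation (forget, input and output gates plus a cell state). Rather than spelling out the weight matrices again, each gate is supplied as an affine map over the concatenated [x_t, h_{t-1}]; all names are illustrative, not the class's members.

// Standalone sketch of one LSTM time step; fGate, iGate, oGate, gCand are placeholders
// for the gate pre-activations z => W z + b over the concatenated [x_t, h_{t-1}].
def lstmStep (xh: Array[Double], c: Array[Double],
              fGate: Array[Double] => Array[Double],     // forget gate pre-activation
              iGate: Array[Double] => Array[Double],     // input gate pre-activation
              oGate: Array[Double] => Array[Double],     // output gate pre-activation
              gCand: Array[Double] => Array[Double]):    // candidate cell pre-activation
             (Array[Double], Array[Double]) =
    def sigmoid (v: Double) = 1.0 / (1.0 + math.exp (-v))
    val f = fGate (xh).map (sigmoid)
    val i = iGate (xh).map (sigmoid)
    val o = oGate (xh).map (sigmoid)
    val g = gCand (xh).map (math.tanh)
    val cNew = Array.tabulate (c.length) (k => f(k) * c(k) + i(k) * g(k))   // new cell state
    val hNew = Array.tabulate (c.length) (k => o(k) * math.tanh (cNew(k)))  // new hidden state
    (hNew, cNew)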
The LayerNorm
class will, in computing the output, normalize by subtracting the mean and dividing by the standard deviation.
Value parameters
- atransform
-
whether to apply an affine transformation to standard normalization
- eps
-
the small value to prevent division by zero
Attributes
- See also
-
pytorch.org/docs/stable/generated/torch.nn.LayerNorm.html#torch.nn.LayerNorm
- Supertypes
-
trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
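A minimal standalone sketch of the per-row normalization described above (subtract the mean, divide by the standard deviation, with an optional affine transform); gamma and beta are illustrative names for the affine parameters and are assumptions of this sketch.

// Standalone sketch of layer normalization applied to each row of x:
// z = (x - mean) / (stdev + eps), optionally followed by the affine transform gamma * z + beta.
def layerNorm (x: Array[Array[Double]], eps: Double = 1e-5,
               gamma: Double = 1.0, beta: Double = 0.0): Array[Array[Double]] =
    x.map { row =>
        val mu  = row.sum / row.length
        val sig = math.sqrt (row.map (v => (v - mu) * (v - mu)).sum / row.length)
        row.map (v => gamma * (v - mu) / (sig + eps) + beta)
    }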
The NeuralNet_3L4TS
object supports 3-layer regression-like neural networks for Time Series data. Given a response vector y, a predictor matrix x is built that consists of lagged y vectors. y_t = f2 (b dot f(a dot x)) where x = [y_{t-1}, y_{t-2}, ... y_{t-lags}].
Attributes
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
NeuralNet_3L4TS.type
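The lagged predictor matrix referred to above can be sketched as follows; this is a standalone illustration of the idea (row t holds [y_{t-1}, ..., y_{t-lags}]), not the library's buildMatrix4TS function.

// Standalone sketch of building a lagged predictor matrix from a response vector y,
// so a model can learn y_t = f (y_{t-1}, ..., y_{t-lags}).
def buildLagMatrix (y: Array[Double], lags: Int): (Array[Array[Double]], Array[Double]) =
    val xRows = for t <- lags until y.length yield Array.tabulate (lags) (j => y(t - 1 - j))
    (xRows.toArray, y.drop (lags))                        // (predictor matrix, aligned response)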
The NeuralNet_XL4TS
object supports X-layer regression-like neural networks for Time Series data. Given a response vector y, a predictor matrix x is built that consists of lagged y vectors. y_t = f2 (b dot f(a dot x)) where x = [y_{t-1}, y_{t-2}, ... y_{t-lags}].
Attributes
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
NeuralNet_XL4TS.type
The PositionalEnc
trait provides methods to convert a time t into an encoded vector. An encoded vector consists of numbers in [-1.0, 1.0]. It implements Absolute Fixed Vanilla Positional Encoding.
Value parameters
- d
-
the dimensionality of the positional encoding (except for f0)
- m
-
the length of the time series (number of time points)
Attributes
- Supertypes
-
class Object, trait Matchable, class Any
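A standalone sketch of the usual absolute fixed (sinusoidal) positional encoding that such a trait describes, mapping a time t to a vector of values in [-1.0, 1.0]; the base value 10000 is the conventional choice and an assumption of this sketch.

// Standalone sketch of absolute fixed (sinusoidal) positional encoding:
// even dimensions use sin, odd dimensions use cos, with geometrically spaced frequencies.
def posEncode (t: Int, d: Int, base: Double = 10000.0): Array[Double] =
    Array.tabulate (d) { j =>
        val angle = t / math.pow (base, (2 * (j / 2)).toDouble / d)
        if j % 2 == 0 then math.sin (angle) else math.cos (angle)
    }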
The RMSNorm
class will, in computing the output, normalize by dividing by the Root Mean Square (RMS).
Attributes
- Supertypes
-
trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
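A minimal standalone sketch of RMS normalization as described (divide each row by its root mean square); the eps guard against division by zero is an assumption borrowed from the usual formulation.

// Standalone sketch of RMS normalization: divide each element of a row by the row's
// root mean square, with a small eps (an assumption here) to avoid division by zero.
def rmsNorm (x: Array[Array[Double]], eps: Double = 1e-8): Array[Array[Double]] =
    x.map { row =>
        val rms = math.sqrt (row.map (v => v * v).sum / row.length)
        row.map (_ / (rms + eps))
    }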
The RNN
class implements Recurrent Neural Network (RNN) via Back Propagation Through Time (BPTT). At each time point x_t, there is a vector representing several variables or the encoding of a word. Intended to work for guessing the next word in a sentence or for multi-horizon forecasting. Time series: (x_t: t = 0, 1, ..., n_seq-1) where n_seq is the number of time points/words.
Value parameters
- fname
-
the feature/variable names
- n_mem
-
the size for hidden state (h) (dimensionality of memory)
- x
-
the input sequence/time series
- y
-
the output sequence/time series
Attributes
- Companion
- object
- Supertypes
-
class Object, trait Matchable, class Any
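For contrast with the gated cells sketched above, one step of the basic Elman-style recurrence is simply h_t = tanh(U x_t + W h_{t-1} + b); a standalone sketch with illustrative weight names (the class's exact formulation may differ).

// Standalone sketch of one step of the basic recurrence: h_t = tanh (U x_t + W h_{t-1} + b).
def rnnStep (x: Array[Double], h: Array[Double],
             u: Array[Array[Double]], w: Array[Array[Double]], b: Array[Double]): Array[Double] =
    Array.tabulate (h.length) { i =>
        val z = (0 until x.length).map (k => u(i)(k) * x(k)).sum +
                (0 until h.length).map (k => w(i)(k) * h(k)).sum + b(i)
        math.tanh (z)
    }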
The TrEncoderLayer
class consists of Multi-Head Self-Attention and Feed-Forward Neural Network (FFNN) sub-layers.
Value parameters
- f
-
the activation function family (used by alinear1)
- heads
-
the number of attention heads
- n_mod
-
the size of the output (dimensionality of the model, d_model)
- n_v
-
the size of the value vectors
- n_var
-
the size of the input vector x_t (number of variables)
- n_z
-
the size of the hidden layer in the Feed-Forward Neural Network
- norm_eps
-
a small value used in normalization to avoid division by zero
- norm_first
-
whether layer normalization should be done first (see apply method)
- p_drop
-
the probability of setting an element to zero in a dropout layer
Attributes
- See also
-
pytorch.org/docs/stable/generated/torch.nn.TransformerEncoderLayer.html#torch.nn.TransformerEncoderLayer
- Supertypes
-
trait Attention, class Object, trait Matchable, class Any
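A standalone sketch of how such an encoder layer is typically wired, showing the effect of norm_first (pre-norm vs. post-norm ordering); selfAttn, ffnn, norm1, norm2 and drop are placeholders for the sub-layers, not the class's actual members.

// Standalone sketch of a transformer encoder layer forward pass with residual connections;
// norm_first selects whether layer normalization is applied before or after each sub-layer.
def encoderLayer (x: Array[Array[Double]], normFirst: Boolean,
                  selfAttn: Array[Array[Double]] => Array[Array[Double]],
                  ffnn:     Array[Array[Double]] => Array[Array[Double]],
                  norm1:    Array[Array[Double]] => Array[Array[Double]],
                  norm2:    Array[Array[Double]] => Array[Array[Double]],
                  drop:     Array[Array[Double]] => Array[Array[Double]]): Array[Array[Double]] =
    def add (a: Array[Array[Double]], b: Array[Array[Double]]) =
        Array.tabulate (a.length, a(0).length) ((i, j) => a(i)(j) + b(i)(j))
    if normFirst then                                     // pre-norm: normalize before each sub-layer
        val x1 = add (x, drop (selfAttn (norm1 (x))))
        add (x1, drop (ffnn (norm2 (x1))))
    else                                                  // post-norm: normalize after the residual add
        val x1 = norm1 (add (x, drop (selfAttn (x))))
        norm2 (add (x1, drop (ffnn (x1))))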
Value members
Concrete methods
The attentionTest
main function tests the context
and attention
top-level functions. Test Self-Attention.
runMain scalation.modeling.forecasting.neuralforecasting.attentionTest
Attributes
The attentionTest2
main function tests the attentionMH
top-level function. Test Multi-Head, Self-Attention.
runMain scalation.modeling.forecasting.neuralforecasting.attentionTest2
Attributes
The attentionTest3
main function tests the attention
and context
top-level functions. Test Self-Attention. Read in weight matrices to compare with PyTorch.
runMain scalation.modeling.forecasting.neuralforecasting.attentionTest3
Attributes
The attentionTest4
main function tests the attentionMH
top-level function. Test Multi-Head, Self-Attention. Read in weight matrices to compare with PyTorch.
runMain scalation.modeling.forecasting.neuralforecasting.attentionTest4
Attributes
The gRUTest
main function tests the GRU
class on randomly generated sequence data meant to represent encoded words
runMain scalation.modeling.forecasting.neuralforecasting.gRUTest
Attributes
The gRUTest2
main function tests the GRU
class on sequence data read as words from a file, which are encoded and passed into the GRU.
runMain scalation.modeling.forecasting.neuralforecasting.gRUTest2
Attributes
The gRUTest3
main function tests the GRU
class on sequence/time series data corresponding to the lake level dataset using multiple lags.
runMain scalation.modeling.forecasting.neuralforecasting.gRUTest3
Attributes
Generate a fake sequence dataset: generate only one sentence for training. Only for testing. Needs to be changed to read in training data from files. The words are one-hot encoded into a column vector.
Value parameters
- n_seq
-
the sequence size (number of time points/words)
- n_var
-
the number of variables/word encoding size
Attributes
The lSTMTest
main function tests the LSTM
class on randomly generated sequence data meant to represent encoded words
runMain scalation.modeling.forecasting.neuralforecasting.lSTMTest
Attributes
The lSTMTest2
main function tests the LSTM
class on sequence data read as words from a file, which are encoded and passed into the LSTM.
runMain scalation.modeling.forecasting.neuralforecasting.lSTMTest2
Attributes
The lSTMTest3
main function tests the LSTM
class on sequence/time series data corresponding to the lake level dataset using multiple lags.
runMain scalation.modeling.forecasting.neuralforecasting.lSTMTest3
Attributes
The neuralNet_3L4TSTest
main function tests the NeuralNet_3L4TS
class. This test is used to CHECK that the buildMatrix4TS function is working correctly. May get NaN for some maximum lags (p) due to multi-collinearity.
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_3L4TSTest
Attributes
The neuralNet_3L4TSTest2
main function tests the NeuralNet_3L4TS
class on real data: Forecasting lake levels.
Attributes
- See also
-
cran.r-project.org/web/packages/fpp/fpp.pdf
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_3L4TSTest2
The neuralNet_3L4TSTest4
main function tests the NeuralNet_3L4TS
class on real data: Forecasting COVID-19 Weekly Data.
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_3L4TSTest4
Attributes
The neuralNet_XL4TSTest
main function tests the NeuralNet_XL4TS
class. This test is used to CHECK that the buildMatrix4TS function is working correctly. May get NaN for some maximum lags (p) due to multi-collinearity.
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_XL4TSTest
Attributes
The neuralNet_XL4TSTest2
main function tests the NeuralNet_XL4TS
class on real data: Forecasting lake levels.
Attributes
- See also
-
cran.r-project.org/web/packages/fpp/fpp.pdf
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_XL4TSTest2
The neuralNet_XL4TSTest4
main function tests the NeuralNet_XL4TS
class on real data: Forecasting COVID-19 Weekly Data.
runMain scalation.modeling.forecasting.neuralforecasting.neuralNet_XL4TSTest4
Attributes
The positionalEncTest
main function is used to test the PositionalEnc
class.
runMain scalation.modeling.forecasting.neuralforecasting.positionalEncTest
Attributes
The rNNTest
main function tests the RNN
class on randomly generated sequence data meant to represent encoded words
runMain scalation.modeling.forecasting.neuralforecasting.rNNTest
Attributes
The rNNTest2
main function tests the RNN
class on sequence data read as words from a file, which are encoded and passed into the RNN.
runMain scalation.modeling.forecasting.neuralforecasting.rNNTest2
Attributes
The rNNTest3
main function tests the RNN
class on sequence/time series data corresponding to the lake level dataset using multiple lags.
runMain scalation.modeling.forecasting.neuralforecasting.rNNTest3