layers
EncoderRNN
EncoderRNN (d_z:int, enc_hidden_size:int, use_recurrent_dropout=False, r_dropout_prob=0.0, use_layer_norm=False, layer_norm_learnable=False, lstm_impl='builtin')
Encoder module
This consists of a bidirectional LSTM that encodes a sequence of strokes into the parameters of the latent distribution over \(z\).
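A minimal usage sketch, assuming the common Sketch-RNN stroke layout of (seq_len, batch, 5) and that the forward pass returns the sampled latent together with the distribution parameters; the return names z, mu, sigma_hat are assumptions, not part of the signature above:

import torch

# Each stroke is a 5-vector (dx, dy, p1, p2, p3); shapes are illustrative.
encoder = EncoderRNN(d_z=128, enc_hidden_size=256)
strokes = torch.zeros(200, 8, 5)  # (seq_len, batch, features)
# Assumed returns: sampled latent z and the parameters (mu, sigma_hat)
# of the approximate posterior q(z | strokes).
z, mu, sigma_hat = encoder(strokes)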
BivariateGaussianMixture
BivariateGaussianMixture (pi_logits:torch.Tensor, mu_x:torch.Tensor, mu_y:torch.Tensor, sigma_x:torch.Tensor, sigma_y:torch.Tensor, rho_xy:torch.Tensor)
Bivariate Gaussian mixture
The mixture is represented by \(\Pi\) and \(\mathcal{N}(\mu_{x}, \mu_{y}, \sigma_{x}, \sigma_{y}, \rho_{xy})\). This class adjusts temperatures and creates the categorical and Gaussian distributions from the parameters.
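This reference does not show the method that builds the distributions, so the following is only a sketch of how such parameters can be turned into torch.distributions objects, including a typical Sketch-RNN temperature adjustment (divide the \(\Pi\) logits by \(\tau\), scale the variances by \(\tau\)):

import torch
from torch.distributions import Categorical, MultivariateNormal

# Illustrative parameters: batch of 3, 20 mixture components.
pi_logits = torch.randn(3, 20)
mu_x, mu_y = torch.randn(3, 20), torch.randn(3, 20)
sigma_x = torch.rand(3, 20) + 0.1
sigma_y = torch.rand(3, 20) + 0.1
rho_xy = torch.tanh(torch.randn(3, 20))  # correlation, in (-1, 1)

# Temperature adjustment: tau < 1 makes sampling more deterministic.
tau = 0.8
pi_logits = pi_logits / tau
sigma_x = sigma_x * tau ** 0.5  # scales the variance sigma_x^2 by tau
sigma_y = sigma_y * tau ** 0.5

# Categorical over components, and a 2-D Gaussian per component.
mean = torch.stack([mu_x, mu_y], dim=-1)  # (3, 20, 2)
cov = torch.stack([
    torch.stack([sigma_x ** 2, rho_xy * sigma_x * sigma_y], dim=-1),
    torch.stack([rho_xy * sigma_x * sigma_y, sigma_y ** 2], dim=-1),
], dim=-2)  # (3, 20, 2, 2)
cat = Categorical(logits=pi_logits)
gauss = MultivariateNormal(mean, covariance_matrix=cov)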
DecoderRNN
DecoderRNN (d_z:int, dec_hidden_size:int, n_distributions:int, use_recurrent_dropout=False, r_dropout_prob=0.0, use_layer_norm=False, layer_norm_learnable=False, lstm_impl='builtin')
Decoder module
This consists of an LSTM that takes the latent vector \(z\) and, at each step, outputs the parameters of a bivariate Gaussian mixture over the next stroke offset.
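A hypothetical decoding call; only the constructor signature above is documented, so the forward arguments and return values shown here are assumptions in the style of Sketch-RNN:

import torch

decoder = DecoderRNN(d_z=128, dec_hidden_size=512, n_distributions=20)
z = torch.randn(8, 128)             # latent vectors from the encoder
prev_stroke = torch.zeros(1, 8, 5)  # (seq_len, batch, 5)
# Assumed call: returns the mixture over the next offset, pen-state
# logits, and the updated recurrent state.
# dist, q_logits, state = decoder(prev_stroke, z, None)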
ReconstructionLoss
ReconstructionLoss (*args, **kwargs)
Reconstruction loss
This calculates the negative log-likelihood of the target strokes under the decoder's predicted output distribution.
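For context, the Sketch-RNN reconstruction loss (Ha & Eck, 2017) is the negative log-likelihood of the offsets under the predicted mixture plus a cross-entropy term over the three pen states. A sketch under those assumptions, using the cat and gauss distributions from the mixture above (the function name and the pen-state term are not taken from this reference):

import torch

def reconstruction_loss(cat, gauss, q_logits, target):
    # target: (seq_len, batch, 5); columns 0-1 are the (dx, dy) offsets,
    # columns 2-4 are one-hot pen states.
    xy = target[..., 0:2].unsqueeze(-2)  # broadcast over components
    # log p(dx, dy) = logsumexp_k [log pi_k + log N_k(dx, dy)]
    log_mix = cat.logits + gauss.log_prob(xy)
    loss_stroke = -log_mix.logsumexp(dim=-1).mean()
    # Pen-state cross-entropy against the one-hot targets.
    log_q = torch.log_softmax(q_logits, dim=-1)
    loss_pen = -(target[..., 2:] * log_q).sum(dim=-1).mean()
    return loss_stroke + loss_pen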
KLDivLoss
KLDivLoss (*args, **kwargs)
This calculates the KL divergence between a given normal distribution and \(\mathcal{N}(0, 1)\).
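In closed form, \(KL\big(\mathcal{N}(\mu, \sigma^{2}) \,\|\, \mathcal{N}(0, 1)\big) = -\frac{1}{2}\left(1 + \log \sigma^{2} - \mu^{2} - \sigma^{2}\right)\). A sketch, assuming the encoder predicts \(\hat{\sigma} = \log \sigma^{2}\) (a common VAE parameterization, not confirmed by this reference):

import torch

def kl_div_loss(mu, sigma_hat):
    # sigma_hat is assumed to be log(sigma^2), so exp(sigma_hat) = sigma^2.
    return -0.5 * torch.mean(1 + sigma_hat - mu ** 2 - torch.exp(sigma_hat))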