A Highway layer that does a gated combination of a linear transformation and a non-linear transformation of its input.
Highway(input_dim: int, num_layers: int = 1, activation: Callable[[torch.Tensor], torch.Tensor] = torch.nn.functional.relu) → None
A Highway layer does a gated combination of a linear transformation and a non-linear transformation of its input: \(y = g * x + (1 - g) * f(A(x))\), where \(A\) is a linear transformation, \(f\) is an element-wise non-linearity, and \(g\) is an element-wise gate, computed as \(\mathrm{sigmoid}(B(x))\) for a second linear transformation \(B\).
This module will apply a fixed number of highway layers to its input, returning the final result.
Parameters

input_dim : int, required
    The dimensionality of \(x\). We assume the input has shape (batch_size, ..., input_dim).
num_layers : int, optional (default=``1``)
    The number of highway layers to apply to the input.
activation : Callable[[torch.Tensor], torch.Tensor], optional (default=``torch.nn.functional.relu``)
    The non-linearity to use in the highway layers.
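A single highway step from the formula above can be sketched in plain PyTorch as follows. This is an illustrative re-implementation of the described math, not the module's internal code; the names ``projection`` (for \(A\)) and ``gate_projection`` (for \(B\)) are chosen here for clarity.

```python
import torch

input_dim = 8
projection = torch.nn.Linear(input_dim, input_dim)       # A(x)
gate_projection = torch.nn.Linear(input_dim, input_dim)  # B(x)

def highway_step(x: torch.Tensor) -> torch.Tensor:
    # g = sigmoid(B(x)), an element-wise gate in (0, 1)
    g = torch.sigmoid(gate_projection(x))
    # y = g * x + (1 - g) * f(A(x)), with f = relu
    return g * x + (1 - g) * torch.relu(projection(x))

out = highway_step(torch.randn(4, input_dim))
# The output keeps the input shape, since A and B map input_dim -> input_dim.
```

The module applies ``num_layers`` such steps in sequence, each with its own \(A\) and \(B\).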
forward(inputs: torch.FloatTensor) → torch.FloatTensor
Defines the computation performed at every call.
Should be overridden by all subclasses.
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of calling forward directly, since the former takes care of running the registered hooks while the latter silently ignores them.
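The difference matters in practice. A minimal sketch with a toy ``torch.nn.Module`` (the ``Doubler`` class is hypothetical, used only to demonstrate the behaviour):

```python
import torch

class Doubler(torch.nn.Module):
    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        return 2 * inputs

m = Doubler()
calls = []
m.register_forward_hook(lambda mod, inp, out: calls.append(out))

y = m(torch.ones(3))           # calling the instance runs the hook
_ = m.forward(torch.ones(3))   # calling forward() directly skips the hook
```

After both calls, ``calls`` contains a single entry: only the instance call triggered the registered hook.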