allennlp.modules.masked_layer_norm

class allennlp.modules.masked_layer_norm.MaskedLayerNorm(size: int, gamma0: float = 0.1, eps: float = 1e-06) → None[source]

Bases: torch.nn.modules.module.Module

See LayerNorm for details.

Note, however, that unlike LayerNorm, this norm includes a batch component: the normalization statistics are computed over the unmasked positions of the entire batch rather than separately for each instance.

forward(tensor: torch.Tensor, mask: torch.Tensor) → torch.Tensor[source]
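The behavior can be sketched as follows. This is a minimal, hypothetical re-implementation for illustration, not the library's exact code: it assumes `tensor` has shape `(batch, seq_len, size)`, `mask` has shape `(batch, seq_len)` with 1 marking valid positions, and that the mean and variance are pooled over all unmasked positions across the batch (the "batch component" noted above).

```python
import torch

def masked_layer_norm(tensor: torch.Tensor,
                      mask: torch.Tensor,
                      gamma: torch.Tensor,
                      beta: torch.Tensor,
                      eps: float = 1e-6) -> torch.Tensor:
    # Hypothetical sketch: statistics are pooled over every unmasked
    # position in the whole batch, unlike per-instance LayerNorm.
    broadcast_mask = mask.unsqueeze(-1).float()          # (batch, seq_len, 1)
    num_elements = broadcast_mask.sum() * tensor.size(-1)
    mean = (tensor * broadcast_mask).sum() / num_elements
    masked_centered = (tensor - mean) * broadcast_mask
    variance = (masked_centered * masked_centered).sum() / num_elements
    normalized = (tensor - mean) / torch.sqrt(variance + eps)
    # gamma and beta are learned affine parameters of shape (size,).
    return gamma * normalized + beta
```

A usage sketch: with `gamma` initialized to ones and `beta` to zeros, the output at unmasked positions has approximately zero mean and unit variance, while masked positions carry no weight in the statistics.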