A SimilarityFunction takes a pair of tensors with the same shape and computes a similarity
function on the vectors in the last dimension. For example, the tensors might both have shape
(batch_size, sentence_length, embedding_dim), and we will compute some function of the two
vectors of length embedding_dim for each position (batch_size, sentence_length), returning a
tensor of shape (batch_size, sentence_length).
The similarity function could be as simple as a dot product, or it could be a more complex, parameterized function.
If you want to compute a similarity between tensors of different sizes, you need to first tile
them in the appropriate dimensions to make them the same before you can use these functions.
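As an illustration of the contract described above, here is a minimal sketch in plain PyTorch of the simplest case, a dot-product similarity; the function name is ours, not part of the library's API:

```python
import torch

def dot_product_similarity(tensor_1: torch.Tensor, tensor_2: torch.Tensor) -> torch.Tensor:
    """Compute a dot product over the last dimension of two same-shaped tensors."""
    # Elementwise product, then sum over embedding_dim: one scalar per position.
    return (tensor_1 * tensor_2).sum(dim=-1)

# Two batches of sentence embeddings: (batch_size=2, sentence_length=3, embedding_dim=4)
a = torch.randn(2, 3, 4)
b = torch.randn(2, 3, 4)
sim = dot_product_similarity(a, b)
print(tuple(sim.shape))  # (2, 3): one similarity per (batch, position)
```

A parameterized similarity (e.g. a bilinear form with a learned weight matrix) would follow the same shape contract, differing only in how the final-dimension vectors are combined.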
See also :class:`~allennlp.modules.Attention`.
SimilarityFunction.forward(self, tensor_1: torch.Tensor, tensor_2: torch.Tensor) -> torch.Tensor
Takes two tensors of the same shape, such as (batch_size, length_1, length_2, embedding_dim).
Computes a (possibly parameterized) similarity on the final dimension and returns a tensor
with one less dimension, such as (batch_size, length_1, length_2).
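The tiling mentioned earlier is how tensors of different lengths reach this four-dimensional shape. A sketch under our own helper names (not the library's API), using unsqueeze/expand to tile two sequences against each other before applying a similarity:

```python
import torch

def dot_product_similarity(tensor_1: torch.Tensor, tensor_2: torch.Tensor) -> torch.Tensor:
    # Same-shape inputs; similarity over the final dimension.
    return (tensor_1 * tensor_2).sum(dim=-1)

batch, length_1, length_2, dim = 2, 3, 5, 4
x = torch.randn(batch, length_1, dim)
y = torch.randn(batch, length_2, dim)

# Tile both tensors to the same shape (batch, length_1, length_2, dim).
x_tiled = x.unsqueeze(2).expand(batch, length_1, length_2, dim)
y_tiled = y.unsqueeze(1).expand(batch, length_1, length_2, dim)

# One similarity for every pair of positions: (batch, length_1, length_2).
similarities = dot_product_similarity(x_tiled, y_tiled)
print(tuple(similarities.shape))  # (2, 3, 5)
```

`expand` creates broadcast views rather than copies, so this tiling costs no extra memory before the elementwise product.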