allennlp.modules.matrix_attention.dot_product_matrix_attention

DotProductMatrixAttention

DotProductMatrixAttention(self)

Computes attention between every entry in `matrix_1` and every entry in `matrix_2` using a dot product, producing a similarity score for each pair of rows.
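The computation this module performs can be sketched in NumPy as a batched pairwise dot product. This is a minimal illustration of the math only, not the actual AllenNLP implementation (which is a PyTorch `nn.Module` operating on tensors); the function name and shapes below are assumptions for the example.

```python
import numpy as np

def dot_product_matrix_attention(matrix_1, matrix_2):
    # matrix_1: (batch, num_rows_1, embedding_dim)
    # matrix_2: (batch, num_rows_2, embedding_dim)
    # Returns (batch, num_rows_1, num_rows_2): the dot product between
    # every row of matrix_1 and every row of matrix_2, per batch element.
    return np.einsum("bid,bjd->bij", matrix_1, matrix_2)

matrix_1 = np.random.rand(2, 3, 4)  # batch of 2, 3 rows, dim 4
matrix_2 = np.random.rand(2, 5, 4)  # batch of 2, 5 rows, dim 4
similarities = dot_product_matrix_attention(matrix_1, matrix_2)
# similarities has shape (2, 3, 5)
```

Each output entry `similarities[b, i, j]` is the dot product of row `i` of `matrix_1` with row `j` of `matrix_2` for batch element `b`; attention weights would typically be obtained by applying a softmax over the last dimension.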