An `Activation` is just a function that takes some parameters and returns an element-wise activation function. For the most part we just use the activation functions that come built into PyTorch; here we provide a thin wrapper that allows registering them by name and instantiating them from configuration.
PyTorch has a number of built-in activation functions. We group those under a common type to make them easier to configure and instantiate. The available activation functions are:
Note that we only include element-wise activation functions in this list. Activations like softmax require careful handling of masking, so they need a different API.
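The registration-and-lookup wrapper described above can be sketched as a simple name-to-callable registry. This is a hypothetical, self-contained illustration of the pattern, not the library's actual implementation: the names `register`, `by_name`, and `_ACTIVATIONS` are invented here, and plain Python callables stand in for the PyTorch modules (`torch.nn.ReLU`, `torch.nn.Tanh`, and so on) that a real wrapper would register.

```python
import math

# Registry mapping configuration-friendly names to element-wise
# activation callables. (Hypothetical sketch; in practice the values
# would be PyTorch activation modules.)
_ACTIVATIONS = {}


def register(name):
    """Decorator that registers an activation under the given name."""
    def decorator(fn):
        _ACTIVATIONS[name] = fn
        return fn
    return decorator


def by_name(name):
    """Look up a registered activation by its configuration name."""
    try:
        return _ACTIVATIONS[name]
    except KeyError:
        raise KeyError(f"'{name}' is not a registered activation") from None


@register("relu")
def relu(x):
    # Element-wise on a scalar here; torch.nn.ReLU does this per tensor element.
    return max(0.0, x)


@register("tanh")
def tanh(x):
    return math.tanh(x)


if __name__ == "__main__":
    activation = by_name("relu")
    print(activation(-2.0), activation(3.0))
```

A configuration file can then refer to an activation purely by its string name (e.g. `"relu"`), and the registry resolves it at instantiation time, which is the motivation for wrapping the built-in activations this way.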