class ActivationCube
f(x) = x^3

class ActivationELU
ELU: Exponential Linear Unit
f(x) = x if x > 0, alpha * (exp(x) - 1) otherwise
alpha defaults to 1.0

class ActivationGELU
GELU: Gaussian Error Linear Unit
f(x) = x * Phi(x), where Phi is the standard Gaussian CDF

class ActivationHardSigmoid
f(x) = min(1, max(0, 0.2*x + 0.5))

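A minimal scalar sketch of the formula above, assuming plain Java doubles rather than ND4J arrays (the class and method names are illustrative, not part of the library):

```java
public class HardSigmoidSketch {
    // Piecewise-linear approximation of the sigmoid: clamp 0.2*x + 0.5 to [0, 1].
    static double hardSigmoid(double x) {
        return Math.min(1.0, Math.max(0.0, 0.2 * x + 0.5));
    }

    public static void main(String[] args) {
        System.out.println(hardSigmoid(-5.0)); // 0.0
        System.out.println(hardSigmoid(0.0));  // 0.5
        System.out.println(hardSigmoid(5.0));  // 1.0
    }
}
```
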
class ActivationHardTanH
f(x) = 1,  if x > 1
     = -1, if x < -1
     = x,  otherwise

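A scalar sketch of the piecewise definition above (illustrative only, not the ND4J implementation):

```java
public class HardTanHSketch {
    // Hard tanh: identity on [-1, 1], clipped to 1 above and -1 below.
    static double hardTanH(double x) {
        if (x > 1.0) return 1.0;
        if (x < -1.0) return -1.0;
        return x;
    }

    public static void main(String[] args) {
        System.out.println(hardTanH(0.3));  // 0.3
        System.out.println(hardTanH(4.0));  // 1.0
        System.out.println(hardTanH(-4.0)); // -1.0
    }
}
```
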
class ActivationIdentity
f(x) = x

class ActivationLReLU
Leaky ReLU
f(x) = max(0, x) + alpha * min(0, x)
alpha defaults to 0.01

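A scalar sketch of the leaky ReLU formula, with alpha passed explicitly (illustrative; not the library's API):

```java
public class LeakyReLUSketch {
    // Leaky ReLU: identity for positive inputs, small slope alpha for negative inputs.
    static double leakyRelu(double x, double alpha) {
        return Math.max(0.0, x) + alpha * Math.min(0.0, x);
    }

    public static void main(String[] args) {
        System.out.println(leakyRelu(2.0, 0.01));  // 2.0
        System.out.println(leakyRelu(-2.0, 0.01)); // -0.02
    }
}
```
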
class ActivationMish
f(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
https://arxiv.org/ftp/arxiv/papers/1908/1908.08681.pdf

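A scalar sketch of Mish as x * tanh(softplus(x)) (illustrative only, not the ND4J implementation):

```java
public class MishSketch {
    // Mish: x * tanh(softplus(x)); log1p(exp(x)) is the softplus term.
    static double mish(double x) {
        return x * Math.tanh(Math.log1p(Math.exp(x)));
    }

    public static void main(String[] args) {
        System.out.println(mish(0.0));  // 0.0
        System.out.println(mish(2.0));  // about 1.944
        System.out.println(mish(-2.0)); // about -0.2525
    }
}
```
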
class ActivationPReLU
PReLU: Parametric ReLU
f(x) = max(0, x) + alpha * min(0, x), where alpha is learned during training rather than fixed

class ActivationRationalTanh
Rational tanh approximation: a cheaper, rational-function approximation of tanh(x)

class ActivationRectifiedTanh
Rectified tanh: f(x) = max(0, tanh(x))

class ActivationReLU
f(x) = max(0, x)

class ActivationReLU6
f(x) = min(max(0, x), 6)

class ActivationRReLU
RReLU: Randomized Leaky ReLU
f(x) = max(0, x) + alpha * min(0, x), where alpha is sampled uniformly from [lower, upper] during training and fixed to (lower + upper) / 2 at test time

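A scalar sketch of the sampled-slope behaviour described above; the explicit lower/upper bounds, the training flag, and the example values in main are assumptions for illustration, not the ND4J API:

```java
import java.util.Random;

public class RReLUSketch {
    // Randomized Leaky ReLU: the negative-side slope is drawn uniformly from
    // [lower, upper] during training and fixed to the interval midpoint at test time.
    static double rrelu(double x, double lower, double upper, boolean training, Random rng) {
        double alpha = training ? lower + rng.nextDouble() * (upper - lower)
                                : (lower + upper) / 2.0;
        return Math.max(0.0, x) + alpha * Math.min(0.0, x);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        System.out.println(rrelu(-1.0, 1.0 / 8.0, 1.0 / 3.0, true, rng));  // random slope during training
        System.out.println(rrelu(-1.0, 1.0 / 8.0, 1.0 / 3.0, false, rng)); // about -0.2292 (midpoint slope)
    }
}
```
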
class ActivationSELU
SELU: Scaled Exponential Linear Unit
f(x) = lambda * x if x > 0, lambda * alpha * (exp(x) - 1) otherwise
with fixed constants lambda ~= 1.0507 and alpha ~= 1.6733
https://arxiv.org/pdf/1706.02515.pdf

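A scalar sketch of SELU with the fixed constants from the paper (illustrative only, not the ND4J implementation):

```java
public class SELUSketch {
    // Constants from the self-normalizing networks paper.
    static final double LAMBDA = 1.0507009873554805;
    static final double ALPHA  = 1.6732632423543772;

    static double selu(double x) {
        return x > 0 ? LAMBDA * x : LAMBDA * ALPHA * (Math.exp(x) - 1.0);
    }

    public static void main(String[] args) {
        System.out.println(selu(1.0));  // about 1.0507
        System.out.println(selu(-1.0)); // about -1.1113
    }
}
```
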
class ActivationSigmoid
f(x) = 1 / (1 + exp(-x))

class ActivationSoftmax
f_i(x) = exp(x_i - shift) / sum_j exp(x_j - shift)
where shift = max_i(x_i)

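A sketch of the shifted softmax above; subtracting the maximum leaves the result unchanged but keeps exp() from overflowing for large inputs (illustrative only, not the ND4J implementation):

```java
import java.util.Arrays;

public class SoftmaxSketch {
    static double[] softmax(double[] x) {
        // Shift by the maximum element so exp(x_i - shift) never overflows.
        double shift = Arrays.stream(x).max().orElse(0.0);
        double[] out = new double[x.length];
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - shift);
            sum += out[i];
        }
        for (int i = 0; i < x.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        // Works even where a naive exp(x_i) would overflow to infinity.
        System.out.println(Arrays.toString(softmax(new double[]{1000.0, 1001.0, 1002.0})));
    }
}
```
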
class ActivationSoftPlus
f(x) = log(1 + e^x)

class ActivationSoftSign
f_i(x) = x_i / (1 + |x_i|)

class ActivationSwish
f(x) = x * sigmoid(x)

class ActivationTanH
f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

class ActivationThresholdedReLU
Thresholded ReLU
f(x) = x for x > theta, f(x) = 0 otherwise

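A scalar sketch of the thresholded ReLU, with theta passed explicitly (illustrative only; not the library's API):

```java
public class ThresholdedReLUSketch {
    // Thresholded ReLU: pass the input through only when it exceeds theta.
    static double thresholdedRelu(double x, double theta) {
        return x > theta ? x : 0.0;
    }

    public static void main(String[] args) {
        System.out.println(thresholdedRelu(0.5, 1.0)); // 0.0
        System.out.println(thresholdedRelu(2.0, 1.0)); // 2.0
    }
}
```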