Functions#
Layers without parameters (e.g. activation functions) are also provided as simple functions.
- Applies the Exponential Linear Unit (ELU).
- Applies the Continuously Differentiable Exponential Linear Unit (CELU).
- Applies the Gaussian Error Linear Unit (GELU) function.
- An approximation to the Gaussian Error Linear Unit.
- A fast approximation to the Gaussian Error Linear Unit.
- Applies the gated linear unit function.
- Applies the HardShrink activation function.
- Applies the HardTanh function.
- Applies the hardswish function, element-wise.
- Applies the Leaky Rectified Linear Unit.
- Applies the Log Sigmoid function.
- Applies the Log Softmax function.
- Applies the Mish function, element-wise.
- Applies the element-wise parametric ReLU.
- Applies the Rectified Linear Unit.
- Applies the ReLU² activation function.
- Applies the Rectified Linear Unit 6.
- Applies the Scaled Exponential Linear Unit.
- Applies the sigmoid function.
- Applies the Sigmoid Linear Unit.
- Applies the Softmax function.
- Applies the Softmin function.
- Applies the Softplus function.
- Applies the Softshrink activation function.
- Applies the Step activation function.
- Applies the hyperbolic tangent function.
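As a rough illustration of what a few of the functions above compute, here is a minimal sketch of their standard mathematical definitions in plain Python. This is not the library's implementation (which operates on arrays), just the scalar formulas for reference:

```python
import math

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return max(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def silu(x):
    # Sigmoid Linear Unit: x * sigmoid(x)
    return x * sigmoid(x)

def softmax(xs):
    # Numerically stable softmax over a list of scalars:
    # subtract the max before exponentiating to avoid overflow.
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `relu(-2.0)` is `0.0`, `sigmoid(0.0)` is `0.5`, and the entries of `softmax([1.0, 2.0, 3.0])` sum to `1.0`.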