Functions

Layers without parameters (e.g. activation functions) are also provided as simple functions.
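To illustrate what a few of the entries below compute, here are reference implementations in plain NumPy. These are illustrative sketches of the underlying math, not the library's own code, which operates on its own array type.

```python
import numpy as np

def relu(x):
    # max(0, x), element-wise
    return np.maximum(0.0, x)

def sigmoid(x):
    # 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    # subtract the max before exponentiating, for numerical stability
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

x = np.array([-1.0, 0.0, 2.0])
print(relu(x))             # negatives are clamped to zero
print(softmax(x).sum())    # probabilities sum to 1
```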

elu(x[, alpha])

Applies the Exponential Linear Unit (ELU).

celu(x[, alpha])

Applies the Continuously Differentiable Exponential Linear Unit (CELU).

gelu(x)

Applies the Gaussian Error Linear Unit (GELU) function.

gelu_approx(x)

An approximation to the Gaussian Error Linear Unit.

gelu_fast_approx(x)

A fast approximation to the Gaussian Error Linear Unit.

glu(x[, axis])

Applies the gated linear unit function.

hard_shrink(x[, lambd])

Applies the HardShrink activation function.

hard_tanh(x[, min_val, max_val])

Applies the HardTanh function.

hardswish(x)

Applies the hardswish function element-wise.

leaky_relu(x[, negative_slope])

Applies the Leaky Rectified Linear Unit.

log_sigmoid(x)

Applies the Log Sigmoid function.

log_softmax(x[, axis])

Applies the Log Softmax function.

mish(x)

Applies the Mish function, element-wise.

prelu(x, alpha)

Applies the element-wise parametric ReLU.

relu(x)

Applies the Rectified Linear Unit.

relu2(x)

Applies the ReLU² activation function.

relu6(x)

Applies the Rectified Linear Unit 6.

selu(x)

Applies the Scaled Exponential Linear Unit.

sigmoid(x)

Applies the sigmoid function.

silu(x)

Applies the Sigmoid Linear Unit.

softmax(x[, axis])

Applies the Softmax function.

softmin(x[, axis])

Applies the Softmin function.

softplus(x)

Applies the Softplus function.

softshrink(x[, lambd])

Applies the Softshrink activation function.

step(x[, threshold])

Applies the Step Activation Function.

tanh(x)

Applies the hyperbolic tangent function.
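The relationship between gelu and its approximations can be sketched with the standard formulas: the exact GELU is x·Φ(x) with Φ the standard normal CDF, and the widely used tanh-based approximation replaces Φ with a tanh expression. Whether this matches the library's gelu_approx bit-for-bit is an assumption; the sketch below only demonstrates that the approximation tracks the exact function closely.

```python
import math
import numpy as np

def gelu_exact(x):
    # GELU(x) = x * Phi(x), Phi being the standard normal CDF (via erf)
    erf = np.vectorize(math.erf)
    return x * 0.5 * (1.0 + erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # common tanh approximation:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

x = np.linspace(-3.0, 3.0, 61)
gap = np.max(np.abs(gelu_exact(x) - gelu_tanh(x)))
print(gap)  # the gap stays small across this range
```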