This class implements a layer that calculates the ELU
activation function for each element of a single input.
The activation function formula:
f(x) = GetAlpha() * (exp(x) - 1) if x < 0
f(x) = x if x >= 0
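For illustration, a minimal plain-C++ sketch of this formula, assuming a free-standing helper (the function name `Elu` and the `alpha` parameter standing in for `GetAlpha()` are not part of the library):

```c++
#include <cmath>

// Per-element computation of the formula above; alpha stands in for the
// value returned by GetAlpha(). Illustrative only, not the library code.
float Elu( float x, float alpha )
{
    return x >= 0 ? x : alpha * ( std::exp( x ) - 1.f );
}
```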
void SetAlpha( float alpha );
Sets the multiplier before the exponential function, applied to negative values of x.
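A hedged sketch of the effect of this multiplier on a negative input; the helper below simply mirrors the formula and is not the library's API:

```c++
#include <cmath>
#include <cstdio>

// Mirrors the activation formula; not the library code.
static float Elu( float x, float alpha )
{
    return x >= 0 ? x : alpha * ( std::exp( x ) - 1.f );
}

int main()
{
    const float x = -2.f;
    // A larger alpha makes the negative branch saturate more deeply.
    std::printf( "alpha = 0.1: f(-2) = %f\n", Elu( x, 0.1f ) ); // ~= -0.086
    std::printf( "alpha = 1.0: f(-2) = %f\n", Elu( x, 1.0f ) ); // ~= -0.865
    return 0;
}
```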
There are no trainable parameters for this layer.
There is only one input, which accepts a data blob of arbitrary size.
There is only one output, which contains a blob of the same size as the input; each element of the output is the value of the activation function for the corresponding element of the input.
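A minimal sketch of this element-wise forward pass, under the assumption that the blob data is flattened into a plain vector (`EluForward` and the vector representation are illustrative, not the library's blob API):

```c++
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of the element-wise behavior: the output has the same number of
// elements as the input, each transformed independently.
std::vector<float> EluForward( const std::vector<float>& input, float alpha )
{
    std::vector<float> output( input.size() ); // same size as the input
    for( std::size_t i = 0; i < input.size(); ++i ) {
        const float x = input[i];
        output[i] = x >= 0 ? x : alpha * ( std::exp( x ) - 1.f );
    }
    return output;
}
```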