Activation functions
Activation functions are applied after each layer of a neural network. The following activation functions are supported:
sigmoid
| Superclasses | (hidden-layer-activation output-layer-activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Sigmoid activation function: \(f(x) = \frac{1}{1 + \exp(-x)}\)
Its output lies in the range \([0, 1]\), so it is best suited for describing the 'intensity' of some property.
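A minimal Python sketch of the formula above (not this library's API), showing that the sigmoid maps 0 to the midpoint of its range and squashes large negative inputs toward 0:

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + exp(-x)); output always lies in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))   # 0.5, the midpoint of the output range
print(sigmoid(-5.0))  # close to 0
```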
%tanh
| Superclasses | (hidden-layer-activation output-layer-activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Hyperbolic tangent activation function. Its output lies
in the range \([-1, 1]\), so it is a rescaled sigmoid:
\(\tanh(x) = 2\sigma(2x) - 1\). Neural
networks which use tanh in place of sigmoid are believed to be easier
to train.
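The "rescaled sigmoid" relationship can be checked numerically with a short Python sketch (independent of this library): \(\tanh(x) = 2\sigma(2x) - 1\), where \(\sigma\) is the sigmoid.

```python
import math

x = 0.7
lhs = math.tanh(x)
# sigmoid evaluated at 2x, rescaled from (0, 1) to (-1, 1)
rhs = 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0
print(abs(lhs - rhs))  # agrees to floating-point precision
```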
softmax
| Superclasses | (output-layer-activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Softmax activation function: \(f(x_i) =
\frac{\exp(x_i)}{\sum_j \exp(x_j)}\).
Its output range is \([0, 1]\) and the sum of all elements in the
output vector is 1.
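A minimal Python sketch of the softmax formula (not this library's API), illustrating that the output elements sum to 1 and preserve the ordering of the inputs:

```python
import math

def softmax(xs):
    # subtracting the max is a standard numerical-stability trick;
    # it does not change the result of exp(x_i) / sum(exp(x_j))
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print(sum(probs))  # sums to 1 (up to floating-point error)
```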
leaky-relu
| Superclasses | (hidden-layer-activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Leaky ReLU activation function. It returns its
argument when the argument is greater than zero, or the argument
multiplied by coeff otherwise. This is usually the activation function
of choice for hidden layers.

coeff: Coefficient of leaky ReLU. A value of 0 means an ordinary ReLU.
| Allocation | instance |
| Type | single-float |
| Initarg | :coeff |
| Initform | 0.0 |
| Readers | (leaky-relu-coeff) |
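A minimal Python sketch of the piecewise rule described above (not this library's API); the `coeff` parameter mirrors the slot of the same name, and `coeff=0.0` reduces it to an ordinary ReLU:

```python
def leaky_relu(x, coeff=0.0):
    # returns x when x > 0, otherwise coeff * x;
    # coeff = 0.0 gives a plain ReLU
    return x if x > 0 else coeff * x

print(leaky_relu(3.0, coeff=0.1))   # positive input passes through: 3.0
print(leaky_relu(-2.0, coeff=0.1))  # negative input is scaled by coeff
print(leaky_relu(-2.0))             # plain ReLU clamps it to 0.0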
%identity
| Superclasses | (output-layer-activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Identity activation function (just returns its input).
Activation functions differ in how they can be associated with layers of a network. The division is as follows:
activation
| Superclasses | (t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Generic class for activation functions. Not to be
instantiated.
hidden-layer-activation
| Superclasses | (activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Generic class for activation functions associated
with hidden layers. Not to be instantiated.
output-layer-activation
| Superclasses | (activation t) |
| Metaclass | standard-class |
| Default Initargs | nil |
Generic class for activation functions associated
with an output layer. Not to be instantiated.