Activation functions
Activation functions are inserted after each layer of a neural network. The following activation functions are supported:
sigmoid
| Option | Value |
|--------|-------|
| Superclasses: | (hidden-layer-activation output-layer-activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Sigmoid activation function: \(f(x) = \frac{1}{1 + \exp(-x)}\).
Its output lies in the range \((0, 1)\), so it is best suited for describing the 'intensity' of some property.
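The formula above can be sketched in Python (used here purely for illustration; the library itself is Common Lisp, and the function name is hypothetical):

```python
import math

def sigmoid(x):
    """Sigmoid: f(x) = 1 / (1 + exp(-x)); output lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```

Note that `sigmoid(0)` is exactly 0.5, and the output approaches 0 and 1 asymptotically without ever reaching them.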
%tanh
| Option | Value |
|--------|-------|
| Superclasses: | (hidden-layer-activation output-layer-activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Hyperbolic tangent activation function. Its output lies in the range \((-1, 1)\), so it is effectively a rescaled sigmoid. Neural networks which use tanh in place of sigmoid are believed to be easier to train.
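The "rescaled sigmoid" claim can be checked numerically via the identity \(\tanh(x) = 2\,\sigma(2x) - 1\) (a Python sketch for illustration only; `sigmoid` is a hypothetical helper, not part of the library API):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# tanh is a shifted and scaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(math.tanh(x) - (2.0 * sigmoid(2.0 * x) - 1.0)) < 1e-12
```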
softmax
| Option | Value |
|--------|-------|
| Superclasses: | (output-layer-activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Softmax activation function: \(f(x_i) =
\frac{\exp(x_i)}{\sum_j \exp(x_j)}\).
Each output element lies in the range \((0, 1)\), and the elements of the
output vector sum to 1.
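A Python sketch of the formula above (illustration only; the function name is hypothetical). Subtracting the maximum before exponentiating does not change the result but avoids floating-point overflow for large inputs:

```python
import math

def softmax(xs):
    """f(x_i) = exp(x_i) / sum_j exp(x_j), computed in a numerically stable way."""
    m = max(xs)                             # shift inputs so the largest is 0
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

The output can be read as a probability distribution over classes, which is why softmax is listed only as an output-layer activation.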
leaky-relu
| Option | Value |
|--------|-------|
| Superclasses: | (hidden-layer-activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Leaky ReLU activation function. It returns its argument when the argument is greater than zero, or the argument multiplied by coeff otherwise. This is usually the activation function of choice for hidden layers.

coeff

Coefficient of leaky ReLU. A value of 0 means an ordinary ReLU.

| Option | Value |
|--------|-------|
| Allocation: | instance |
| Type: | single-float |
| Initarg: | :coeff |
| Initform: | 0.0 |
| Readers: | (leaky-relu-coeff) |
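The behaviour described above can be sketched as follows (Python for illustration only; the function name is hypothetical, and `coeff` defaults to 0.0 as in the class definition):

```python
def leaky_relu(x, coeff=0.0):
    """Return x when x > 0, otherwise coeff * x.

    With coeff = 0 this is an ordinary ReLU; a small positive coeff
    lets a little gradient through for negative inputs.
    """
    return x if x > 0 else coeff * x
```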
%identity
| Option | Value |
|--------|-------|
| Superclasses: | (output-layer-activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Identity activation function (just returns its input).
Activation functions differ in which layers of a network they can be associated with. The division is as follows:
activation
| Option | Value |
|--------|-------|
| Superclasses: | (t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Generic class for activation functions. Not to be
instantiated.
hidden-layer-activation
| Option | Value |
|--------|-------|
| Superclasses: | (activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Generic class for activation functions associated
with hidden layers. Not to be instantiated.
output-layer-activation
| Option | Value |
|--------|-------|
| Superclasses: | (activation t) |
| Metaclass: | standard-class |
| Default Initargs: | nil |
Generic class for activation functions associated
with an output layer. Not to be instantiated.