neural-classifier

Activation functions

Activation functions are inserted after each layer of a neural network. The following activation functions are supported:

sigmoid
Superclasses: (hidden-layer-activation output-layer-activation t)
Metaclass: standard-class
Default Initargs: nil

Sigmoid activation function: \(f(x) = \frac{1}{1 + \exp(-x)}\)

Its output is in the range \([0, 1]\), so it is best suited for describing the 'intensity' of some property.
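
The formula above can be sketched directly in Lisp. SIGMOID-VALUE is a hypothetical helper used only for illustration, not part of the library's exported API:

    ;; Minimal sketch of the sigmoid formula; not the library's own code.
    (defun sigmoid-value (x)
      "f(x) = 1 / (1 + exp(-x))"
      (/ 1 (+ 1 (exp (- x)))))

    ;; (sigmoid-value 0.0) => 0.5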

%tanh
Superclasses: (hidden-layer-activation output-layer-activation t)
Metaclass: standard-class
Default Initargs: nil

Hyperbolic tangent activation function. Its output is in the range \([-1, 1]\), so it is a rescaled sigmoid. Neural networks which use tanh in place of sigmoid are believed to be easier to train.
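
The "rescaled sigmoid" relationship can be made concrete: \(\tanh(x) = 2\sigma(2x) - 1\), where \(\sigma\) is the sigmoid above. A self-contained sketch (TANH-VIA-SIGMOID is an illustrative name, not library API):

    ;; Demonstrates that tanh is a shifted and scaled sigmoid.
    (defun tanh-via-sigmoid (x)
      (flet ((sigmoid (x) (/ 1 (+ 1 (exp (- x))))))
        (- (* 2 (sigmoid (* 2 x))) 1)))

    ;; (tanh-via-sigmoid 0.5) => ~0.462117, matching CL's built-in (tanh 0.5)
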
softmax
Superclasses: (output-layer-activation t)
Metaclass: standard-class
Default Initargs: nil

Softmax activation function: \(f(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\). Its output range is \([0, 1]\), and the elements of the output vector sum to 1.
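
A sketch of the formula above over a list of inputs. Subtracting the maximum before exponentiation is a standard numerical-stability trick; whether the library does this internally is not documented here. SOFTMAX-VALUES is an illustrative name, not library API:

    ;; Illustrative softmax over a list of reals.
    (defun softmax-values (xs)
      (let* ((m    (reduce #'max xs))
             (exps (mapcar (lambda (x) (exp (- x m))) xs)) ; stable exponentials
             (sum  (reduce #'+ exps)))
        (mapcar (lambda (e) (/ e sum)) exps)))

    ;; (softmax-values '(1.0 2.0 3.0))
    ;; => approximately (0.090 0.245 0.665), which sums to 1
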
leaky-relu
Superclasses: (hidden-layer-activation t)
Metaclass: standard-class
Default Initargs: nil

Leaky ReLU activation function: \(f(x) = x\) when \(x > 0\) and \(f(x) = \mathrm{coeff} \cdot x\) otherwise. Usually this is the activation function of choice for hidden layers (see the sketch after this entry).

• coeff
  Coefficient of leaky ReLU. A value of 0 means just an ordinary ReLU.
  Allocation: instance
  Type: single-float
  Initarg: :coeff
  Initform: 0.0
  Readers: (leaky-relu-coeff)
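
Creating a leaky-relu instance uses the :coeff initarg and leaky-relu-coeff reader documented above; the neural-classifier package prefix is an assumption here:

    ;; Assumes LEAKY-RELU and LEAKY-RELU-COEFF are exported from the
    ;; NEURAL-CLASSIFIER package, as the entry above suggests.
    (let ((act (make-instance 'neural-classifier:leaky-relu :coeff 0.1f0)))
      (neural-classifier:leaky-relu-coeff act)) ; => 0.1
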
%identity
Superclasses: (output-layer-activation t)
Metaclass: standard-class
Default Initargs: nil

Identity activation function (just returns its input).

Activation functions differ in how they can be associated with layers of a network. The division is as follows:

activation
Superclasses: (t)
Metaclass: standard-class
Default Initargs: nil

Generic class for activation functions. Not to be instantiated.

hidden-layer-activation
Superclasses: (activation t)
Metaclass: standard-class
Default Initargs: nil

Generic class for activation functions associated with hidden layers. Not to be instantiated.

output-layer-activation
Superclasses: (activation t)
Metaclass: standard-class
Default Initargs: nil

Generic class for activation functions associated with an output layer. Not to be instantiated.
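
This division can be checked with SUBTYPEP against the superclass lists documented above (the neural-classifier package prefix is an assumption):

    ;; sigmoid is listed under both generic classes, softmax only under
    ;; output-layer-activation, leaky-relu only under hidden-layer-activation.
    (subtypep 'neural-classifier:sigmoid
              'neural-classifier:output-layer-activation) ; => T
    (subtypep 'neural-classifier:softmax
              'neural-classifier:hidden-layer-activation) ; => NIL
    (subtypep 'neural-classifier:leaky-relu
              'neural-classifier:output-layer-activation) ; => NIL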