API documentation
Neural network class and accessors
neural-network
Superclasses: (t)
Metaclass: standard-class
Default Initargs: nil
Class for neural networks
layout
Number of neurons in each layer of the network.
Allocation: instance
Type: list
Initarg: :layout
Initform: (error "Specify number of neurons in each layer")
Readers: (neural-network-layout)
activation-funcs
List of activation functions.
Allocation: instance
Type: list
Initarg: :activation-funcs
Initform: nil
Accessors: (neural-network-activation-funcs)
weights
Weight matrices for each layer.
Allocation: instance
Type: list
Accessors: (neural-network-weights)
biases
Bias vectors for each layer.
Allocation: instance
Type: list
Accessors: (neural-network-biases)
input-trans
Function which translates an input object to a vector.
Allocation: instance
Type: function
Initarg: :input-trans
Initform: (function identity)
Accessors: (neural-network-input-trans)
output-trans
Function which translates an output vector to a label.
Allocation: instance
Type: function
Initarg: :output-trans
Initform: (function identity)
Accessors: (neural-network-output-trans)
input-trans%
Function which translates an input object to a vector (used for training).
Allocation: instance
Type: function
Initarg: :input-trans%
Initform: (function identity)
Accessors: (neural-network-input-trans%)
label-trans
Function which translates a label to a vector.
Allocation: instance
Type: function
Initarg: :label-trans
Initform: (function identity)
Accessors: (neural-network-label-trans)
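To make the accessors above concrete, here is a minimal sketch, assuming the library's package is already in use and that the network is built with make-neural-network (documented below):

    ;; A minimal sketch of the accessors.  The exact contents of the
    ;; weight and bias matrices depend on how the library initializes them.
    (let ((net (make-neural-network '(784 30 10))))
      (neural-network-layout net)            ; => (784 30 10)
      ;; One weight matrix and one bias vector per non-input layer,
      ;; so presumably two of each for this layout.
      (length (neural-network-weights net))
      (length (neural-network-biases net))
      ;; All transformation functions default to IDENTITY.
      (neural-network-input-trans net))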
Functions
make-neural-network
(layout &key input-trans output-trans input-trans% label-trans activation-funcs)
Create a new neural network.
layout is a list of positive integers which describes the number of neurons in each layer (starting from the input layer).

activation-funcs is a list, all elements of which are objects of type activation. The length of this list must be equal to the length of layout minus one, because the input layer does not have an activation function. The last element must be of type output-layer-activation and all elements but the last must be of type hidden-layer-activation.

input-trans is a function which is applied to an object passed to calculate to transform it into an input column (that is, a matrix with the type magicl:matrix/single-float and the shape Nx1, where N is the first number in the layout). For example, if we are recognizing digits from the MNIST set, this function can take the number of an image in the set and return a 784x1 matrix.

output-trans is a function which is applied to the output of the calculate function (a matrix with the type magicl:matrix/single-float and the shape Mx1, where M is the last number in the layout) to return some object with a user-defined meaning (called a label). Again, if we are recognizing digits, this function transforms a 10x1 matrix into a number from 0 to 9.

input-trans% is just like input-trans, but is used while training. It can include additional transformations to extend your training set (e.g. it can add some noise to the input data, rotate an input picture by a small random angle, etc.).

label-trans is a function which is applied to a label to get a column (a matrix with the type magicl:matrix/single-float and the shape Mx1, where M is the last number in the layout) which is the optimal output from the network for that object. With digit recognition, this function may take a digit n and return a 10x1 matrix of all zeros, with the exception of the n-th element, which would be 1f0.

All transformation functions default to identity.
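For illustration, here is a hedged sketch of a digit-recognition setup in the spirit of the MNIST example above. The helper get-image-column is a hypothetical stand-in for your own data-access code, activation-funcs is omitted so the library's defaults apply, and idx-abs-max (documented below) is assumed to return a zero-based index:

    ;; Hypothetical helper: a real implementation would fetch image INDEX
    ;; from the dataset and return it as a 784x1 single-float column.
    (defun get-image-column (index)
      (declare (ignore index))
      (magicl:rand '(784 1) :type 'single-float))

    ;; Optimal network output for digit N: a 10x1 column of zeros with a
    ;; single 1f0 at row N.
    (defun digit-to-column (n)
      (let ((column (magicl:zeros '(10 1) :type 'single-float)))
        (setf (magicl:tref column n 0) 1f0)
        column))

    (defvar *network*
      (make-neural-network
       '(784 30 10)
       :input-trans  #'get-image-column  ; image index -> 784x1 column
       :input-trans% #'get-image-column  ; could also add noise here
       :label-trans  #'digit-to-column   ; digit 0-9 -> 10x1 column
       ;; Assuming IDX-ABS-MAX is zero-based, this maps the 10x1 output
       ;; column back to a digit from 0 to 9.
       :output-trans #'idx-abs-max))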
calculate
(neural-network object)
Calculate the output of neural-network for the object object. The input transformation function (specified by :input-trans when creating the network) is applied to the object, and the output transformation function (specified by :output-trans) is applied to the Mx1 output matrix of the network (where M is the last number in the layout).
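Continuing the digit-recognition sketch above (names like *network* come from that hypothetical setup), a call to calculate goes straight from a user-level object to a user-level label:

    ;; INPUT-TRANS turns 42 into a 784x1 column, the network is evaluated,
    ;; and OUTPUT-TRANS turns the resulting column into a digit.
    (calculate *network* 42)   ; => a digit from 0 to 9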
train-epoch
(neural-network generator &key (optimizer (make-instance (quote sgd-optimizer))))
Perform training of neural-network on every object returned by the generator generator. Each item returned by the generator must be a cons pair in the form (data-object . label). The input-trans% and label-trans functions passed to make-neural-network are applied to the car and cdr of each pair respectively.
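A hedged sketch of a training call, reusing the hypothetical *network* from above. How the generator itself is constructed depends on the generator framework the library expects, which is not covered in this section; training-generator stands in for such an object:

    ;; TRAINING-GENERATOR must yield conses of the form (data-object . label),
    ;; here (image-index . digit).
    (train-epoch *network* training-generator)

    ;; An alternative optimizer instance can be supplied explicitly;
    ;; SGD-OPTIMIZER is the documented default.
    (train-epoch *network* training-generator
                 :optimizer (make-instance 'sgd-optimizer))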
rate
(neural-network generator &key (test (function eql)))
Calculate the accuracy of neural-network (the ratio of correctly guessed samples to all samples) using testing data from the generator generator. Each item returned by the generator must be a cons pair in the form (data-object . label), as with the train-epoch function. test is a function used to compare the expected label with the label returned by the network.
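Evaluation follows the same pattern, with test-generator standing in for a generator over held-out (data-object . label) pairs:

    ;; Returns the fraction of samples for which the expected label and
    ;; the label computed by the network are the same under TEST.
    (rate *network* test-generator)              ; => a ratio between 0 and 1
    (rate *network* test-generator :test #'eql)  ; equivalent: EQL is the default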
idx-abs-max
(matrix)
Return the index of the first element with the maximal absolute value by calling the isamax() function from BLAS. Works only for rows or columns.
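A small sketch of idx-abs-max on a magicl column, as it might be used in an output-trans function (assuming the library's package is in use):

    ;; The element with the largest absolute value is 0.9f0, in the third
    ;; row of this 4x1 column.
    (idx-abs-max
     (magicl:from-list '(0.1f0 -0.2f0 0.9f0 0.3f0) '(4 1)
                       :type 'single-float))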