neural-classifier

General information

The typical neural network lifecycle is the following:

  1. Create a snakes generator which returns data for the training step in the form (data-object . label), i.e. a cons pair which contains a sample for training and a label which describes that sample (see the sketch after this list).
  2. Create four functions. The first two translate a sample returned by the generator created in the previous step to a magicl:matrix/single-float matrix with dimensions Nx1 which serves as the input to the neural network. One of these functions is used for training and the other for operation of a trained network; the one used for training may apply additional random transformations to the input data to augment the training set. The third function translates a label to a magicl:matrix/single-float matrix with dimensions Mx1 and the fourth translates a matrix with dimensions Mx1 to a label. The third function is used for training and the fourth is used for classifying objects with the trained net.
  3. Create a neural network with neural-classifier:make-neural-network. The first parameter, layout, must be a list (N ... M) which contains the number of neurons in each layer of the network, where N and M are taken from the previous step. Also pass the functions created in the previous step as arguments to make-neural-network.
  4. Train the neural network by calling neural-classifier:train-epoch. An epoch is finished when the supplied generator runs out of data (i.e. returns snakes:generator-stop). This function accepts an optimizer as an optional parameter; see the optimizers section for more information about optimizers.
  5. Repeat the previous step to train for more epochs. You can use the neural-classifier:rate function to monitor the accuracy of your network, so you know when to stop training.
  6. After the net is trained, call neural-classifier:calculate to pass your data through the net.
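
A minimal sketch of the generator from step 1, built from a plain list with snakes:list->generator (the file names and labels are hypothetical placeholders):

;; Yields (data-object . label) pairs one by one and returns
;; snakes:generator-stop when the list is exhausted.
(snakes:list->generator
 (list (cons #p"photos/1.png" :face)
       (cons #p"photos/2.png" :no-face)))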

Abstract example: find a human face in a photo.

(defun load-image (pathname)
  ;; Assume that our images are PNGs prescaled to 50x50 pixels
  (let ((image (opticl:coerce-image
                (opticl:read-png-file pathname)
                'opticl:gray-image))
        (matrix (magicl:empty '(#.(* 50 50) 1)
                              :type 'single-float)))
    ;; Convert simple array to magicl:matrix/single-float column
    (loop for i below (* 50 50) do
      (setf (magicl:tref matrix i 0)
            (/ (row-major-aref image i) 255.0)))
    matrix))

(defun classify (output)
  (declare (type magicl:matrix/single-float output))
  ;; When the first output value is less than the second one, classify
  ;; the picture as face
  (if (< (magicl:tref output 0 0)
         (magicl:tref output 1 0))
      :face :no-face))

(defun label-to-matrix (label)
  (declare (type (member :face :no-face) label))
  ;; Convert labels to matrices
  (ecase label
    (:face    (magicl:from-list '(0f0 1f0) '(2 1)))
    (:no-face (magicl:from-list '(1f0 0f0) '(2 1)))))

(defun make-network ()
  (neural-classifier:make-neural-network
   ;; One input layer with 50x50 neurons
   ;; One hidden layer with 100 neurons
   ;; One output layer with 2 neurons.
   '(#.(* 50 50) 100 2)
   ;; This is used for image loading in trained net
   :input-trans #'load-image
   ;; This is used for image loading while training
   :input-trans% (alexandria:compose #'add-noise #'random-rotate #'load-image)
   ;; This is used to produce a label
   :output-trans #'classify
   ;; Produce a matrix from a label
   :label-trans #'label-to-matrix
   ;; Activation functions: relu in the hidden layer, softmax in the
   ;; output layer.
   :activation-funcs (list (make-instance 'neural-classifier:leaky-relu :coeff 0.0)
                           (make-instance 'neural-classifier:softmax))))

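The add-noise and random-rotate helpers referenced in make-network above are not part of neural-classifier; they stand for user-supplied data augmentation. A minimal sketch of add-noise could look like this (random-rotate is omitted for brevity):

(defun add-noise (matrix)
  (declare (type magicl:matrix/single-float matrix))
  ;; Perturb every pixel of the 2500x1 column by a small random
  ;; amount and clamp the result back into [0, 1]
  (loop for i below (* 50 50) do
    (setf (magicl:tref matrix i 0)
          (max 0.0 (min 1.0 (+ (magicl:tref matrix i 0)
                               (- (random 0.1) 0.05))))))
  matrix)
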
(defun make-train-data ()
  "Return a fresh generator over the training data"
  ;; A snakes generator is exhausted after a single pass, so create a
  ;; new one for every call to TRAIN-EPOCH or RATE.
  (snakes:list->generator
   (mapcar
    (lambda (pathname)
      (cons pathname
            (if (face-p pathname)
                :face :no-face)))
    *list-of-pictures*)))

(defun train-epochs (network n)
  "Train a network for n epochs"
  (loop repeat n
        ;; Train one epoch
        do (neural-classifier:train-epoch network (make-train-data))
        ;; Collect accuracy of recognition.
        ;; You must use another set for validation data in real use.
        collect (neural-classifier:rate network (make-train-data))))

(defun classify-image (network pathname)
  ;; When the network is trained, just pass your image to CALCULATE
  ;; to classify it.
  (neural-classifier:calculate network pathname))
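
Putting it all together, a training session might look like this (the number of epochs and the test image path are arbitrary):

(let ((network (make-network)))
  ;; Train for 10 epochs and print the accuracy after each one
  (format t "Accuracy per epoch: ~a~%" (train-epochs network 10))
  ;; Classify a new picture with the trained net
  (classify-image network #p"photos/unknown.png"))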

If you want another example, look at mnist/mnist.lisp to see how digit recognition works.