
Computing the Hessian

A novelty in JETNET 3.0 is the possibility to compute the Hessian matrix of an MLP network by invoking the subroutine JNHESS. The computation proceeds in much the same way as the training: training patterns are placed one at a time in the vectors OIN and OUT, and JNHESS is called for each pattern. When one full epoch, whose size is controlled by MSTJN(2) and MSTJN(9), has been presented, JNHESS normalizes and symmetrizes the Hessian and places it in the internal common block /JNINT5/.
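Since the calling sequence mirrors the ordinary training loop, a minimal sketch of the epoch loop may be useful. The /JNDAT1/ layout shown, the 2-10-1 network geometry, and the pattern-fetching routine GETPAT are assumptions made for illustration only; in practice one would normally train the network with JNTRAL before computing the Hessian.

C...Sketch of the Hessian epoch loop. The /JNDAT1/ layout, the 2-10-1
C...geometry and the routine GETPAT are illustrative assumptions.
      PROGRAM HESDEM
      COMMON /JNDAT1/ MSTJN(40),PARJN(40),MSTJM(20),PARJM(20),
     &                OIN(1000),OUT(1000),MXNDJM
      NPAT=500
C...network geometry: 3 layers, 2 inputs, 10 hidden nodes, 1 output
      MSTJN(1)=3
      MSTJN(10)=2
      MSTJN(11)=10
      MSTJN(12)=1
C...one epoch = MSTJN(2)*MSTJN(9) patterns
      MSTJN(2)=NPAT
      MSTJN(9)=1
      CALL JNINIT
C...(training with JNTRAL would normally go here)
C...present one full epoch of patterns to JNHESS
      DO 100 IP=1,NPAT
C...hypothetical user routine filling OIN with the input pattern
C...and OUT with the corresponding target values
         CALL GETPAT(IP,OIN,OUT)
         CALL JNHESS
  100 CONTINUE
C...the normalized, symmetrized Hessian now sits in /JNINT5/
      END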

Once the Hessian has been symmetrized, its eigenvalues and eigenvectors can be computed by invoking JNHEIG (single precision). The eigenvalues are then placed in the vector OUT and the eigenvectors replace the columns of the Hessian matrix. The Hessian can be printed out by invoking the subroutine JNSTAT. Note that the terms of the Hessian matrix are ordered in JETNET 3.0 according to the indexing of the weights and thresholds illustrated in the figure and equation referred to earlier, with the obvious extension to networks with more layers.
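Continuing the sketch above, and assuming JNHEIG is invoked without arguments as the text indicates, the eigenvalues can be read back from OUT; the dimension NDIM is derived from the assumed 2-10-1 geometry purely for illustration.

      SUBROUTINE HESEIG
C...Sketch only: assumes the Hessian has already been accumulated and
C...symmetrized by JNHESS as in the previous example.
      COMMON /JNDAT1/ MSTJN(40),PARJN(40),MSTJM(20),PARJM(20),
     &                OIN(1000),OUT(1000),MXNDJM
C...dimension of the Hessian = total number of weights and thresholds
      NDIM=MSTJN(10)*MSTJN(11)+MSTJN(11)*MSTJN(12)
     &    +MSTJN(11)+MSTJN(12)
C...diagonalize the (single precision) Hessian
      CALL JNHEIG
C...eigenvalues are now in OUT; the eigenvectors have replaced the
C...columns of the Hessian matrix in /JNINT5/
      DO 200 I=1,NDIM
         WRITE(6,*) 'EIGENVALUE ',I,' = ',OUT(I)
  200 CONTINUE
      RETURN
      END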

JNHESS assumes the summed square error defined in the equation referred to earlier.
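For orientation only, the summed square error has the familiar form (the notation and normalization here are illustrative; the binding definition is the equation referred to above):

E = \frac{1}{2}\sum_{p}\sum_{i}\left(o_i^{(p)} - t_i^{(p)}\right)^2 ,

where o_i^{(p)} denotes the network output and t_i^{(p)} the target value for output node i and pattern p.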


