`JETNET 3.0` implements the weight decay and the pruning method of
eqs. () and (). The corresponding Lagrange multipliers are the
parameters `PARJN(5)` and `PARJN(14)` respectively.
Hessian-based pruning can be done by computing the Hessian matrix
together with its eigenvalues and eigenvectors: small weights that lie
inside a flat subspace (directions of near-zero curvature) can be omitted.
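This eigenvalue criterion can be illustrated with a small numerical sketch. This is not the JETNET implementation; the finite-difference Hessian, the toy error function, and the eigenvalue threshold are all choices made here for illustration:

```python
import numpy as np

def hessian(f, w, eps=1e-4):
    """Finite-difference Hessian of a scalar function f at point w."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            w_pp = w.copy(); w_pp[i] += eps; w_pp[j] += eps
            w_pm = w.copy(); w_pm[i] += eps; w_pm[j] -= eps
            w_mp = w.copy(); w_mp[i] -= eps; w_mp[j] += eps
            w_mm = w.copy(); w_mm[i] -= eps; w_mm[j] -= eps
            H[i, j] = (f(w_pp) - f(w_pm) - f(w_mp) + f(w_mm)) / (4 * eps**2)
    return H

# Toy error: flat along the direction w0 - w1, curved along w0 + w1.
def error(w):
    return (w[0] + w[1] - 1.0) ** 2

w = np.array([0.5, 0.5])
H = hessian(error, w)
eigvals, eigvecs = np.linalg.eigh(H)   # ascending eigenvalues

# Directions with (near-)zero eigenvalue span the flat subspace;
# weights lying almost entirely inside it can be removed.
flat = eigvecs[:, np.abs(eigvals) < 1e-3]
print(flat.shape[1])  # → 1 (one flat direction for this toy error)
```

For this quadratic toy error the Hessian is exactly `[[2, 2], [2, 2]]`, with one zero and one nonzero eigenvalue, so the flat subspace is one-dimensional.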

Following [63], we tune λ in eq. () according to:

- If E_t < E_{t-1} or E_t < **D**, increment λ by Δλ.
- If E_t ≥ E_{t-1} and E_t < A_t and E_t ≥ **D**, decrement λ by Δλ.
- If E_t ≥ E_{t-1} and E_t ≥ A_t and E_t ≥ **D**, rescale λ → **c**λ, where
**c** < 1.

Here E_t is the training error at epoch t, and the two remaining quantities are:

- A_t, the weighted average error: A_t = γ A_{t-1} + (1 − γ) E_t.
- **D**, the desired error, which acts as a threshold for the procedure.
Solutions with error above **D** are not pruned unless the training error is decreasing. In ref. [63] it is advised, for hard problems, to set **D** to random performance, which in practice means that pruning is always on.
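The schedule above can be sketched in a few lines. This is an illustration rather than the JETNET code; the names (`lam`, `dlam`, `gamma`, `c`, `D`) and the default values are choices made here:

```python
def tune_lambda(lam, E_t, E_prev, A_prev, dlam=1e-6, gamma=0.9, c=0.9, D=0.0):
    """One epoch of the pruning-strength schedule of ref. [63] (sketch).

    lam    -- current Lagrange multiplier (pruning strength)
    E_t    -- training error at this epoch
    E_prev -- training error at the previous epoch
    A_prev -- previous weighted average error
    Returns the updated (lam, A_t).
    """
    A_t = gamma * A_prev + (1.0 - gamma) * E_t   # weighted average error
    if E_t < E_prev or E_t < D:
        lam += dlam      # error improving (or below target): prune harder
    elif E_t < A_t:
        lam -= dlam      # error worse than last epoch but below its average
    else:
        lam *= c         # error above its running average: cut lam back
    return lam, A_t

# The error fell from 0.6 to 0.5, so lam grows by dlam
# and the running average is updated.
lam, A = tune_lambda(1e-3, E_t=0.5, E_prev=0.6, A_prev=0.55)
print(lam, A)
```

Note that once E_t drops below **D**, only the first branch can fire, so λ grows monotonically, which is why setting **D** to random performance keeps pruning permanently active.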

In `JETNET 3.0` these pruning parameters correspond to: `PARJN(14)` for
λ, `PARJN(15)` for Δλ, `PARJN(16)` for γ,
`PARJN(17)` for **c**, `PARJN(18)` for the scale parameter w₀, and
`PARJN(19)` for the desired error **D**. Of these, `PARJN(15)`,
`PARJN(18)` and `PARJN(19)` are crucial.

Fri Feb 24 11:28:59 MET 1995