

JETNET 3.0 is initialized by calling the subroutine JNINIT. The package allows switching between different learning algorithms at will during execution, but each learning algorithm uses specific parameters that need to be initialized. The default values of these parameters give good results in most cases.

The ANN architecture (number of hidden layers, number of nodes per layer, etc.) is specified through the switches MSTJN(1) and MSTJN(10-19). The width of the distribution from which the initial weights are drawn is set by the parameter PARJN(4). Naturally, these switches and parameters must be set prior to calling JNINIT.
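As a sketch, setting the switches and then initializing might look as follows. The network sizes here are invented for illustration, and the layout of the /JNDAT1/ common block holding MSTJN and PARJN is an assumption; consult the package's header documentation for the authoritative declaration.

```fortran
C     Hypothetical example: a 3-layer network (input, one hidden
C     layer, output). The /JNDAT1/ layout below is assumed, not
C     taken from this section.
      PROGRAM JNDEMO
      COMMON /JNDAT1/ MSTJN(40),PARJN(40)

C     Total number of layers, counting input and output
      MSTJN(1)  = 3
C     Nodes per layer: MSTJN(10) = input layer, MSTJN(11) = first
C     hidden layer, MSTJN(12) = output layer
      MSTJN(10) = 8
      MSTJN(11) = 5
      MSTJN(12) = 1

C     Width of the initial weight distribution
      PARJN(4)  = 0.5

C     All architecture switches are set -- initialize the network
      CALL JNINIT
      END
```

All assignments precede the CALL JNINIT, as the text requires; changing MSTJN(1) or MSTJN(10-19) after initialization would have no effect on the already-built network.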
