Title of Program: JETNET version 3.0
Program obtainable from: email@example.com or via anonymous ftp from thep.lu.se in directory pub/Jetnet/ or from freehep.scri.fsu.edu in directory freehep/analysis/jetnet.
Computer for which the program is designed: DEC Alpha, DECstation,
SUN, Apollo, VAX, IBM, Hewlett-Packard, and others with an F77 compiler
Computer: DEC Alpha 3000; installation: Department of
Theoretical Physics, University of Lund, Lund, Sweden
Operating system: DEC OSF 1.3
Program language used: FORTRAN 77
High speed storage required: k words
No. of bits in a word: 32
Peripherals used: terminal for input, terminal or printer
No. of lines in combined program and test deck: 5753
CPC Library subroutines used: none
Keywords: pattern recognition, jet identification,
data analysis, artificial neural network
Nature of physical problem
Challenging pattern recognition and non-linear modeling problems within high energy physics, ranging from off-line and on-line parton (or other constituent) identification tasks to accelerator beam control. Standard methods for such problems are typically confined to linear dependencies, like Fisher discriminants, principal components analysis and ARMA models.
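For reference, the linear methods mentioned above reduce to a single linear projection of the input variables. A minimal NumPy sketch of a Fisher discriminant follows; it is purely illustrative (the data, names and parameter values are invented for the example and are not part of the JETNET package):

```python
import numpy as np

def fisher_discriminant(x_sig, x_bkg):
    """Fisher's linear discriminant: the direction w that maximizes the
    between-class separation relative to the within-class scatter."""
    mu_s, mu_b = x_sig.mean(axis=0), x_bkg.mean(axis=0)
    # Within-class scatter: sum of the two class covariance matrices
    s_w = np.cov(x_sig, rowvar=False) + np.cov(x_bkg, rowvar=False)
    w = np.linalg.solve(s_w, mu_s - mu_b)
    return w / np.linalg.norm(w)

# Two toy Gaussian classes standing in for "signal" and "background"
rng = np.random.default_rng(0)
sig = rng.normal(loc=[1.0, 0.5], scale=0.5, size=(500, 2))
bkg = rng.normal(loc=[-1.0, -0.5], scale=0.5, size=(500, 2))
w = fisher_discriminant(sig, bkg)
print("separation along w:", (sig @ w).mean() - (bkg @ w).mean())
```

A cut on the scalar projection x @ w is then the optimal linear classifier for Gaussian classes with equal covariances; the point of an ANN is to go beyond exactly this linearity.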
Method of solution
Artificial Neural Networks (ANN) constitute powerful nonlinear extensions of the conventional methods. In particular, feed-forward multilayer perceptron (MLP) networks are widely used due to their simplicity and excellent performance. The F77 package JETNET 2.0 implemented ``vanilla'' versions of such networks using the back-propagation updating rule, and included a self-organizing map algorithm as well. The present version, JETNET 3.0, is backwards compatible with older versions and contains a number of powerful and elaborate options for updating and analyzing MLP networks. A set of rules of thumb on when, why and how to use the various options is presented in this manual, and the relation between the underlying algorithms and standard statistical methods is pointed out. The self-organizing part is unchanged and is hence not described here. The JETNET 3.0 package consists of a number of subroutines, most of which handle training and test data, which must be loaded together with a main application-specific program supplied by the user. Even though the package was originally intended mainly for jet triggering applications [2,3,4], where it has been used with success for heavy quark tagging and quark-gluon separation, it is of a general nature and can be used for any pattern recognition problem.
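To make the back-propagation updating rule concrete, here is a minimal generic sketch in Python/NumPy of a one-hidden-layer perceptron trained by plain gradient descent on the XOR problem, the textbook pattern that no linear discriminant can represent. All names, sizes and parameter values are chosen for the example only and do not correspond to JETNET routines or defaults:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR truth table: inputs X, targets t
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden weights, thresholds
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output weights, thresholds
eta = 0.5                                        # learning rate

y0 = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)     # untrained network output

for epoch in range(10000):
    h = sigmoid(X @ W1 + b1)                     # forward pass
    y = sigmoid(h @ W2 + b2)
    d_y = (y - t) * y * (1.0 - y)                # back-propagate the squared error
    d_h = (d_y @ W2.T) * h * (1.0 - h)
    W2 -= eta * h.T @ d_y; b2 -= eta * d_y.sum(axis=0)
    W1 -= eta * X.T @ d_h; b1 -= eta * d_h.sum(axis=0)

print("summed squared error after training:", float(((y - t) ** 2).sum()))
```

The options added in JETNET 3.0 (momentum, second-order methods, individual learning rates, etc.) all modify how the gradient computed in the backward pass above is turned into a weight update.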
Restrictions on the complexity of the problem
The only restriction on the complexity of an application is set by available memory and CPU time. For a problem that is encoded with $N_i$ input nodes, $N_o$ output (feature) nodes, and $H$ layers of hidden nodes with $N_h$ nodes in each layer, the program requires the storage of $N_w$ real numbers for the weights and thresholds, given by
$$N_w = (N_i + N_o)N_h + (H-1)N_h^2 + HN_h + N_o .$$
Also, the neurons require the storage of $N_n$ real numbers according to
$$N_n = N_i + HN_h + N_o .$$
In addition one of course needs to at least temporarily store the $N_p$ patterns; $N_p(N_i + N_o)$ real numbers.
If second order methods are employed, which keep track of past gradients, the storage requirement increases with $N_w$. If individual learning rates are used, it increases with an additional $N_w$.
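The storage bookkeeping above can be summarized in a small helper. This sketch assumes the standard count of weights and thresholds for a fully connected MLP with the layer structure just described; the function and its argument names are illustrative only, not part of the F77 package:

```python
def mlp_storage(n_in, n_out, n_hid, n_layers, n_patterns,
                second_order=False, individual_rates=False):
    """Count the real numbers an MLP application must store:
    weights + thresholds, neuron activations, and the pattern buffer."""
    # Weights: input->hidden, (n_layers-1) hidden->hidden, hidden->output,
    # plus one threshold per hidden and output neuron
    n_w = (n_in + n_out) * n_hid + (n_layers - 1) * n_hid ** 2
    n_w += n_layers * n_hid + n_out
    # One activation per neuron
    n_n = n_in + n_layers * n_hid + n_out
    # Inputs and targets for every stored pattern
    n_p = n_patterns * (n_in + n_out)
    total = n_w + n_n + n_p
    if second_order:        # past gradients: one extra real per weight
        total += n_w
    if individual_rates:    # one learning rate per weight
        total += n_w
    return total

# e.g. 16 inputs, 1 output, one hidden layer of 10, 5000 stored patterns
print(mlp_storage(16, 1, 10, 1, 5000))  # -> 85208
```

The pattern buffer dominates for realistic sample sizes, so memory usage is governed far more by the training set than by the network itself.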
Typical running time
Running the test-deck problem, which has and , for 100 epochs with 5000 training pattern presentations per epoch takes between 30 and 60 CPU-seconds on a DEC Alpha workstation 3000/400, depending on which method is used. A real-world problem with and , using 3770 patterns and training for 1000 epochs, takes 565 CPU-seconds on the same machine.