Consideration is given to the piecewise linear neural network (PLNN), a neural network form of a piecewise linear classifier. The weight vectors of the cells in the middle layer of the network represent prototype patterns of the various classes to be recognized. The PLNN classifies input patterns by minimum Euclidean distance to the prototype cells. The network structure is defined during training, which is controlled by two parameters, ε and δ. If no cell of the correct class lies within distance ε of a training pattern, a new prototype cell is created. All prototype cells within distance δ of the training pattern have their weights adjusted, either towards or away from the training pattern, depending on their class. Cells of the same class that are moved close together are merged. The dynamics of PLNN training are illustrated by examples of the prototype cells adapting to statistically generated data, and the results of training backpropagation networks of similar size are described for comparison. The PLNN is also trained for the recognition of handwritten numerals. The processing of the image data, the training of the network, and recognition results for PLNNs of various sizes are described, along with results of training backpropagation networks of comparable size on the same data. The PLNN performance is generally superior, with far less training time required.
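The training procedure described above (create a prototype when no correct-class cell is within ε, pull same-class and push different-class cells within δ, merge nearby same-class cells) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the learning rate `lr`, the linear update rule, and the merge threshold `merge_dist` (taken here as ε/2) are assumptions, since the abstract does not specify them.

```python
import numpy as np

def train_plnn(patterns, labels, eps, delta, lr=0.1, merge_dist=None):
    # Prototype cells are kept as parallel lists of weight vectors and
    # class labels. merge_dist (eps / 2 here) is an assumed threshold;
    # the abstract only says cells "moved close together" are merged.
    if merge_dist is None:
        merge_dist = eps / 2.0
    protos, proto_labels = [], []
    for x, y in zip(patterns, labels):
        dists = [float(np.linalg.norm(x - w)) for w in protos]
        # Create a new prototype if no correct-class cell lies within eps.
        if not any(d <= eps and c == y for d, c in zip(dists, proto_labels)):
            protos.append(np.array(x, dtype=float))
            proto_labels.append(y)
            dists.append(0.0)
        # Adjust every cell within delta: toward the pattern if the class
        # matches, away from it otherwise (assumed linear update rule).
        for i, (d, c) in enumerate(zip(dists, proto_labels)):
            if d <= delta:
                sign = 1.0 if c == y else -1.0
                protos[i] = protos[i] + sign * lr * (x - protos[i])
        # Merge same-class cells that have drifted within merge_dist.
        i = 0
        while i < len(protos):
            j = i + 1
            while j < len(protos):
                if (proto_labels[i] == proto_labels[j] and
                        np.linalg.norm(protos[i] - protos[j]) <= merge_dist):
                    protos[i] = (protos[i] + protos[j]) / 2.0
                    del protos[j], proto_labels[j]
                else:
                    j += 1
            i += 1
    return protos, proto_labels

def classify(x, protos, proto_labels):
    # Minimum Euclidean distance to the prototype cells.
    i = min(range(len(protos)), key=lambda j: np.linalg.norm(x - protos[j]))
    return proto_labels[i]
```

Note that, unlike backpropagation, the number of cells is not fixed in advance: the network grows (new prototypes) and shrinks (merges) as training data are presented, which is the source of the self-defining structure mentioned above.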