public class EncoderTrainingFactory
This benchmark implements a Fahlman encoder. Though probably not invented by Scott
Fahlman, such encoders were used in many of his papers, most notably
"An Empirical Study of Learning Speed in Back-Propagation Networks".
It provides a very simple way to evaluate classification neural networks.
The network has the same number of input and output neurons, but a smaller
number of hidden neurons. This forces the neural network to learn to
encode each input pattern into a smaller hidden vector, which is then
expanded back to the outputs.
The training set contains exactly as many elements as there are
input/output neurons. In the normal mode, each training element has a
single column set to 1 and all other columns set to 0. The factory can
also operate in "complement mode", where the opposite is true: every
column is set to 1 except for a single column that is 0. Data produced
in complement mode is more difficult to train.
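As a rough illustration (a hypothetical sketch, not this factory's actual API), training data of both kinds could be generated like this, where each row serves as both the network input and its expected output:

```java
import java.util.Arrays;

// Hypothetical sketch: generate Fahlman-encoder training data.
// Each row of the returned matrix is used as both the input and the
// ideal output of the encoder network.
public final class EncoderDataSketch {

    /**
     * Builds an n-by-n training matrix. In normal mode, row i is a
     * one-hot vector (1 at column i, 0 elsewhere); in complement mode
     * the values are flipped (0 at column i, 1 elsewhere).
     */
    public static double[][] generate(int n, boolean complement) {
        double[][] data = new double[n][n];
        for (int row = 0; row < n; row++) {
            for (int col = 0; col < n; col++) {
                boolean hot = (row == col);
                // XOR with the complement flag flips every value in
                // complement mode.
                data[row][col] = (hot != complement) ? 1.0 : 0.0;
            }
        }
        return data;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.deepToString(generate(4, false)));
        System.out.println(Arrays.deepToString(generate(4, true)));
    }
}
```

Note that both modes yield exactly n training elements for an n-input/n-output network, matching the description above.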
Fahlman used this simple training data to benchmark neural networks when
he introduced the Quickprop algorithm in the above paper.