Neural networks, trained by backpropagation, are designed and described in the language J, an APL derivative with powerful function encapsulation features. Both J [4,6,7] and APL [5] help to identify and isolate the parallelism that is inherent in network training algorithms. Non-critical details of data input and derived output processes are de-emphasized by relegating those functions to callable stand-alone modules. Such input and output modules can be isolated and customized individually for managing communication with arbitrary external storage systems. The central objective of this research is the design and precise description of a neural network training kernel. Such kernel designs are valuable for producing efficient, reusable computer codes and for facilitating the transfer of neural network technology from developers to users.