The artificial neural network is a parallel computer architecture invented three decades ago as an alternative to the serial von Neumann machine. Neural networks lend themselves neither to implementation in traditional semiconductor hardware (resistors are required) nor to fast simulation by standard serial methods (multiplication operations dominate the workload). Digital signal processors, however, execute multiplications in hardware and can therefore accelerate neural network simulations, and such simulations can be efficiently mapped onto an array of digital signal processors for further acceleration. The acceleration achievable with one particular digital signal processor is presented, and a possible system design for a neural network workstation is outlined. The approach is consistent with standard semiconductor technology and can be expected to follow traditional semiconductor cost/functionality learning curves.