
Complex Systems

Replicability of Neural Computing Experiments

D. Partridge
W. B. Yates
Department of Computer Science,
University of Exeter,
Exeter EX4 4PT, UK

Abstract

The nature of iterative learning on a randomized initial architecture, such as backpropagation training of a multilayer perceptron, is such that precise replication of a reported result is virtually impossible. The outcome is that experimental replication of reported results, a touchstone of "the scientific method," is not an option for researchers in this most popular subfield of neural computing. This paper addresses the issue of replicability of experiments based on backpropagation training of multilayer perceptrons (although many of the results are applicable to any other subfield plagued by the same characteristics) and demonstrates its complexity. First, an attempt is made to produce a complete abstract specification of such a neural computing experiment. From this specification, the full range of parameters needed to support maximum replicability is identified, and this is used to show why absolute replicability is not an option in practice. A statistical framework is proposed to support replicability measurement. This framework is demonstrated with some empirical studies on both replicability with respect to experimental controls, and validity of implementations of the backpropagation algorithm. Finally, the results are used to illustrate the difficulties associated with the issue of experimental replication and the claimed precision of results.
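To make the abstract's central point concrete, the following minimal sketch (not taken from the paper; the XOR task, network size, learning rate, and epoch count are illustrative assumptions) trains the same multilayer perceptron by plain backpropagation twice, differing only in the random initial weights, and shows that the resulting networks and errors differ:

    # Minimal sketch: the same backpropagation experiment repeated with a
    # different random weight initialization generally yields a different
    # trained network, which is why exact replication requires the full
    # experimental specification (including the random seed).
    import numpy as np

    def train_xor_mlp(seed, hidden=3, lr=0.5, epochs=2000):
        """Train a 2-hidden-1 sigmoid MLP on XOR by backpropagation."""
        rng = np.random.default_rng(seed)
        X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
        y = np.array([[0.], [1.], [1.], [0.]])
        W1 = rng.uniform(-0.5, 0.5, (2, hidden))   # random initial weights
        W2 = rng.uniform(-0.5, 0.5, (hidden, 1))
        sig = lambda z: 1.0 / (1.0 + np.exp(-z))
        for _ in range(epochs):
            h = sig(X @ W1)            # hidden-layer activations
            out = sig(h @ W2)          # network outputs
            err = out - y
            # Backpropagate the squared error (biases omitted for brevity).
            d_out = err * out * (1 - out)
            d_hid = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ d_out
            W1 -= lr * X.T @ d_hid
        return out.ravel(), float(np.mean(err ** 2))

    # Two runs that differ only in the initial random weights.
    for seed in (0, 1):
        preds, mse = train_xor_mlp(seed)
        print(f"seed={seed}  predictions={np.round(preds, 3)}  mse={mse:.5f}")

The two runs typically converge to different weight configurations and different residual errors, illustrating why a reported figure cannot be reproduced exactly unless every parameter of the experiment, down to the initialization, is specified.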