Published: Oct. 12, 2021

Samy Wu Fung, Department of Applied Mathematics and Statistics, Colorado School of Mines

Efficient Training of Infinite-depth Neural Networks via Jacobian-free Backpropagation

A promising trend in deep learning replaces fixed-depth models with approximations of the limit as network depth approaches infinity. This approach uses a portion of the network weights to prescribe behavior by defining a limit condition. This makes network depth implicit, varying based on the provided data and an error tolerance. Moreover, existing implicit models can be implemented and trained with fixed memory costs in exchange for additional computational costs. In particular, backpropagation through implicit depth models requires solving a Jacobian-based equation arising from the implicit function theorem. We propose a new Jacobian-free backpropagation (JFB) scheme that circumvents the need to solve Jacobian-based equations while maintaining fixed memory costs. This makes implicit depth models much cheaper to train and easier to implement. Numerical experiments show JFB is more computationally efficient than Jacobian-based backpropagation while maintaining competitive accuracy on classification tasks.
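To make the idea concrete, below is a minimal PyTorch sketch of the Jacobian-free backpropagation idea, assuming an implicit layer whose output is a fixed point z* = T(z*, x). The layer T, its architecture, and the tolerance and iteration count are illustrative assumptions, not the authors' implementation; the sketch only shows how the forward fixed-point solve runs without tracking gradients, while the backward pass attaches gradients to a single application of T, which is the step that avoids solving the Jacobian-based equation from the implicit function theorem.

```python
# Minimal sketch of Jacobian-free backpropagation (JFB) for an implicit layer.
# The operator T(z, x) and all hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class ImplicitBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Hypothetical update T(z, x) = tanh(W z + U x).
        self.W = nn.Linear(dim, dim, bias=False)
        self.U = nn.Linear(dim, dim)

    def T(self, z, x):
        return torch.tanh(self.W(z) + self.U(x))

    def forward(self, x, tol=1e-4, max_iter=100):
        # Forward pass: iterate to the fixed point z* = T(z*, x) without
        # tracking gradients, so memory cost stays fixed regardless of the
        # number of iterations (i.e., the implicit "depth").
        z = torch.zeros_like(x)
        with torch.no_grad():
            for _ in range(max_iter):
                z_next = self.T(z, x)
                if (z_next - z).norm() < tol:
                    z = z_next
                    break
                z = z_next
        # JFB backward pass: attach gradients to one application of T at the
        # fixed point. Standard autograd then produces the JFB gradient, which
        # drops the (I - dT/dz)^{-1} factor from the implicit function theorem
        # instead of solving a Jacobian-based equation.
        return self.T(z.detach(), x)


# Usage sketch: gradients flow through a single call to T, so the per-step
# training cost is close to that of an explicit one-layer model.
block = ImplicitBlock(dim=8)
x = torch.randn(4, 8)
loss = block(x).pow(2).mean()
loss.backward()
```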