
An Evaluation of Fisher Approximations Beyond Kronecker Factorization

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission · Readers: Everyone
Abstract: We study two coarser approximations on top of a Kronecker factorization (K-FAC) of the Fisher information matrix, in order to scale up Natural Gradient to deep and wide Convolutional Neural Networks (CNNs). The first treats the activations (feature maps) as spatially uncorrelated, while the second retains only correlations among groups of channels. Both variants yield a further block-diagonal approximation tailored for CNNs, which is much more efficient to compute and invert. Experiments on the VGG11 and ResNet50 architectures show that the technique can substantially speed up both K-FAC and a baseline with Batch Normalization in wall-clock time, yielding faster convergence to similar or better generalization error.
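As a rough illustration of the flavor of these two approximations (not the authors' implementation; shapes, names, and the grouping scheme below are assumptions for the sketch), here is a minimal NumPy example. The first function estimates Kronecker factors for a conv layer while treating spatial locations as uncorrelated, by averaging second moments over both the batch and the spatial positions; the second function coarsens a factor further by zeroing all correlations outside channel groups, making it block-diagonal.

```python
# A minimal sketch, assuming the standard K-FAC factorization F ~= A (x) G
# for a conv layer. All shapes and the grouping scheme are illustrative
# assumptions, not the exact method from the paper.
import numpy as np

def kfac_factors_spatially_uncorrelated(patches, grads):
    """K-FAC factors under a spatially-uncorrelated-activations assumption.

    patches: (batch, locations, in_dim)   extracted conv input patches
    grads:   (batch, locations, out_dim)  per-location output gradients
    """
    b, t, _ = patches.shape
    a = patches.reshape(b * t, -1)
    g = grads.reshape(b * t, -1)
    # Averaging over the batch AND the spatial locations amounts to
    # treating activations at different locations as uncorrelated.
    A = a.T @ a / (b * t)
    G = g.T @ g / (b * t)
    return A, G

def block_diagonal_by_channel_groups(G, group_size):
    """Keep only within-group correlations: zero all cross-group blocks."""
    out = np.zeros_like(G)
    n = G.shape[0]
    for start in range(0, n, group_size):
        end = min(start + group_size, n)
        out[start:end, start:end] = G[start:end, start:end]
    return out
```

The efficiency gain of the block-diagonal variant is easy to see: inverting a dense n-by-n factor costs on the order of n^3 operations, while inverting n/group_size independent blocks costs on the order of n * group_size^2, which is why such coarser approximations are much cheaper to compute and invert.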
Keywords: convolutional neural networks, optimization, natural gradient, kronecker factorization
TL;DR: We study two coarser approximations on top of a Kronecker factorization of the Fisher information matrix, to scale up Natural Gradient to deep and wide Convolutional Neural Networks.
