Fast and Robust Online Inference with Stochastic Gradient Descent via Random Scaling
Sokbae (Simon) Lee, Yuan Liao, Myung Hwan Seo and Youngki Shin
Papers from arXiv.org
Abstract:
We develop a new method of online inference for a vector of parameters estimated by the Polyak-Ruppert averaging procedure of stochastic gradient descent (SGD) algorithms. We leverage insights from time series regression in econometrics and construct asymptotically pivotal statistics via random scaling. Our approach is fully operational with online data and is rigorously underpinned by a functional central limit theorem. The proposed inference method has two key advantages over existing methods. First, the test statistic is computed in an online fashion using only the SGD iterates, and the critical values can be obtained without any resampling, allowing for efficient implementation on massive online data. Second, there is no need to estimate the asymptotic variance, and the inference method is shown in simulation experiments with synthetic data to be robust to changes in the tuning parameters of the SGD algorithm.
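To illustrate the flavor of the approach, the sketch below runs averaged SGD on a toy scalar mean-estimation problem and forms a random-scaling t-statistic from the running average of the iterates. The variance quantity V here is n^{-2} * sum_s s^2 (bar_s - bar_n)^2, maintained online via three running sums; the specific step-size schedule, the toy model, and all variable names are illustrative assumptions, not the authors' exact implementation. Because V is a random (rather than consistent) scaling, the statistic is compared against fixed-b-type critical values from the time-series literature (roughly 6.75 for a two-sided 5% test in the scalar case), not against normal quantiles.

```python
import math
import random

random.seed(0)

# Toy problem: estimate the mean of a distribution (true value 2.0)
# by SGD on the squared loss, with Polyak-Ruppert averaging.
theta_true = 2.0
n = 100_000

theta = 0.0          # current SGD iterate
bar = 0.0            # Polyak-Ruppert average of the iterates
A = B = C = 0.0      # running sums for the random-scaling variance

for s in range(1, n + 1):
    y = theta_true + random.gauss(0.0, 1.0)   # stream one observation
    lr = 0.5 * s ** -0.505                    # step size gamma_0 * s^(-a), a in (1/2, 1)
    theta -= lr * (theta - y)                 # SGD step on the squared loss
    bar += (theta - bar) / s                  # online average of the iterates
    # Accumulate sums so that
    #   V = (A - 2*bar*B + bar^2*C) / n^2
    # equals n^{-2} * sum_s s^2 (bar_s - bar_n)^2 without a second pass.
    A += (s * bar) ** 2
    B += s * s * bar
    C += s * s

V = (A - 2.0 * bar * B + bar * bar * C) / n ** 2
t_stat = math.sqrt(n) * (bar - theta_true) / math.sqrt(V)
```

Nothing here requires storing past data or iterates: each update touches only a handful of scalars, which is what makes the statistic computable on a massive online stream.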
Date: 2021-06, Revised 2021-10
New Economics Papers: this item is included in nep-cmp, nep-ecm and nep-ore
Published in Proceedings of the 36th AAAI Conference on Artificial Intelligence, 36(7), 2022, pp. 7381-7389
Downloads: http://arxiv.org/pdf/2106.03156 Latest version (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2106.03156