A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes
dc.contributor.author | Yu, T. | |
dc.contributor.author | Liu, X.W. | |
dc.contributor.author | Dai, Y.H. | |
dc.contributor.author | Sun, Jie | |
dc.date.accessioned | 2023-04-16T09:30:06Z | |
dc.date.available | 2023-04-16T09:30:06Z | |
dc.date.issued | 2021 | |
dc.identifier.citation | Yu, T. and Liu, X.W. and Dai, Y.H. and Sun, J. 2021. A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes. IEEE Transactions on Neural Networks and Learning Systems. 32 (10): pp. 4627-4638. | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/91427 | |
dc.identifier.doi | 10.1109/TNNLS.2020.3025383 | |
dc.description.abstract | We consider the problem of minimizing the sum of an average of a large number of smooth convex component functions and a possibly nonsmooth convex function that admits a simple proximal mapping. This class of problems, known as regularized empirical risk minimization (ERM), arises frequently in machine learning. In this article, we propose mSRGTR-BB, a minibatch proximal stochastic recursive gradient algorithm, which employs a trust-region-like scheme to select stepsizes that are automatically computed by the Barzilai-Borwein method. We prove that mSRGTR-BB converges linearly in expectation for both strongly convex and nonstrongly convex objective functions. With proper parameters, mSRGTR-BB enjoys a faster convergence rate than the state-of-the-art minibatch proximal variant of the semistochastic gradient method (mS2GD). Numerical experiments on standard data sets show that the performance of mSRGTR-BB is comparable to, and sometimes better than, that of mS2GD with best-tuned stepsizes, and is superior to some modern proximal stochastic gradient methods. | |
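A minimal Python sketch of the kind of method the abstract describes: a minibatch proximal stochastic recursive gradient loop whose stepsize is set by the Barzilai-Borwein rule. This is an assumption-laden illustration, not the authors' mSRGTR-BB; in particular, the trust-region-like safeguard on the BB stepsize described in the paper is omitted, and all function names (prox_srg_bb, grad_batch, prox_l1) and parameter values are hypothetical.

import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t*||x||_1, an example of a
    # nonsmooth term with a simple proximal mapping.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_srg_bb(grad_batch, n, x0, lam, epochs=10, m=50, batch=8, eta0=0.1, seed=0):
    # grad_batch(x, idx): average gradient of the component functions in idx at x.
    rng = np.random.default_rng(seed)
    x, x_prev, g_prev, eta = x0.copy(), None, None, eta0
    for _ in range(epochs):
        g = grad_batch(x, np.arange(n))              # full gradient at the epoch start
        if x_prev is not None:
            dx, dg = x - x_prev, g - g_prev
            denom = dx @ dg
            if denom > 1e-12:                        # plain BB stepsize (no trust-region-like safeguard)
                eta = (dx @ dx) / denom
        x_prev, g_prev = x.copy(), g.copy()
        v, y = g, x.copy()
        for _ in range(m):
            idx = rng.choice(n, size=batch, replace=False)
            y_new = prox_l1(y - eta * v, eta * lam)  # proximal (forward-backward) step
            v = grad_batch(y_new, idx) - grad_batch(y, idx) + v   # recursive gradient update
            y = y_new
        x = y
    return x

# Toy usage (hypothetical data): L1-regularized least squares, f_i(x) = 0.5*(a_i^T x - b_i)^2
rng = np.random.default_rng(1)
A, b = rng.standard_normal((200, 20)), rng.standard_normal(200)
grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
x_hat = prox_srg_bb(grad_batch, n=200, x0=np.zeros(20), lam=0.1)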
dc.language | English | |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
dc.subject | Science & Technology | |
dc.subject | Technology | |
dc.subject | Computer Science, Artificial Intelligence | |
dc.subject | Computer Science, Hardware & Architecture | |
dc.subject | Computer Science, Theory & Methods | |
dc.subject | Engineering, Electrical & Electronic | |
dc.subject | Computer Science | |
dc.subject | Engineering | |
dc.subject | Convergence | |
dc.subject | Convex functions | |
dc.subject | Risk management | |
dc.subject | Gradient methods | |
dc.subject | Learning systems | |
dc.subject | Barzilai-Borwein (BB) method | |
dc.subject | empirical risk minimization (ERM) | |
dc.subject | proximal method | |
dc.subject | stochastic gradient | |
dc.subject | trust-region | |
dc.subject | MACHINE | |
dc.title | A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes | |
dc.type | Journal Article | |
dcterms.source.volume | 32 | |
dcterms.source.number | 10 | |
dcterms.source.startPage | 4627 | |
dcterms.source.endPage | 4638 | |
dcterms.source.issn | 2162-237X | |
dcterms.source.title | IEEE Transactions on Neural Networks and Learning Systems | |
dc.date.updated | 2023-04-16T09:30:05Z | |
curtin.department | School of Elec Eng, Comp and Math Sci (EECMS) | |
curtin.accessStatus | Fulltext not available | |
curtin.faculty | Faculty of Science and Engineering | |
curtin.contributor.orcid | Sun, Jie [0000-0001-5611-1672] | |
curtin.contributor.researcherid | Sun, Jie [B-7926-2016] [G-3522-2010] | |
dcterms.source.eissn | 2162-2388 | |
curtin.contributor.scopusauthorid | Sun, Jie [16312754600] [57190212842] | |
curtin.repositoryagreement | V3 |