dc.contributor.author	Yu, T.
dc.contributor.author	Liu, X.W.
dc.contributor.author	Dai, Y.H.
dc.contributor.author	Sun, Jie
dc.date.accessioned	2023-04-16T09:30:06Z
dc.date.available	2023-04-16T09:30:06Z
dc.date.issued	2021
dc.identifier.citation	Yu, T. and Liu, X.W. and Dai, Y.H. and Sun, J. 2021. A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes. IEEE Transactions on Neural Networks and Learning Systems. 32 (10): pp. 4627-4638.
dc.identifier.uri	http://hdl.handle.net/20.500.11937/91427
dc.identifier.doi	10.1109/TNNLS.2020.3025383
dc.description.abstract	We consider the problem of minimizing the sum of an average of a large number of smooth convex component functions and a possibly nonsmooth convex function that admits a simple proximal mapping. This class of problems arises frequently in machine learning, known as regularized empirical risk minimization (ERM). In this article, we propose mSRGTR-BB, a minibatch proximal stochastic recursive gradient algorithm, which employs a trust-region-like scheme to select stepsizes that are automatically computed by the Barzilai-Borwein method. We prove that mSRGTR-BB converges linearly in expectation for strongly and nonstrongly convex objective functions. With proper parameters, mSRGTR-BB enjoys a faster convergence rate than the state-of-the-art minibatch proximal variant of the semistochastic gradient method (mS2GD). Numerical experiments on standard data sets show that the performance of mSRGTR-BB is comparable to, and sometimes even better than, mS2GD with best-tuned stepsizes and is superior to some modern proximal stochastic gradient methods.
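The abstract describes a minibatch proximal stochastic gradient method whose stepsizes are computed by the Barzilai-Borwein (BB) rule. As a rough illustration of those two ingredients only, the sketch below applies plain minibatch proximal SGD with an epoch-wise BB stepsize to L1-regularized least squares, a simple instance of regularized ERM. It is not the authors' mSRGTR-BB algorithm: the recursive gradient estimator and the trust-region-like stepsize safeguard are omitted, and all names (prox_sgd_bb, soft_threshold, bb_stepsize) and parameter values are illustrative assumptions.

```python
# Illustrative sketch only: minibatch proximal SGD with a Barzilai-Borwein (BB)
# stepsize on L1-regularized least squares. This is NOT the mSRGTR-BB method of
# the cited article; it only demonstrates the BB stepsize and proximal mapping.
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bb_stepsize(x, x_prev, g, g_prev, fallback=1e-2):
    """BB1 stepsize <s, s> / <s, y> with s = x - x_prev, y = g - g_prev."""
    s, y = x - x_prev, g - g_prev
    sy = s @ y
    return (s @ s) / sy if sy > 1e-12 else fallback

def prox_sgd_bb(A, b, lam, batch=32, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_prev, g_prev = x.copy(), None
    eta = 1e-2  # initial stepsize before the first BB update
    for _ in range(epochs):
        # Full gradient of the smooth part, used once per epoch to form the BB
        # stepsize (mirroring the epoch-wise BB update of methods like SVRG-BB).
        g = A.T @ (A @ x - b) / n
        if g_prev is not None:
            # Clip the BB stepsize for stability in this toy demo.
            eta = float(np.clip(bb_stepsize(x, x_prev, g, g_prev), 1e-3, 1.0))
        x_prev, g_prev = x.copy(), g.copy()
        for _ in range(n // batch):
            idx = rng.choice(n, size=batch, replace=False)
            grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch  # minibatch gradient
            x = soft_threshold(x - eta * grad, eta * lam)    # proximal step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((500, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(500)
    x_hat = prox_sgd_bb(A, b, lam=0.1)
    print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))
```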

dc.language	English
dc.publisher	IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject	Science & Technology
dc.subject	Technology
dc.subject	Computer Science, Artificial Intelligence
dc.subject	Computer Science, Hardware & Architecture
dc.subject	Computer Science, Theory & Methods
dc.subject	Engineering, Electrical & Electronic
dc.subject	Computer Science
dc.subject	Engineering
dc.subject	Convergence
dc.subject	Convex functions
dc.subject	Risk management
dc.subject	Gradient methods
dc.subject	Learning systems
dc.subject	Sun
dc.subject	Barzilai-Borwein (BB) method
dc.subject	empirical risk minimization (ERM)
dc.subject	proximal method
dc.subject	stochastic gradient
dc.subject	trust-region
dc.subject	MACHINE
dc.title	A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes
dc.type	Journal Article
dcterms.source.volume	32
dcterms.source.number	10
dcterms.source.startPage	4627
dcterms.source.endPage	4638
dcterms.source.issn	2162-237X
dcterms.source.title	IEEE Transactions on Neural Networks and Learning Systems
dc.date.updated	2023-04-16T09:30:05Z
curtin.department	School of Elec Eng, Comp and Math Sci (EECMS)
curtin.accessStatus	Fulltext not available
curtin.faculty	Faculty of Science and Engineering
curtin.contributor.orcid	Sun, Jie [0000-0001-5611-1672]
curtin.contributor.researcherid	Sun, Jie [B-7926-2016] [G-3522-2010]
dcterms.source.eissn	2162-2388
curtin.contributor.scopusauthorid	Sun, Jie [16312754600] [57190212842]
curtin.repositoryagreement	V3