Title
SVR-Primal Dual Method of Multipliers (PDMM) for Large-Scale Problems
Abstract
With the advent of big data, centralized processing is no longer feasible and is becoming obsolete. With this paradigm shift, distributed processing is increasingly relevant: instead of burdening a central processor, the load is shared among multiple processing units. The decentralization capability of the ADMM algorithm has made it popular in recent years. Another recent algorithm, PDMM, has paved the way for distributed processing but is still under development. Both algorithms work well on medium-scale problems, but handling large-scale problems remains challenging. This work is an effort towards handling large-scale data with reduced computational load. To this end, the proposed framework combines the advantages of the SVRG and PDMM algorithms. The algorithm is proved to converge at rate O(1/K) for strongly convex loss functions, which is faster than existing algorithms. Experimental evaluations on real data demonstrate the efficacy of the proposed algorithm over state-of-the-art methods.
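The SVRG component the abstract refers to reduces the variance of stochastic gradients by anchoring them to a periodically refreshed full gradient. The following is a minimal sketch of plain SVRG on a least-squares loss; the function names, the loss choice, and all step-size values are illustrative assumptions, and the paper's coupling of this estimator with PDMM's primal-dual updates is not reproduced here.

```python
import numpy as np

def svrg_gradient(A, b, x, x_snap, full_grad_snap, i):
    """Variance-reduced gradient estimate for f(x) = (1/2n)||Ax - b||^2.

    Combines the i-th sample gradient at the current iterate with the same
    sample's gradient at the snapshot, corrected by the full snapshot
    gradient. Unbiased, with variance that vanishes near the optimum.
    """
    a_i = A[i]
    g_i = a_i * (a_i @ x - b[i])          # i-th sample gradient at x
    g_i_snap = a_i * (a_i @ x_snap - b[i])  # same sample at the snapshot
    return g_i - g_i_snap + full_grad_snap

def svrg(A, b, step=0.2, epochs=100, inner=None, seed=0):
    """Plain (non-distributed) SVRG loop as a reference sketch."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()
        # Full gradient at the snapshot, recomputed once per epoch.
        full_grad_snap = A.T @ (A @ x_snap - b) / n
        for _ in range(inner):
            i = rng.integers(n)
            x = x - step * svrg_gradient(A, b, x, x_snap, full_grad_snap, i)
    return x
```

Because the full gradient is recomputed only once per epoch, each inner iteration touches a single sample, which is the computational saving the abstract targets for large-scale data.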
Year: 2020
DOI: 10.1109/NCC48643.2020.9056014
Venue: 2020 National Conference on Communications (NCC)
Keywords: distributed optimization, PDMM, ADMM, SVRG
DocType: Conference
ISBN: 978-1-7281-5121-2
Citations: 0
PageRank: 0.34
References: 7
Authors: 3

Name	Order	Citations	PageRank
Lijanshu Sinha	1	0	0.34
Ketan Rajawat	2	124	25.44
Chirag Kumar	3	0	0.34