The Alternating Direction Method of Multipliers (ADMM) has been widely used in optimization due to its efficiency and scalability on large-scale problems arising in machine learning, signal processing, computer vision, and related fields.
The focus of this workshop will be on recent developments in ADMM, with an emphasis on improving ADMM in both theory and practice by exploiting structure in nonconvex and/or large-scale problems.
One primary topic of the workshop is the use of new randomization techniques in multi-block ADMM to improve performance on quadratic optimization. The workshop will also consider new theoretical complexity results for ADMM on structured nonconvex problems and the broad use of ADMM in large-scale machine learning problems. Another goal of this workshop is to bridge the theory-practice gap by focusing on the efficient implementation of ADMM solvers.
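For readers less familiar with the method, the following is a minimal sketch of the standard (scaled-dual) ADMM iterations applied to the lasso problem, one of the classic use cases in machine learning and signal processing. All names here (`A`, `b`, `lam`, `rho`, `admm_lasso`) are illustrative assumptions, not tied to any particular solver discussed at the workshop.

```python
import numpy as np

# Illustrative sketch: ADMM for the lasso problem
#   minimize (1/2)||A x - b||^2 + lam * ||x||_1
# via the split  minimize f(x) + g(z)  s.t.  x = z,
# using the standard scaled-dual x-, z-, and u-updates.

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise shrinkage)
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A + rho * np.eye(n)  # reused in every x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: ridge-like linear solve
        x = np.linalg.solve(AtA, Atb + rho * (z - u))
        # z-update: prox of the l1 term
        z = soft_threshold(x + u, lam / rho)
        # u-update: dual ascent on the consensus constraint x = z
        u = u + x - z
    return z

# Small synthetic demo with a sparse ground truth
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = admm_lasso(A, b)
```

Multi-block and randomized variants of the kind the workshop considers modify how the blocks of variables are ordered or sampled in the x-update, while keeping this overall splitting structure.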