TWISC

Solution Structure Utilization for Efficient Optimization and Large-scale Machine Learning


  • Speaker: Dr. Ching-pei Lee
  • Date: 2022/08/15 (Mon.) 10:00–12:00
  • Venue: Auditorium 122, Research Center for Information Technology Innovation, Academia Sinica; also online
  • Host: Yen-Nun Huang
Webex: https://asmeet.webex.com/asmeet/j.php?MTID=md9f85d6c71b614dc899772991e449bad
Monday, August 15, 2022, 10:00 AM | 2 hours | (UTC+08:00) Taipei
Meeting number: 2517 680 6444
Password: JVc5nSpM7y4
Abstract

Regularized optimization, which adds a regularizer to the objective function being minimized, is widely used in machine learning training problems to induce desired structures in the output model. In this talk, I will first describe how, in different scenarios, such a structure can be found in an approximate solution, without obtaining the optimal solution that is usually only the limit point of an iterative algorithm, and discuss why this matters. I will then show how such an optimal structure can be utilized to devise more efficient optimization/training algorithms. I will discuss examples of our algorithms, designed for specific regularizers used in various machine learning and signal processing tasks, that either identify such optimal structures for better performance or utilize them to accelerate the optimization procedure, with applications including training structured neural network models, matrix completion, federated learning, and the best subset selection problem. This is joint work with Yu-Sheng Li, Wei-Lin Chiang, Zih-Syuan Huang, Jan Harold Alcantara, Ling Liang, Tianyun Tang, and Kim-Chuan Toh.
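As a minimal illustration of the idea (not the speaker's own algorithms): with an L1 regularizer, the desired structure is sparsity, and the proximal gradient method (ISTA) applied to a lasso problem identifies the nonzero pattern of the solution after finitely many iterations, well before the iterates converge. The problem sizes, regularization weight, and helper names below are arbitrary choices for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1: shrinks each entry toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, steps=500):
    """Proximal gradient for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, -3.0]                # sparse ground truth
b = A @ x_true

x = ista(A, b, lam=0.5)
support = set(np.nonzero(np.abs(x) > 1e-8)[0])
print(support)  # the identified nonzero pattern, i.e., the solution's structure
```

Once the support stabilizes, one can restrict the problem to those coordinates and switch to a faster (e.g., second-order) method on the smaller smooth subproblem; exploiting identified structure in this spirit is one theme of the talk.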
Bio
LEE Ching-pei is an assistant research fellow in the Institute of Statistical Science, Academia Sinica. Prior to Academia Sinica, Ching-pei was a Peng Tsu Ann Assistant Professor in the Department of Mathematics and the Institute for Mathematical Sciences at the National University of Singapore. Before NUS, Ching-pei earned a Ph.D. and an M.S. degree in Computer Sciences (with a minor in mathematics) from the University of Wisconsin-Madison under the supervision of Prof. Stephen J. Wright. Prior to that, Ching-pei was a member of Prof. Chih-Jen Lin's group and was involved in multiple projects on open-source software development for large-scale machine learning problems. Ching-pei works on both the theory and the practice of nonlinear optimization and its applications to large-scale machine learning, with a focus on scalable and practically efficient algorithms that also come with sound theoretical guarantees. Dr. Lee publishes regularly in top journals and conferences in both optimization and machine learning, including MPA, MPC, JMLR, ICML, ICLR, and KDD. Ching-pei has also won multiple awards, including a Young Scholar Award from the Foundation for the Advancement of Outstanding Scholarship in 2021 and a 2030 Cross-Generation Emerging Young Scholars award in 2022. Personal website: https://leepei.github.io/