
TIGP (AIoT) Seminar -- Decoupling Deep Learning: Enhancing Training Efficiency with Localized Gradient Methods


  • Speaker: Prof. 陳弘軒
  • Date: 2025/03/07 (Fri.) 14:00–16:00
  • Venue: Lecture Hall 122, 資創中心 (Research Center for Information Technology Innovation)
  • Host: TIGP (AIoT)
Abstract
Backpropagation (BP) is fundamental to deep learning but suffers from inefficiencies such as vanishing gradients and backward locking. This talk explores localized gradient methods, including Associated Learning (AL), Supervised Contrastive Parallel Learning (SCPL), and Decoupled Supervised Learning with Information Regularization (DeInfoReg), which replace end-to-end BP with independent, layer-local objectives. These approaches improve training efficiency, robustness, and generalization. We discuss their theoretical foundations, practical applications, and experimental results, highlighting their potential as scalable alternatives to traditional BP.
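To illustrate the core idea of layer-local objectives, here is a minimal NumPy sketch of decoupled training on a toy regression task. Each layer is updated only by the gradient of its own auxiliary head; no gradient ever crosses a layer boundary. This is a generic illustration of local-loss training, not an implementation of AL, SCPL, or DeInfoReg; all names (`W1`, `A1`, `local_step`, the toy data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y is the sum of the features plus small noise.
X = rng.normal(size=(256, 8))
y = X.sum(axis=1, keepdims=True) + 0.01 * rng.normal(size=(256, 1))

# Two layers, each with its own auxiliary linear head (A1, A2).
W1 = rng.normal(scale=0.1, size=(8, 16));  A1 = rng.normal(scale=0.1, size=(16, 1))
W2 = rng.normal(scale=0.1, size=(16, 16)); A2 = rng.normal(scale=0.1, size=(16, 1))
lr = 0.05

def local_step(H_in, W, A):
    """One gradient step on this layer's OWN local MSE loss.

    The returned hidden activation H is handed to the next layer as a
    plain array, so the next layer's gradients never reach this layer
    (the 'decoupling' that removes backward locking).
    """
    Z = H_in @ W
    H = np.maximum(Z, 0.0)          # ReLU activation
    pred = H @ A                    # auxiliary head prediction
    err = pred - y
    loss = float((err ** 2).mean())
    dA = H.T @ err / len(y)         # gradient for the local head
    dZ = (err @ A.T) * (Z > 0)      # gradient stays inside this layer
    dW = H_in.T @ dZ / len(y)
    return loss, W - lr * dW, A - lr * dA, H

losses2 = []
for epoch in range(200):
    _, W1, A1, H1 = local_step(X, W1, A1)
    # H1 is treated as fixed input: layer 2 trains independently.
    loss2, W2, A2, _ = local_step(H1, W2, A2)
    losses2.append(loss2)

print(f"layer-2 local loss: {losses2[0]:.3f} -> {losses2[-1]:.3f}")
```

Because each `local_step` depends only on its own input and parameters, the two layers could in principle run on separate devices in a pipeline, which is the efficiency argument the talk develops.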