Speaker: Dr. Yahong Yang (杨雅鸿博士)

Time: 14:00-15:00, 19 May 2025 (Monday) (Beijing time)

Venue: C406, Lijiao Building


Abstract

Solving partial differential equations (PDEs) with neural networks has become a prominent topic in scientific machine learning. Training these networks remains challenging, however, because the associated loss functions have highly complex, non-convex energy landscapes. These difficulties are further amplified in sharp interface problems, where certain PDE parameters introduce near-singularities into the loss. In this talk, I will present a novel training framework based on homotopy dynamics to address these challenges. Specifically, I will introduce two homotopy strategies: the first applies homotopy to the activation functions, gradually deforming simple activations into the original nonlinearities; the second applies homotopy to the PDE parameters to manage the singular behavior of sharp interface regimes. Both approaches improve training stability and yield more accurate resolution of sharp interfaces when solving PDEs with neural networks.
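To give a flavor of the two strategies, here is a minimal, illustrative sketch (not the speaker's implementation): the first function blends a simple linear activation into `tanh` as a homotopy parameter `t` goes from 0 to 1; the second part illustrates the parameter-homotopy idea outside the neural-network setting, using a classical finite-difference Newton solver for the steady Allen–Cahn equation and continuing the interface parameter `eps` from a smooth regime down to a sharp one, warm-starting each solve from the previous solution. The grid size and `eps` schedule are arbitrary choices for the sketch.

```python
import numpy as np

def homotopy_activation(x, t):
    """Strategy 1 (sketch): deform a linear activation into tanh as t: 0 -> 1."""
    return (1.0 - t) * x + t * np.tanh(x)

def solve_allen_cahn(eps, u0, x, tol=1e-10, max_iter=50):
    """Newton's method for the steady Allen-Cahn equation
        eps^2 u'' + u - u^3 = 0,   u(-1) = -1,  u(1) = +1,
    discretized by central differences on the interior points x."""
    h = x[1] - x[0]
    n = len(u0)
    u = u0.copy()
    # Second-difference matrix; Dirichlet boundary values folded into b.
    D2 = (np.diag(-2.0 * np.ones(n))
          + np.diag(np.ones(n - 1), 1)
          + np.diag(np.ones(n - 1), -1)) / h**2
    b = np.zeros(n)
    b[0] = -1.0 / h**2    # u(-1) = -1
    b[-1] = 1.0 / h**2    # u(+1) = +1
    for _ in range(max_iter):
        F = eps**2 * (D2 @ u + b) + u - u**3
        if np.max(np.abs(F)) < tol:
            break
        J = eps**2 * D2 + np.diag(1.0 - 3.0 * u**2)
        u -= np.linalg.solve(J, F)
    return u

# Strategy 2 (sketch): homotopy in the PDE parameter. Start from a large eps
# (smooth profile, benign problem) and continue toward the sharp-interface
# regime, warm-starting each solve from the previous solution. A direct solve
# at eps = 0.05 from a naive initial guess is far harder for Newton's method.
n = 199
x = np.linspace(-1.0, 1.0, n + 2)[1:-1]   # interior grid points
u = x.copy()                              # smooth initial guess
for eps in [0.5, 0.3, 0.2, 0.1, 0.05]:
    u = solve_allen_cahn(eps, u, x)
# u now approximates the sharp-interface profile tanh(x / (sqrt(2) * eps)).
```

In the neural-network setting described in the talk, the same continuation idea would drive the training loop: optimize at each stage of the homotopy schedule and reuse the resulting weights as the initialization for the next, harder stage.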


About the Speaker

Dr. Yahong Yang received his Ph.D. in Mathematics from the Hong Kong University of Science and Technology in 2023 and is currently a postdoctoral researcher at Pennsylvania State University. His research interests include machine learning theory, mathematical modeling in materials science and biology, and numerical methods for solving partial differential equations. Dr. Yang has authored several first-author papers in leading journals such as the SIAM Journal on Multiscale Modeling and Simulation and the Journal of Scientific Computing, and has presented his work at top conferences including NeurIPS and ICML. His current work focuses on advancing deep learning theory and applying machine learning techniques to complex PDEs, with applications in models such as the Allen–Cahn equation, the Gray–Scott model, and Green’s functions.