School Colloquium: Transferable Neural Networks for Partial Differential Equations
Speaker: Lili Ju (鞠立力), University of South Carolina
Time: 2024-05-24, 14:00-15:00
Venue: Siyuan Hall, Zhihua Building (智华楼四元厅)
Abstract: Transfer learning for partial differential equations (PDEs) aims to develop a pre-trained neural network that can be used to solve a wide class of PDEs. Existing transfer learning approaches require substantial information about the target PDEs, such as their formulation and/or solution data, for pre-training. In this work, we propose to design transferable neural feature spaces for shallow neural networks from a purely function-approximation perspective, without using any PDE information. The construction of the feature space involves re-parameterizing the hidden neurons and using auxiliary functions to tune the resulting feature space. Theoretical analysis shows the high quality of the produced feature space, i.e., uniformly distributed neurons. We use the proposed feature space as the predetermined feature space of a random feature model, and apply existing least-squares solvers to obtain the weights of the output layer. Extensive numerical experiments verify the outstanding performance of our method, including significantly improved transferability, e.g., using the same feature space for various PDEs with different domains and boundary conditions, and superior accuracy, e.g., mean squared errors several orders of magnitude smaller than those of state-of-the-art methods.
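The abstract describes a random feature model whose hidden layer is fixed in advance and whose output-layer weights are obtained from a linear least-squares solve. The following is a minimal Python/NumPy sketch of that general setup, using plain random tanh features and a toy 1D target function; it is not the talk's construction (the transferable feature space there is built via re-parameterized neurons and auxiliary tuning functions), and all names and parameter values below are illustrative assumptions.

import numpy as np

# Minimal sketch of a random feature model with least-squares output weights.
# The feature construction here (plain random tanh neurons) is a placeholder
# assumption; the talk's transferable feature space is built differently.

rng = np.random.default_rng(0)

# Toy target function on [0, 1], standing in for a PDE solution.
def u_exact(x):
    return np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * x)

n_samples, n_features = 200, 100
x = np.linspace(0.0, 1.0, n_samples)[:, None]        # collocation points
W = rng.normal(scale=5.0, size=(1, n_features))      # fixed hidden weights
b = rng.uniform(-5.0, 5.0, size=(1, n_features))     # fixed hidden biases

Phi = np.tanh(x @ W + b)                             # feature matrix, shape (n_samples, n_features)

# Output-layer weights from a linear least-squares solve; the hidden layer stays fixed.
c, *_ = np.linalg.lstsq(Phi, u_exact(x).ravel(), rcond=None)

mse = np.mean((Phi @ c - u_exact(x).ravel()) ** 2)
print(f"training MSE: {mse:.3e}")

Because only the linear output layer is trained, reusing the same feature matrix construction across different target functions (or, in the talk's setting, different PDEs, domains, and boundary conditions) reduces each new problem to a single least-squares solve.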
Biography: Professor Lili Ju received his B.S. in Mathematics from Wuhan University in 1995, his M.S. in Computational Mathematics from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, in 1998, and his Ph.D. in Applied Mathematics from Iowa State University in 2002. From 2002 to 2004 he was a postdoctoral researcher at the Institute for Mathematics and its Applications, University of Minnesota. He then joined the University of South Carolina, where he has served in the Department of Mathematics as Assistant Professor (2004-2008), Associate Professor (2008-2012), and Professor (2013-present). His research focuses on numerical methods and analysis for partial differential equations, nonlocal models and computation, deep learning methods, computer vision, and high-performance scientific computing with applications in materials science and geosciences. He has published more than 150 research papers, with about 6,200 Google Scholar citations. Since 2006 he has been the principal investigator of more than ten research projects funded by the U.S. National Science Foundation and the Department of Energy. He served as an associate editor of SIAM J. Numer. Anal. from 2012 to 2017, and currently serves as an associate editor of Math. Comp., J. Sci. Comput., Numer. Meths. PDEs, and other international journals in numerical analysis and scientific computing.