Mechanical Engineering Graduate Seminar Series: Prof. Ameya Jagtap, "Advancing Computational Methods: Physics-Informed Neural Networks and Neural Operator Networks"
10:00 am
Abstract: Classical methods in scientific computing have achieved significant advances, yet they face substantial challenges, including the need for precise knowledge of the underlying physical laws, an accurate specification of boundary and initial conditions, and time-consuming processes such as mesh generation and extensive simulations. Furthermore, these approaches are often inadequate for high-dimensional problems governed by parameterized partial differential equations (PDEs), rendering them impractical in many applications. Physics-informed machine learning (PIML) presents a promising alternative to these traditional methods. This presentation focuses on a specific PIML approach known as physics-informed neural networks (PINNs). We provide a comprehensive overview of the current capabilities and limitations of PINNs, illustrating their effectiveness across diverse applications in comparison with traditional computational methods. Additionally, we examine extensions of the standard PINN methodology, such as conservative PINNs and extended PINNs, which are tailored to handle big data and large-scale models. Our discussion also includes the role of various adaptive activation functions designed to expedite the convergence of deep physics-informed neural networks.

Furthermore, we explore recent advancements in deep operator networks, a novel class of neural operators that learn mappings between infinite-dimensional function spaces. Traditional PDE solvers are often both time-consuming and computationally intensive, especially when applied to complex systems. In contrast, neural operators have demonstrated superior performance in solving PDEs while significantly reducing computational time compared to conventional numerical solvers. To this end, we investigate the application of deep operator networks to stiff chemical kinetics problems and the use of innovative architectures for learning data-driven basis functions to map between discontinuous solutions.
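For context on the core idea behind the physics-informed approach discussed above, the sketch below folds a PDE residual into a neural network's training loss. It is an illustrative example only, assuming a simple 1D Poisson problem, a small PyTorch network, and generic hyperparameters; it is not taken from the speaker's own formulations or codes.

# Illustrative sketch only: a minimal physics-informed neural network (PINN)
# for the 1D Poisson problem u''(x) = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0.
# The problem, architecture, and hyperparameters are assumptions for illustration.
import torch

torch.manual_seed(0)

# Small fully connected network approximating the solution u(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    """Residual of u''(x) + pi^2 sin(pi x) = 0, computed with automatic differentiation."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + (torch.pi ** 2) * torch.sin(torch.pi * x)

x_interior = torch.rand(200, 1)            # collocation points inside the domain
x_boundary = torch.tensor([[0.0], [1.0]])  # Dirichlet boundary points

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    optimizer.zero_grad()
    loss_pde = pde_residual(x_interior).pow(2).mean()  # physics (PDE residual) loss
    loss_bc = net(x_boundary).pow(2).mean()            # boundary-condition loss
    loss = loss_pde + loss_bc
    loss.backward()
    optimizer.step()

# After training, net(x) should approximate the exact solution u(x) = sin(pi x).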
Bio: Ameya D. Jagtap is an Assistant Professor in the Aerospace Engineering Department at WPI. Before joining WPI, he was an Assistant Professor of Applied Mathematics (Research) at Brown University. He earned his PhD and Master’s degrees in Aerospace Engineering from the Indian Institute of Science, India. Following his doctoral studies, he served as a postdoctoral research fellow at the Tata Institute of Fundamental Research - Centre for Applicable Mathematics before undertaking postdoctoral research in applied mathematics at Brown University. His research focuses on the development of data- and physics-driven scientific machine-learning algorithms for a broad spectrum of problems in computational physics. His expertise includes Scientific Machine Learning, Deep Learning, Data/Physics-driven Deep Learning Techniques, Multi-scale/Multi-Physics Simulations, Spectral/Finite Element Methods, and WENO/DG Schemes. He serves on the editorial boards of several journals, including Neurocomputing and Neural Networks, both published by Elsevier.