Month: September 2024

Sooting tendency measurements for formulating sustainable fuels that reduce soot emissions

Speaker: Dr. Charles McEnally – Yale University
Date: Sep 13, 2024; Time: 2:30 PM; Location: PWEB 175

Abstract: The transition from fossil fuels to sustainable fuels offers a unique opportunity to select new fuel compositions that will not only reduce net carbon dioxide emissions, but also improve combustor performance and reduce emissions of other pollutants. A particularly valuable goal is finding fuels that reduce soot emissions. These emissions cause significant global warming, especially from aviation, since soot particles serve as nucleation sites for contrails. Furthermore, soot contributes to ambient fine particulates, which are responsible for millions of deaths worldwide each year. Fortunately, soot formation rates depend sensitively on the molecular structure of the fuel, so fuel composition provides a strong lever for reducing emissions. Sooting tendencies measured in laboratory-scale flames provide a scientific basis for selecting fuels that will maximize this benefit. We have developed new techniques that expand the range of compounds that can be tested by reducing the required sample volume and increasing the dynamic range. This has many benefits, but it is particularly essential for the development of structure-property relationships using machine learning algorithms: the accuracy and predictive ability of these relationships depend strongly on the number of compounds in the training set and the coverage of structural features.
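As a purely illustrative sketch of the structure-property idea mentioned in the abstract, the snippet below fits a regression from molecular descriptors to a measured sooting tendency and checks its predictive ability by cross-validation. The descriptors, dataset, and model choice here are placeholder assumptions, not the speaker's actual measurements or methods.

```python
# Illustrative only: a toy structure-property model for sooting tendency.
# All data and descriptors below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical training set: each row holds simple structural descriptors
# (e.g., carbon count, aromatic ring count, branching index) for one compound.
X = rng.random((200, 3))
# Hypothetical measured sooting tendencies (arbitrary units).
y = 30 * X[:, 1] + 5 * X[:, 0] + rng.normal(0, 1, 200)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```

The cross-validation step mirrors the abstract's point: predictive ability hinges on how many compounds, and which structural features, the training set covers.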


Biographical Sketch: Charles received a Ph.D. in Mechanical Engineering from the University of California at Berkeley in 1994, where he studied with Catherine Koshland and the late Robert Sawyer. Since then, he has been in the Chemical Engineering Department at Yale University, where he works with Professor Lisa Pfefferle. His research interest is the combustion of sustainable fuels.


Learning neural operators accurately, efficiently, reliably, and in one shot

Speaker: Dr. Lu Lu – Yale University
Date: Sep 20, 2024; Time: 2:30 PM; Location: PWEB 175

Abstract: As an emerging paradigm in scientific machine learning, deep neural operators pioneered by us can learn nonlinear operators of complex dynamic systems via neural networks. In this talk, I will present the deep operator network (DeepONet) to learn various operators that represent deterministic and stochastic differential equations. I will also present several extensions of DeepONet, such as DeepM&Mnet for multiphysics problems, DeepONet with proper orthogonal decomposition or Fourier decoder layers, MIONet for multiple-input operators, and multifidelity DeepONet. I will demonstrate the effectiveness of DeepONet and its extensions to diverse multiphysics and multiscale problems, such as bubble growth dynamics, high-speed boundary layers, electroconvection, hypersonics, geological carbon sequestration, full waveform inversion, and astrophysics. Deep learning models are usually limited to interpolation scenarios, and I will quantify the extrapolation complexity and develop a complete workflow to address the challenge of extrapolation for deep neural operators. Moreover, I will present the first operator learning method that only requires one PDE solution, i.e., one-shot learning, by introducing a new concept of local solution operator based on the principle of locality of PDEs.
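For readers unfamiliar with the architecture, the minimal sketch below shows the basic DeepONet idea: a branch network encodes the input function sampled at fixed sensor points, a trunk network encodes the query coordinate, and their dot product approximates the operator output. This is a toy illustration under assumed layer sizes and random data, not the implementation used in the talk.

```python
# Minimal DeepONet-style sketch (illustrative; sizes and data are assumptions).
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, n_sensors=100, latent_dim=64):
        super().__init__()
        # Branch net: encodes the input function u sampled at n_sensors points.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, 128), nn.Tanh(),
            nn.Linear(128, latent_dim),
        )
        # Trunk net: encodes the coordinate y where the output is evaluated.
        self.trunk = nn.Sequential(
            nn.Linear(1, 128), nn.Tanh(),
            nn.Linear(128, latent_dim), nn.Tanh(),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # (G(u))(y) ~ sum_k b_k(u) * t_k(y) + bias
        b = self.branch(u_sensors)   # (batch, latent_dim)
        t = self.trunk(y)            # (batch, latent_dim)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias

# Toy forward pass on random data
model = DeepONet()
u = torch.randn(8, 100)   # 8 input functions sampled at 100 sensor locations
y = torch.rand(8, 1)      # query coordinates
out = model(u, y)         # predicted operator outputs G(u)(y)
```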

Biographical Sketch: Dr. Lu Lu is an Assistant Professor in the Department of Statistics and Data Science at Yale University. Prior to joining Yale, he was an Assistant Professor in the Department of Chemical and Biomolecular Engineering at the University of Pennsylvania from 2021 to 2023, and an Applied Mathematics Instructor in the Department of Mathematics at the Massachusetts Institute of Technology from 2020 to 2021. He obtained his Ph.D. in Applied Mathematics at Brown University in 2020, master's degrees in Engineering, Applied Mathematics, and Computer Science at Brown University, and bachelor's degrees in Mechanical Engineering, Economics, and Computer Science at Tsinghua University in 2013. His current research interest lies in scientific machine learning, including theory, algorithms, software, and its applications to engineering, physical, and biological problems. His broad research interests focus on multiscale modeling and high-performance computing for physical and biological systems. He has received the 2022 U.S. Department of Energy Early Career Award and the 2020 Joukowsky Family Foundation Outstanding Dissertation Award from Brown University. He is also an action editor of the Journal of Machine Learning.