Lu Lu

ORCID: 0000-0002-5476-5768

Affiliations:
  • Brown University, Division of Applied Mathematics, Providence, RI, USA


According to our database, Lu Lu authored at least 26 papers between 2017 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
Identifying heterogeneous micromechanical properties of biological tissues via physics-informed neural networks.
CoRR, 2024

2023
Neural operator prediction of linear instability waves in high-speed boundary layers.
J. Comput. Phys., February, 2023

Deep Learning for Solving and Estimating Dynamic Macro-Finance Models.
CoRR, 2023

Fourier-MIONet: Fourier-enhanced multiple-input neural operators for multiphase modeling of geological carbon sequestration.
CoRR, 2023

2022
MIONet: Learning Multiple-Input Operators via Tensor Product.
SIAM J. Sci. Comput., 2022

Approximation rates of DeepONets for learning operators arising from advection-diffusion equations.
Neural Networks, 2022

Reliable extrapolation of deep neural operators informed by physics or sparse observations.
CoRR, 2022

Effective Data Sampling Strategies and Boundary Condition Constraints of Physics-Informed Neural Networks for Identifying Material Properties in Solid Mechanics.
CoRR, 2022

Systems Biology: Identifiability analysis and parameter identification via systems-biology informed neural networks.
CoRR, 2022

2021
Physics-Informed Neural Networks with Hard Constraints for Inverse Design.
SIAM J. Sci. Comput., 2021

DeepXDE: A Deep Learning Library for Solving Differential Equations.
SIAM Rev., 2021

How the spleen reshapes and retains young and old red blood cells: A computational investigation.
PLoS Comput. Biol., 2021

Deep transfer learning and data augmentation improve glucose levels prediction in type 2 diabetes patients.
npj Digit. Medicine, 2021

Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators.
Nat. Mach. Intell., 2021

DeepM&Mnet for hypersonics: Predicting the coupled flow and finite-rate chemistry behind a normal shock using neural-network approximation of operators.
J. Comput. Phys., 2021

DeepM&Mnet: Inferring the electroconvection multiphysics fields based on operator approximation by neural networks.
J. Comput. Phys., 2021

Gradient-enhanced physics-informed neural networks for forward and inverse PDE problems.
CoRR, 2021

Convergence rate of DeepONets for learning operators arising from advection-diffusion equations.
CoRR, 2021

2020
Systems biology informed deep learning for inferring parameters and hidden dynamics.
PLoS Comput. Biol., 2020

Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness.
Neural Networks, 2020

2019
fPINNs: Fractional Physics-Informed Neural Networks.
SIAM J. Sci. Comput., 2019

Quantifying total uncertainty in physics-informed neural networks for solving forward and inverse stochastic problems.
J. Comput. Phys., 2019

DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators.
CoRR, 2019

Dying ReLU and Initialization: Theory and Numerical Examples.
CoRR, 2019

2018
Collapse of Deep and Narrow Neural Nets.
CoRR, 2018

2017
OpenRBC: A Fast Simulator of Red Blood Cells at Protein Resolution.
CoRR, 2017
