Tao Luo

ORCID: 0000-0002-2029-0362

Affiliations:
  • Shanghai Jiao Tong University, China
  • Hong Kong University of Science and Technology, Department of Mathematics, Hong Kong (former)


According to our database, Tao Luo authored at least 33 papers between 2016 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2024
A priori Estimates for Deep Residual Network in Continuous-time Reinforcement Learning.
CoRR, 2024

2023
On Residual Minimization for PDEs: Failure of PINN, Modified Equation, and Implicit Bias.
CoRR, 2023

Structure and Gradient Dynamics Near Global Minima of Two-layer Neural Networks.
CoRR, 2023

Optimistic Estimate Uncovers the Potential of Nonlinear Models.
CoRR, 2023

Stochastic Modified Equations and Dynamics of Dropout Algorithm.
CoRR, 2023

Phase Diagram of Initial Condensation for Two-layer Neural Networks.
CoRR, 2023

2022
On the Exact Computation of Linear Frequency Principle Dynamics and Its Generalization.
SIAM J. Math. Data Sci., 2022

Nonlinear Weighted Directed Acyclic Graph and A Priori Estimates for Neural Networks.
SIAM J. Math. Data Sci., 2022

A regularised deep matrix factorised model of matrix completion for image restoration.
IET Image Process., 2022

Linear Stability Hypothesis and Rank Stratification for Nonlinear Models.
CoRR, 2022

Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks.
CoRR, 2022

An Experimental Comparison Between Temporal Difference and Residual Gradient with Neural Network Approximation.
CoRR, 2022

Limitation of characterizing implicit regularization by data-independent functions.
CoRR, 2022

Overview frequency principle/spectral bias in deep learning.
CoRR, 2022

Towards Understanding the Condensation of Neural Networks at Initial Training.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Empirical Phase Diagram for Three-layer Neural Networks with Infinite Width.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

An Upper Limit of Decaying Rate with Respect to Frequency in Linear Frequency Principle Model.
Proceedings of Mathematical and Scientific Machine Learning, 2022

2021
Energy Scaling and Asymptotic Properties of One-Dimensional Discrete System with Generalized Lennard-Jones (m, n) Interaction.
J. Nonlinear Sci., 2021

Phase Diagram for Two-layer ReLU Neural Networks at Infinite-width Limit.
J. Mach. Learn. Res., 2021

Embedding Principle: a hierarchical structure of loss landscape of deep neural networks.
CoRR, 2021

MOD-Net: A Machine Learning Approach via Model-Operator-Data Network for Solving PDEs.
CoRR, 2021

Towards Understanding the Condensation of Two-layer Neural Networks at Initial Training.
CoRR, 2021

An Upper Limit of Decaying Rate with Respect to Frequency in Deep Neural Network.
CoRR, 2021

Linear Frequency Principle Model to Understand the Absence of Overfitting in Neural Networks.
CoRR, 2021

Embedding Principle of Loss Landscape of Deep Neural Networks.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

2020
Fourier-domain Variational Formulation and Its Well-posedness for Supervised Learning.
CoRR, 2020

A regularized deep matrix factorized model of matrix completion for image restoration.
CoRR, 2020

Towards an Understanding of Residual Networks Using Neural Tangent Hierarchy (NTH).
CoRR, 2020

A type of generalization error induced by initialization in deep neural networks.
Proceedings of Mathematical and Scientific Machine Learning, 2020

2019
Theory of the Frequency Principle for General Deep Neural Networks.
CoRR, 2019

Explicitizing an Implicit Bias of the Frequency Principle in Two-layer Neural Networks.
CoRR, 2019

Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks.
CoRR, 2019

2016
Energy Scaling and Asymptotic Properties of Step Bunching in Epitaxial Growth with Elasticity Effects.
Multiscale Model. Simul., 2016

