Matthieu Wyart

ORCID: 0000-0003-0644-0990

According to our database, Matthieu Wyart authored at least 27 papers between 2010 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
A Phase Transition in Diffusion Models Reveals the Hierarchical Nature of Data.
CoRR, 2024

2023
How deep convolutional neural networks lose spatial information with training.
Mach. Learn. Sci. Technol., December, 2023

On the different regimes of Stochastic Gradient Descent.
CoRR, 2023

How Deep Neural Networks Learn Compositional Data: The Random Hierarchy Model.
CoRR, 2023

Dissecting the Effects of SGD Noise in Distinct Regimes of Deep Learning.
Proceedings of the International Conference on Machine Learning, 2023

What Can Be Learnt With Wide Convolutional Neural Networks?
Proceedings of the International Conference on Machine Learning, 2023

2022
How Wide Convolutional Neural Networks Learn Hierarchical Tasks.
CoRR, 2022

Failure and success of the spectral bias prediction for Kernel Ridge Regression: the case of low-dimensional data.
CoRR, 2022

Learning sparse features can lead to overfitting in neural networks.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Failure and success of the spectral bias prediction for Laplace Kernel Ridge Regression: the case of low-dimensional data.
Proceedings of the International Conference on Machine Learning, 2022

2021
How isotropic kernels perform on simple invariants.
Mach. Learn. Sci. Technol., 2021

How memory architecture affects performance and learning in simple POMDPs.
CoRR, 2021

Relative stability toward diffeomorphisms in deep nets indicates performance.
CoRR, 2021

Relative stability toward diffeomorphisms indicates performance in deep nets.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Locality defeats the curse of dimensionality in convolutional teacher-student scenarios.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

2020
Direct coupling analysis of epistasis in allosteric materials.
PLoS Comput. Biol., 2020

Perspective: A Phase Diagram for Deep Learning unifying Jamming, Feature Learning and Lazy Training.
CoRR, 2020

Compressing invariant manifolds in neural nets.
CoRR, 2020

How isotropic kernels learn simple invariants.
CoRR, 2020

2019
Disentangling feature and lazy learning in deep neural networks: an empirical study.
CoRR, 2019

Asymptotic learning curves of kernel methods: empirical data vs. Teacher-Student paradigm.
CoRR, 2019

Scaling description of generalization with number of parameters in deep learning.
CoRR, 2019

2018
A jamming transition from under- to over-parametrization affects loss landscape and generalization.
CoRR, 2018

The jamming transition as a paradigm to understand the loss landscape of deep neural networks.
CoRR, 2018

Comparing Dynamics: Deep Neural Networks versus Glassy Systems.
Proceedings of the 35th International Conference on Machine Learning, 2018

2013
Simulations of driven overdamped frictionless hard spheres.
Comput. Phys. Commun., 2013

2010
Evaluating Gene Expression Dynamics Using Pairwise RNA FISH Data.
PLoS Comput. Biol., 2010
