Yasaman Bahri

Affiliations:
  • Google, Mountain View, CA, USA


According to our database, Yasaman Bahri authored at least 15 papers between 2017 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Quantum Many-Body Physics Calculations with Large Language Models.
CoRR, 2024

2023
Les Houches Lectures on Deep Learning at Large & Infinite Width.
CoRR, 2023

2022
The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning.
Trans. Mach. Learn. Res., 2022

2021
Explaining Neural Scaling Laws.
CoRR, 2021

2020
Exact posterior distributions of wide Bayesian neural networks.
CoRR, 2020

The large learning rate phase of deep learning: the catapult mechanism.
CoRR, 2020

Infinite attention: NNGP and NTK for deep attention networks.
Proceedings of the 37th International Conference on Machine Learning, 2020

2019
Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent.
CoRR, 2019

Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent.
Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019

Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes.
Proceedings of the 7th International Conference on Learning Representations, 2019

2018
Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes.
CoRR, 2018

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks.
Proceedings of the 35th International Conference on Machine Learning, 2018

Sensitivity and Generalization in Neural Networks: an Empirical Study.
Proceedings of the 6th International Conference on Learning Representations, 2018

Deep Neural Networks as Gaussian Processes.
Proceedings of the 6th International Conference on Learning Representations, 2018

2017
Geometry of Neural Network Loss Surfaces via Random Matrix Theory.
Proceedings of the 34th International Conference on Machine Learning, 2017
