Jaehoon Lee

Affiliations:
  • Google Brain


According to our database, Jaehoon Lee authored at least 18 papers between 2018 and 2023.

Bibliography

2023
Beyond Human Data: Scaling Self-Training for Problem-Solving with Language Models.
CoRR, 2023

Frontier Language Models are not Robust to Adversarial Arithmetic, or "What do I need to say so you agree 2+2=5?"
CoRR, 2023

Small-scale proxies for large-scale Transformer training instabilities.
CoRR, 2023

Kernel Regression with Infinite-Width Neural Networks on Millions of Examples.
CoRR, 2023

2022
Fast Neural Kernel Embeddings for General Activations.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

2021
Explaining Neural Scaling Laws.
CoRR, 2021

Dataset Distillation with Infinitely Wide Convolutional Networks.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Exploring the Uncertainty Properties of Neural Networks' Implicit Priors in the Infinite-Width Limit.
Proceedings of the 9th International Conference on Learning Representations, 2021

2020
Towards NNGP-guided Neural Architecture Search.
CoRR, 2020

On the infinite width limit of neural networks with a standard parameterization.
CoRR, 2020

Finite Versus Infinite Neural Networks: an Empirical Study.
Proceedings of the Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020

Neural Tangents: Fast and Easy Infinite Neural Networks in Python.
Proceedings of the 8th International Conference on Learning Representations, 2020

2019
Measuring the Effects of Data Parallelism on Neural Network Training.
J. Mach. Learn. Res., 2019

Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent.
CoRR, 2019

Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent.
Proceedings of the Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, 2019

Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes.
Proceedings of the 7th International Conference on Learning Representations, 2019

2018
Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes.
CoRR, 2018

Deep Neural Networks as Gaussian Processes.
Proceedings of the 6th International Conference on Learning Representations, 2018
