Itay Safran

Affiliations:
  • Princeton University, NJ, USA
  • Weizmann Institute of Science, Rehovot, Israel


According to our database, Itay Safran authored at least 13 papers between 2016 and 2024.

Bibliography

2024
Depth Separations in Neural Networks: Separating the Dimension from the Accuracy.
CoRR, 2024

How Many Neurons Does it Take to Approximate the Maximum?
Proceedings of the 2024 ACM-SIAM Symposium on Discrete Algorithms, 2024

2022
On the Effective Number of Linear Regions in Shallow Univariate ReLU Networks: Convergence Guarantees and Implicit Bias.
Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022

Optimization-Based Separations for Neural Networks.
Proceedings of the Conference on Learning Theory, London, UK, 2022

2021
Random Shuffling Beats SGD Only After Many Epochs on Ill-Conditioned Problems.
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021

The Effects of Mild Over-parameterization on the Optimization Landscape of Shallow ReLU Neural Networks.
Proceedings of the Conference on Learning Theory, 2021

2020
How Good is SGD with Random Shuffling?
Proceedings of the Conference on Learning Theory, 2020

2019
A Simple Explanation for the Existence of Adversarial Examples with Small Hamming Distance.
CoRR, 2019

Depth Separations in Neural Networks: What is Actually Being Separated?
Proceedings of the Conference on Learning Theory, 2019

2018
Spurious Local Minima are Common in Two-Layer ReLU Neural Networks.
Proceedings of the 35th International Conference on Machine Learning, 2018

2017
Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks.
Proceedings of the 34th International Conference on Machine Learning, 2017

2016
Depth Separation in ReLU Networks for Approximating Smooth Non-Linear Functions.
CoRR, 2016

On the Quality of the Initial Basin in Overspecified Neural Networks.
Proceedings of the 33rd International Conference on Machine Learning, 2016

