Berfin Simsek

According to our database, Berfin Simsek authored at least 13 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
The Loss Landscape of Shallow ReLU-like Neural Networks: Stationary Points, Saddle Escaping, and Network Embedding.
CoRR, 2024

Expand-and-Cluster: Parameter Recovery of Neural Networks.
Proceedings of the 41st International Conference on Machine Learning, 2024

Learning Associative Memories with Gradient Descent.
Proceedings of the 41st International Conference on Machine Learning, 2024

2023
Expand-and-Cluster: Exact Parameter Recovery of Neural Networks.
CoRR, 2023

MLPGradientFlow: going with the flow of multilayer perceptrons (and finding minima fast and accurately).
CoRR, 2023

Should Under-parameterized Student Networks Copy or Average Teacher Weights?
Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023

2022
Understanding out-of-distribution accuracies through quantifying difficulty of test samples.
CoRR, 2022

2021
Deep Linear Networks Dynamics: Low-Rank Biases Induced by Initialization Scale and L2 Regularization.
CoRR, 2021

Geometry of the Loss Landscape in Overparameterized Neural Networks: Symmetries and Invariances.
Proceedings of the 38th International Conference on Machine Learning, 2021

2020
Kernel Alignment Risk Estimator: Risk Prediction from Training Data.
Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020

Implicit Regularization of Random Feature Models.
Proceedings of the 37th International Conference on Machine Learning, 2020

2019
Weight-space symmetry in deep networks gives rise to permutation saddles, connected by equal-loss valleys across the loss landscape.
CoRR, 2019

Online Bounded Component Analysis: A Simple Recurrent Neural Network with Local Update Rule for Unsupervised Separation of Dependent and Independent Sources.
Proceedings of the 53rd Asilomar Conference on Signals, Systems, and Computers, 2019
