Ahmed Khaled

ORCID: 0000-0002-2217-4832

Affiliations:
  • Princeton University, NJ, USA
  • Cairo University, Egypt (former)


According to our database, Ahmed Khaled authored at least 16 papers between 2019 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
Directional Smoothness and Gradient Methods: Convergence and Adaptivity.
CoRR, 2024

Tuning-Free Stochastic Optimization.
CoRR, 2024

2023
Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization.
J. Optim. Theory Appl., November 2023

Better Theory for SGD in the Nonconvex World.
Trans. Mach. Learn. Res., 2023

DoWG Unleashed: An Efficient Universal Parameter-Free Gradient Descent Method.
Proceedings of Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023

Faster federated optimization under second-order similarity.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

2022
Federated Optimization Algorithms with Random Reshuffling and Gradient Compression.
CoRR, 2022

Proximal and Federated Random Reshuffling.
Proceedings of the International Conference on Machine Learning, 2022

FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2022

2020
Applying fast matrix multiplication to neural networks.
Proceedings of the 35th ACM/SIGAPP Symposium on Applied Computing (SAC '20), Brno, Czech Republic (online event), 2020

Random Reshuffling: Simple Analysis with Vast Improvements.
Proceedings of Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020

Tighter Theory for Local SGD on Identical and Heterogeneous Data.
Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics, 2020

2019
Distributed Fixed Point Methods with Compressed Iterates.
CoRR, 2019

Better Communication Complexity for Local SGD.
CoRR, 2019

Gradient Descent with Compressed Iterates.
CoRR, 2019

First Analysis of Local GD on Heterogeneous Data.
CoRR, 2019
