Michael Crawshaw

According to our database, Michael Crawshaw authored at least 11 papers between 2020 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability.
CoRR, June 2025

Local Steps Speed Up Local GD for Heterogeneous Distributed Logistic Regression.
Proceedings of the Thirteenth International Conference on Learning Representations, 2025

Complexity Lower Bounds of Adaptive Gradient Algorithms for Non-convex Stochastic Optimization under Relaxed Smoothness.
Proceedings of the Thirteenth International Conference on Learning Representations, 2025

2024
Federated Learning under Periodic Client Participation and Heterogeneous Data: A New Communication-Efficient Algorithm and Analysis.
Proceedings of the Advances in Neural Information Processing Systems 37: Annual Conference on Neural Information Processing Systems 2024, 2024

Provable Benefits of Local Steps in Heterogeneous Federated Learning for Neural Networks: A Feature Learning Perspective.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

2023
Federated Learning with Client Subsampling, Data Heterogeneity, and Unbounded Smoothness: A New Algorithm and Lower Bounds.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

EPISODE: Episodic Gradient Clipping with Periodic Resampled Corrections for Federated Learning with Heterogeneous Data.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

2022
Robustness to Unbounded Smoothness of Generalized SignSGD.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Fast Composite Optimization and Statistical Recovery in Federated Learning.
Proceedings of the Thirty-ninth International Conference on Machine Learning, 2022

2021
SLAW: Scaled Loss Approximate Weighting for Efficient Multi-Task Learning.
CoRR, 2021

2020
Multi-Task Learning with Deep Neural Networks: A Survey.
CoRR, 2020
