Mher Safaryan

Orcid: 0000-0001-6290-1398

According to our database, Mher Safaryan authored at least 13 papers between 2020 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
AsGrad: A Sharp Unified Analysis of Asynchronous-SGD Algorithms.
CoRR, 2023

Knowledge Distillation Performs Partial Variance Reduction.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

2022
GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity.
CoRR, 2022

Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation.
CoRR, 2022

Theoretically Better and Numerically Faster Distributed Optimization with Smoothness-Aware Quantization Techniques.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

FedNL: Making Newton-Type Methods Applicable to Federated Learning.
Proceedings of the International Conference on Machine Learning, 2022

Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2022

2021
Smoothness-Aware Quantization Techniques.
CoRR, 2021

Smoothness Matrices Beat Smoothness Constants: Better Communication Compression Techniques for Distributed Optimization.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Stochastic Sign Descent Methods: New Algorithms and Better Theory.
Proceedings of the 38th International Conference on Machine Learning, 2021

2020
Optimal Gradient Compression for Distributed and Federated Learning.
CoRR, 2020

On Biased Compression for Distributed Learning.
CoRR, 2020

Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor.
CoRR, 2020

