Eldar Kurtic

According to our database, Eldar Kurtic authored at least 17 papers between 2021 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
How to Prune Your Language Model: Recovering Accuracy on the "Sparsity May Cry" Benchmark.
CoRR, 2023

Sparse Fine-tuning for Inference Acceleration of Large Language Models.
CoRR, 2023

Accurate Neural Network Pruning Requires Rethinking Sparse Optimization.
CoRR, 2023

Error Feedback Can Accurately Compress Preconditioners.
CoRR, 2023

Vision Models Can Be Efficiently Specialized via Few-Shot Task-Aware Compression.
CoRR, 2023

SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks.
CoRR, 2023

ZipLM: Hardware-Aware Structured Pruning of Language Models.
CoRR, 2023

CAP: Correlation-Aware Pruning for Highly-Accurate Sparse Vision Models.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

ZipLM: Inference-Aware Structured Pruning of Language Models.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge.
Proceedings of the International Conference on Machine Learning, 2023

CrAM: A Compression-Aware Minimizer.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

2022
oViT: An Accurate Second-Order Pruning Framework for Vision Transformers.
CoRR, 2022

GMP*: Well-Tuned Global Magnitude Pruning Can Outperform Most BERT-Pruning Methods.
CoRR, 2022

Vision for Bosnia and Herzegovina in Artificial Intelligence Age: Global Trends, Potential Opportunities, Selected Use-cases and Realistic Goals.
CoRR, 2022

The Optimal BERT Surgeon: Scalable and Accurate Second-Order Pruning for Large Language Models.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

2021
Efficient Matrix-Free Approximations of Second-Order Information, with Applications to Pruning and Optimization.
CoRR, 2021

M-FAC: Efficient Matrix-Free Approximations of Second-Order Information.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021
