Gen Li

Affiliations:
  • Clemson University, Clemson, SC, USA


According to our database, Gen Li authored at least 13 papers between 2023 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
The Right to be Forgotten in Pruning: Unveil Machine Unlearning on Sparse Models.
CoRR, July 2025

Sculpting Memory: Multi-Concept Forgetting in Diffusion Models via Dynamic Mask and Concept-Aware Optimization.
CoRR, April 2025

Optimal Transport for Brain-Image Alignment: Unveiling Redundancy and Synergy in Neural Information Processing.
CoRR, March 2025

Adversarial Robust ViT-Based Automatic Modulation Recognition in Practical Deep Learning-Based Wireless Systems.
Proceedings of the IEEE Symposium on Security and Privacy, 2025

2024
Condense, Don't Just Prune: Enhancing Efficiency and Performance in MoE Layer Pruning.
CoRR, 2024

A Single-Step, Sharpness-Aware Minimization is All You Need to Achieve Efficient and Accurate Sparse Training.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

Advancing Dynamic Sparse Training by Exploring Optimization Opportunities.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

NeurRev: Train Better Sparse Neural Network Practically via Neuron Revitalization.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

Data Overfitting for On-device Super-Resolution with Dynamic Algorithm and Compiler Co-design.
Proceedings of Computer Vision - ECCV 2024, 2024

2023
Coupling Fairness and Pruning in a Single Run: a Bi-level Optimization Perspective.
CoRR, 2023

Dynamic Sparsity Is Channel-Level Sparsity Learner.
Proceedings of the Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, 2023

Towards High-Quality and Efficient Video Super-Resolution via Spatial-Temporal Data Overfitting.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023
