Yuzhe Zi

According to our database, Yuzhe Zi authored at least 4 papers between 2024 and 2025.
Bibliography
2025
Balancing Forget Quality and Model Utility: A Reverse KL-Divergence Knowledge Distillation Approach for Better Unlearning in LLMs.
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies, 2025
2024
RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models.
CoRR, 2024
Proceedings of the 2024 Joint International Conference on Computational Linguistics, 2024
Scale-CoT: Integrating LLM with Psychiatric Scales for Analyzing Mental Health Issues on Social Media.
Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine, 2024