Yongjie Lv
Bibliography
2022
Self-Distillation Based on High-level Information Supervision for Compressing End-to-End ASR Model.
Proceedings of the Interspeech 2022, 2022
Compressing Transformer-Based ASR Model by Task-Driven Loss and Attention-Based Multi-Level Feature Distillation.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022