Haoyang Qu
ORCID: 0009-0008-7893-6163
According to our database, Haoyang Qu authored at least 4 papers between 2024 and 2025.
Bibliography
2025
IEEE Trans. Computers, August 2025
IMPRESS: An Importance-Informed Multi-Tier Prefix KV Storage System for Large Language Model Inference.
Proceedings of the 23rd USENIX Conference on File and Storage Technologies, 2025
LeapGNN: Accelerating Distributed GNN Training Leveraging Feature-Centric Model Migration.
Proceedings of the 23rd USENIX Conference on File and Storage Technologies, 2025
2024
HopGNN: Boosting Distributed GNN Training Efficiency via Feature-Centric Model Migration.
CoRR, 2024