Jingkai He
ORCID: 0009-0005-9024-7588
Bibliography
2025
CoRR, August 2025

InferLog: Accelerating LLM Inference for Online Log Parsing via ICL-oriented Prefix Caching.
CoRR, July 2025

HeteroPod: XPU-Accelerated Infrastructure Offloading for Commodity Cloud-Native Applications.
CoRR, March 2025

Proceedings of the ACM SIGOPS 31st Symposium on Operating Systems Principles (SOSP), 2025

LLMConf: Knowledge-Enhanced Configuration Optimization for Large Language Model Inference.
Proceedings of the 33rd IEEE/ACM International Symposium on Quality of Service (IWQoS), 2025