Wanyi Zheng

According to our database, Wanyi Zheng authored at least 3 papers between 2024 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
BucketServe: Bucket-Based Dynamic Batching for Smart and Efficient LLM Inference Serving.
CoRR, July 2025

2024
UELLM: A Unified and Efficient Approach for LLM Inference Serving.
CoRR, 2024

UELLM: A Unified and Efficient Approach for Large Language Model Inference Serving.
Proceedings of the 22nd International Conference on Service-Oriented Computing, 2024

