Junseo Cha

ORCID: 0009-0004-4415-8552

According to our database, Junseo Cha authored at least 4 papers between 2023 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2025
Hybe: GPU-NPU Hybrid System for Efficient LLM Inference with Million-Token Context Window.
Proceedings of the 52nd Annual International Symposium on Computer Architecture, 2025

2024
A Latency Processing Unit: A Latency-Optimized and Highly Scalable Processor for Large Language Model Inference.
IEEE Micro, 2024

LPU: A Latency-Optimized and Highly Scalable Processor for Large Language Model Inference.
CoRR, 2024

2023
HyperAccel Latency Processing Unit (LPU™) Accelerating Hyperscale Models for Generative AI.
Proceedings of the 35th IEEE Hot Chips Symposium, 2023