Yunho Jin

Orcid: 0000-0002-0292-3322

According to our database, Yunho Jin authored at least 8 papers between 2021 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
S³: Increasing GPU Utilization during Generative Inference for Higher Throughput.
Proceedings of Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems (NeurIPS), 2023

2022
Architecting a Flash-Based Storage System for Low-Cost Inference of Extreme-Scale DNNs.
IEEE Trans. Computers, 2022

Layerweaver+: A QoS-Aware Layer-Wise DNN Scheduler for Multi-Tenant Neural Processing Units.
IEICE Trans. Inf. Syst., 2022

Bigger&Faster: Two-stage Neural Architecture Search for Quantized Transformer Models.
CoRR, 2022

2021
ASAP: Fast Mobile Application Switch via Adaptive Prepaging.
Proceedings of the 2021 USENIX Annual Technical Conference, 2021

Layerweaver: Maximizing Resource Utilization of Neural Processing Units via Layer-Wise Scheduling.
Proceedings of the IEEE International Symposium on High-Performance Computer Architecture, 2021

Behemoth: A Flash-centric Training Accelerator for Extreme-scale DNNs.
Proceedings of the 19th USENIX Conference on File and Storage Technologies, 2021

FlashNeuron: SSD-Enabled Large-Batch Training of Very Deep Neural Networks.
Proceedings of the 19th USENIX Conference on File and Storage Technologies, 2021
