Gang Wang
ORCID: 0009-0003-6944-2958
Affiliations:
- School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai, China
According to our database, Gang Wang authored at least 10 papers between 2023 and 2025.
Bibliography
2025
OFQ-LLM: Outlier-Flexing Quantization for Efficient Low-Bit Large Language Model Acceleration.
IEEE Trans. Circuits Syst. I Regul. Pap., August, 2025
An Efficient Multi-View Cross-Attention Accelerator for Vision-Centric 3D Perception in Autonomous Driving.
IEEE Trans. Circuits Syst. I Regul. Pap., July, 2025
Efficient Hardware Architecture Design for Rotary Position Embedding of Large Language Models.
IEEE J. Emerg. Sel. Topics Circuits Syst., June, 2025
Adaptive Two-Range Quantization and Hardware Co-Design for Large Language Model Acceleration.
IEEE J. Emerg. Sel. Topics Circuits Syst., June, 2025
COSA Plus: Enhanced Co-Operative Systolic Arrays for Attention Mechanism in Transformers.
IEEE Trans. Comput. Aided Des. Integr. Circuits Syst., February, 2025
2024
BSViT: A Bit-Serial Vision Transformer Accelerator Exploiting Dynamic Patch and Weight Bit-Group Quantization.
IEEE Trans. Circuits Syst. I Regul. Pap., September, 2024
Hardware-oriented algorithms for softmax and layer normalization of large language models.
Sci. China Inf. Sci., 2024
DEFA: Efficient Deformable Attention Acceleration via Pruning-Assisted Grid-Sampling and Multi-Scale Parallel Processing.
Proceedings of the 61st ACM/IEEE Design Automation Conference, 2024
2023
Low-Complexity Precision-Scalable Multiply-Accumulate Unit Architectures for Deep Neural Network Accelerators.
IEEE Trans. Circuits Syst. II Express Briefs, April, 2023
COSA: Co-Operative Systolic Arrays for Multi-head Attention Mechanism in Neural Network using Hybrid Data Reuse and Fusion Methodologies.
Proceedings of the 60th ACM/IEEE Design Automation Conference, 2023