Qing Xu

ORCID: 0000-0002-9202-1073

Affiliations:
  • Institute for Infocomm Research, Singapore


According to our database, Qing Xu authored at least 10 papers between 2022 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2025
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks.
IEEE Trans. Neural Networks Learn. Syst., April, 2025

2024
Reinforced Knowledge Distillation for Time Series Regression.
IEEE Trans. Artif. Intell., June, 2024

LLM-based Knowledge Pruning for Time Series Data Analytics on Edge-computing Devices.
CoRR, 2024

From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks.
CoRR, 2024

Improve Knowledge Distillation via Label Revision and Data Selection.
CoRR, 2024

Reinforced Cross-Domain Knowledge Distillation on Time Series Data.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

2023
A Hybrid Ensemble Deep Learning Approach for Early Prediction of Battery Remaining Useful Life.
IEEE CAA J. Autom. Sinica, 2023

Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data.
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, 2023

2022
KDnet-RUL: A Knowledge Distillation Framework to Compress Deep Neural Networks for Machine Remaining Useful Life Prediction.
IEEE Trans. Ind. Electron., 2022

Contrastive adversarial knowledge distillation for deep model compression in time-series regression tasks.
Neurocomputing, 2022
