Shengxiang Hu

ORCID: 0009-0001-9988-404X

Affiliations:
  • Nanjing University of Science and Technology, School of Computer Science and Engineering, Nanjing, China


According to our database, Shengxiang Hu authored at least 10 papers between 2023 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2025
Diffusion-Refinement Pose Estimation With Hybrid Representation.
IEEE Trans. Instrum. Meas., 2025

LAL: Enhancing 3D Human Motion Prediction with Latency-aware Auxiliary Learning.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2025

ALIEN: Implicit Neural Representations for Human Motion Prediction under Arbitrary Latency.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2025

2024
Continuous Heatmap Regression for Pose Estimation via Implicit Neural Representation.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

NeRM: Learning Neural Representations for High-Framerate Human Motion Synthesis.
Proceedings of the Twelfth International Conference on Learning Representations, 2024

NeRMo: Learning Implicit Neural Representations for 3D Human Motion Prediction.
Proceedings of the Computer Vision - ECCV 2024, 2024

Fast Adaptation for Human Pose Estimation via Meta-Optimization.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024

Enhanced Fine-Grained Motion Diffusion for Text-Driven Human Motion Synthesis.
Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence, 2024

2023
Understanding Text-driven Motion Synthesis with Keyframe Collaboration via Diffusion Models.
CoRR, 2023

Human Joint Kinematics Diffusion-Refinement for Stochastic Motion Prediction.
Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence, 2023
