Wenyang Hu

Orcid: 0009-0008-6189-7890

According to our database, Wenyang Hu authored at least 13 papers between 2020 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Modeling of Interturn Short Fault in Interior Permanent Magnet Synchronous Motors With Multistrand Windings.
IEEE Trans. Ind. Electron., November, 2024

Transparent Operator Network: A Fully Interpretable Network Incorporating Learnable Wavelet Operator for Intelligent Fault Diagnosis.
IEEE Trans. Ind. Informatics, June, 2024

Data-Centric AI in the Age of Large Language Models.
CoRR, 2024

Prompt Optimization with EASE? Efficient Ordering-aware Automated Selection of Exemplars.
CoRR, 2024

Localized Zeroth-Order Prompt Optimization.
CoRR, 2024

Use Your INSTINCT: INSTruction optimization for LLMs usIng Neural bandits Coupled with Transformers.
Proceedings of the Forty-first International Conference on Machine Learning, 2024

2023
Novel Ramanujan Digital Twin for Motor Periodic Fault Monitoring and Detection.
IEEE Trans. Ind. Informatics, December, 2023

Matching contrastive learning: An effective and intelligent method for wind turbine fault diagnosis with imbalanced SCADA data.
Expert Syst. Appl., August, 2023

A Spatial-Temporal Transformer based Framework For Human Pose Assessment And Correction in Education Scenarios.
CoRR, 2023

Use Your INSTINCT: INSTruction optimization usIng Neural bandits Coupled with Transformers.
CoRR, 2023

A Wasserstein generative digital twin model in health monitoring of rotating machines.
Comput. Ind., 2023

2022
Fault Feature Recovery With Wasserstein Generative Adversarial Imputation Network With Gradient Penalty for Rotating Machine Health Monitoring Under Signal Loss Condition.
IEEE Trans. Instrum. Meas., 2022

2020
GTC: Guided Training of CTC towards Efficient and Accurate Scene Text Recognition.
Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence, 2020
