Zhiyuan Wang

ORCID: 0000-0001-7167-9055

Affiliations:
  • University of Electronic Science and Technology of China, China


According to our database, Zhiyuan Wang authored at least 10 papers between 2022 and 2023.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2023
Spatial-Temporal Contrasting for Fine-Grained Urban Flow Inference.
IEEE Trans. Big Data, December 2023

Reservoir Inflow Forecasting in Hydropower Industry: A Generative Flow-Based Approach.
IEEE Trans. Ind. Informatics, 2023

Dynamic transformer ODEs for large-scale reservoir inflow forecasting.
Knowl. Based Syst., 2023

DyCVAE: Learning Dynamic Causal Factors for Non-stationary Series Domain Generalization (Student Abstract).
Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence, 2023

Learning Dynamic Temporal Relations with Continuous Graph for Multivariate Time Series Forecasting (Student Abstract).
Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence, 2023

2022
HydroFlow: Towards probabilistic electricity demand prediction using variational autoregressive models and normalizing flows.
Int. J. Intell. Syst., 2022

Learning Latent Seasonal-Trend Representations for Time Series Forecasting.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

Connecting the Hosts: Street-Level IP Geolocation with Graph Neural Networks.
Proceedings of KDD '22: The 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 2022

Large-Scale IP Usage Identification via Deep Ensemble Learning (Student Abstract).
Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, 2022

PrEF: Probabilistic Electricity Forecasting via Copula-Augmented State Space Model.
Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, 2022
