Haoran Huang

Orcid: 0000-0003-3293-5380

According to our database, Haoran Huang authored at least 27 papers between 2015 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Emotion, Cultural Valuation of Being Human and AI Services.
IEEE Trans. Engineering Management, 2024

DACO: Towards Application-Driven and Comprehensive Data Analysis via Code Generation.
CoRR, 2024

2023
End-to-End Monocular Pose Estimation for Uncooperative Spacecraft Based on Direct Regression Network.
IEEE Trans. Aerosp. Electron. Syst., October, 2023

Industrial few-shot fractal object detection.
Neural Comput. Appl., October, 2023

A novel feature and sample joint transfer learning method with feature selection in semi-supervised scenarios for identifying the sequence of some species with less known genetic data.
Soft Comput., May, 2023

Improving Generalization of Alignment with Human Preferences through Group Invariant Learning.
CoRR, 2023

Secrets of RLHF in Large Language Models Part I: PPO.
CoRR, 2023

A Simple LTE-R Resource Allocation Scheme for Relay-Assisted Railway Communications.
Proceedings of the IEEE Wireless Communications and Networking Conference, 2023

Video Noise Removal Using Progressive Decomposition With Conditional Invertibility.
Proceedings of the IEEE International Conference on Multimedia and Expo, 2023

Deep Video Demoiréing via Compact Invertible Dyadic Decomposition.
Proceedings of the IEEE/CVF International Conference on Computer Vision, 2023

Design of Smart Home Monitoring System for Elderly Care Based on Raspberry Pi.
Proceedings of the IEEE International Conference on High Performance Computing & Communications, 2023

2022
Pattern Recognition and Segmentation of Administrative Boundaries Using a One-Dimensional Convolutional Neural Network and Grid Shape Context Descriptor.
ISPRS Int. J. Geo Inf., 2022

Two-Level Supervised Contrastive Learning for Response Selection in Multi-Turn Dialogue.
CoRR, 2022

2021
Building Typification in Map Generalization Using Affinity Propagation Clustering.
ISPRS Int. J. Geo Inf., 2021

2020
Spelling Error Correction with Soft-Masked BERT.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

2019
A classification model for lncRNA and mRNA based on k-mers and a convolutional neural network.
BMC Bioinform., 2019

A Method of Multi-Attribute Decision Making With Double-Reference Points and its Application in Location of Agricultural Products Logistics Center.
IEEE Access, 2019

2017
Predicting Which Topics You Will Join in the Future on Social Media.
Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2017

Hashtag Recommendation for Multimodal Microblog Using Co-Attention Network.
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2017

Mention Recommendation for Twitter with End-to-end Memory Network.
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2017

Part-of-Speech Tagging for Twitter with Adversarial Neural Networks.
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, 2017

2016
A method of fuzzy multiple attribute decision making based on the error-eliminating theory.
J. Intell. Fuzzy Syst., 2016

Analysis of the Learning Mode of the Elaborate Resource Sharing Course.
Int. J. Emerg. Technol. Learn., 2016

Hashtag Recommendation Using End-To-End Memory Networks with Hierarchical Attention.
Proceedings of the COLING 2016, 2016

Retweet Prediction with Attention-based Deep Neural Network.
Proceedings of the 25th ACM International Conference on Information and Knowledge Management, 2016

Query Answering with Inconsistent Existential Rules under Stable Model Semantics.
Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, 2016

2015
Innovative Experiment Platform Design and Teaching Application of the Internet of Things.
Int. J. Online Eng., 2015

