Wei Wang

ORCID: 0000-0002-7028-9845

Affiliations:
  • Alibaba Group Inc, Hangzhou, China


According to our database, Wei Wang authored at least 29 papers between 2018 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Polyp-DAM: Polyp segmentation via depth anything model.
CoRR, 2024

2023
Achieving Human Parity on Visual Question Answering.
ACM Trans. Inf. Syst., 2023

ChatPLUG: Open-Domain Generative Dialogue System with Internet-Augmented Instruction Tuning for Digital Human.
CoRR, 2023

RRHF: Rank Responses to Align Language Models with Human Feedback without tears.
CoRR, 2023

How well do Large Language Models perform in Arithmetic tasks?
CoRR, 2023

Molecular Geometry-aware Transformer for accurate 3D Atomic System modeling.
CoRR, 2023

mPLUG-2: A Modularized Multi-modal Foundation Model Across Text, Image and Video.
Proceedings of the International Conference on Machine Learning, 2023

PEER: Pre-training ELECTRA Extended by Ranking.
Proceedings of the Findings of the Association for Computational Linguistics: ACL 2023, 2023

2022
mPLUG: Effective and Efficient Vision-Language Learning by Cross-modal Skip-connections.
CoRR, 2022

STRONGHOLD: Fast and Affordable Billion-Scale Deep Learning Model Training.
Proceedings of the SC22: International Conference for High Performance Computing, 2022

mPLUG: Effective and Efficient Vision-Language Learning by Cross-modal Skip-connections.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

2021
Achieving Human Parity on Visual Question Answering.
CoRR, 2021

Grid-VLP: Revisiting Grid Features for Vision-Language Pre-training.
CoRR, 2021

SemVLP: Vision-Language Pre-training by Aligning Semantics at Multiple Levels.
CoRR, 2021

VECO: Variable and Flexible Cross-lingual Pre-training for Language Understanding and Generation.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

Addressing Semantic Drift in Generative Question Answering with Auxiliary Extraction.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

StructuralLM: Structural Pre-training for Form Understanding.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

A Unified Pretraining Framework for Passage Ranking and Expansion.
Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence, 2021

2020
VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation.
CoRR, 2020

PALM: Pre-training an Autoencoding&Autoregressive Language Model for Context-conditioned Generation.
CoRR, 2020

StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding.
Proceedings of the 8th International Conference on Learning Representations, 2020

PALM: Pre-training an Autoencoding&Autoregressive Language Model for Context-conditioned Generation.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

Generating Well-Formed Answers by Machine Reading with Stochastic Selector Networks.
Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence, 2020

2019
Symmetric Regularization based BERT for Pair-wise Semantic Reasoning.
CoRR, 2019

StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding.
CoRR, 2019

IDST at TREC 2019 Deep Learning Track: Deep Cascade Ranking with Generation-based Document Expansion and Pre-trained Language Modeling.
Proceedings of the Twenty-Eighth Text REtrieval Conference, 2019

Incorporating External Knowledge into Machine Reading for Generative Question Answering.
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019

A Deep Cascade Model for Multi-Document Reading Comprehension.
Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence, 2019

2018
Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering.
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2018
