Yue Dong

ORCID: 0000-0003-2161-8566

Affiliations:
  • University of California, Riverside, CA, USA
  • McGill University, School of Computer Science, Montreal, QC, Canada (former)
  • Montreal Institute of Learning Algorithms, QC, Canada (former)
  • University of Ottawa, Department of Mathematics and Statistics, ON, Canada (former)


According to our database, Yue Dong authored at least 23 papers between 2016 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2024
Mechanisms of non-factual hallucinations in language models.
CoRR, 2024

2023
Fast Text Generation with Text-Editing Models.
Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023

2022
Text Generation with Text-Editing Models.
CoRR, 2022

Faithful to the Document or to the World? Mitigating Hallucinations via Entity-Linked Knowledge in Abstractive Summarization.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2022, 2022

Learning with Rejection for Abstractive Text Summarization.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

Hallucinated but Factual! Inspecting the Factuality of Hallucinations in Abstractive Summarization.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022

2021
Inspecting the Factuality of Hallucinated Entities in Abstractive Summarization.
CoRR, 2021

On-the-Fly Attention Modularization for Neural Generation.
CoRR, 2021

Discourse-Aware Unsupervised Summarization for Long Scientific Documents.
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, 2021

Bringing Structure into Summaries: a Faceted Summarization Dataset for Long Scientific Documents.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

On-the-Fly Attention Modulation for Neural Generation.
Proceedings of the Findings of the Association for Computational Linguistics: ACL/IJCNLP 2021, 2021

2020
HipoRank: Incorporating Hierarchical and Positional Information into Graph-based Unsupervised Long Document Extractive Summarization.
CoRR, 2020

Multi-XScience: A Large-scale Dataset for Extreme Multi-document Summarization of Scientific Articles.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

Multi-Fact Correction in Abstractive Text Summarization.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

Factual Error Correction for Abstractive Summarization Models.
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020

2019
Countering the Effects of Lead Bias in News Summarization via Multi-Stage Training and Auxiliary Losses.
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019

EditNTS: An Neural Programmer-Interpreter Model for Sentence Simplification through Explicit Editing.
Proceedings of the 57th Conference of the Association for Computational Linguistics, 2019

Learning Multi-Task Communication with Message Passing for Sequence Learning.
Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence, 2019

2018
Multi-task Learning over Graph Structures.
CoRR, 2018

Threaded ensembles of autoencoders for stream learning.
Comput. Intell., 2018

A Hierarchical Neural Attention-based Text Classifier.
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018

BanditSum: Extractive Summarization as a Contextual Bandit.
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018

2016
Threaded Ensembles of Supervised and Unsupervised Neural Networks for Stream Learning.
Proceedings of the Advances in Artificial Intelligence, 2016
