Zhichao Duan

Affiliations:
  • Tsinghua University, Beijing, China


According to our database, Zhichao Duan authored at least 10 papers between 2021 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of five.

Bibliography

2025
COMM: Concentrated Margin Maximization for Robust Document-Level Relation Extraction.
CoRR, March 2025

Negative Matters: Multi-Granularity Hard-Negative Synthesis and Anchor-Token-Aware Pooling for Enhanced Text Embeddings.
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2025

FocusLLM: Precise Understanding of Long Context by Dynamic Condensing.
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2025

COMM: Concentrated Margin Maximization for Robust Document-Level Relation Extraction.
Proceedings of the Thirty-Ninth AAAI Conference on Artificial Intelligence, 2025

2024
FocusLLM: Scaling LLM's Context by Parallel Decoding.
CoRR, 2024

FlexKBQA: A Flexible LLM-Powered Framework for Few-Shot Knowledge Base Question Answering.
Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence, 2024

2023
Toward a Unified Framework for Unsupervised Complex Tabular Reasoning.
Proceedings of the 39th IEEE International Conference on Data Engineering, 2023

2022
Not Just Plain Text! Fuel Document-Level Relation Extraction with Explicit Syntax Refinement and Subsentence Modeling.
Findings of the Association for Computational Linguistics: EMNLP 2022, 2022

2021
Jointly Modeling Fact Triples and Text Information for Knowledge Base Completion.
Proceedings of the 2021 IEEE International Conference on Big Knowledge, 2021

Bridging the Language Gap: Knowledge Injected Multilingual Question Answering.
Proceedings of the 2021 IEEE International Conference on Big Knowledge, 2021
