Chi-Min Chan

According to our database, Chi-Min Chan authored at least 9 papers between 2021 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
Parameter-efficient fine-tuning of large-scale pre-trained language models.
Nat. Mach. Intell., March 2023

AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors in Agents.
CoRR, 2023

ChatEval: Towards Better LLM-based Evaluators through Multi-Agent Debate.
CoRR, 2023

Arbitrary Few Parameters are Good Enough for Adapting Large-scale Pre-trained Language Models.
CoRR, 2023

Exploring the Impact of Model Scaling on Parameter-Efficient Tuning.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

Plug-and-Play Document Modules for Pre-trained Models.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023

2022
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models.
CoRR, 2022

On Transferability of Prompt Tuning for Natural Language Processing.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022

2021
On Transferability of Prompt Tuning for Natural Language Understanding.
CoRR, 2021
