Yijia Shao

According to our database, Yijia Shao authored at least 14 papers between 2022 and 2024.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2024
Quiet-STaR: Language Models Can Teach Themselves to Think Before Speaking.
CoRR, 2024

Assisting in Writing Wikipedia-like Articles From Scratch with Large Language Models.
CoRR, 2024

2023
Class Incremental Learning via Likelihood Ratio Based Task Prediction.
CoRR, 2023

Continual Learning of Language Models.
CoRR, 2023

Continual Pre-training of Language Models.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

Class-Incremental Learning based on Label Generation.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), 2023

AnaMeta: A Table Understanding Dataset of Field Metadata Knowledge Shared by Multi-dimensional Data Analysis Tasks.
Findings of the Association for Computational Linguistics: ACL 2023, 2023

ACCENT: An Automatic Event Commonsense Evaluation Metric for Open-Domain Dialogue Systems.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2023

2022
LUNA: Language Understanding with Number Augmentations on Transformers via Number Plugins and Pre-training.
CoRR, 2022

Inferring Tabular Analysis Metadata by Infusing Distribution and Knowledge Information.
CoRR, 2022

CMG: A Class-Mixed Generation Approach to Out-of-Distribution Detection.
Machine Learning and Knowledge Discovery in Databases, 2022

FormLM: Recommending Creation Ideas for Online Forms by Modelling Semantic and Structural Information.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

Adapting a Language Model While Preserving its General Knowledge.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

Continual Training of Language Models for Few-Shot Learning.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

