Joonghyuk Hahn

Orcid: 0009-0000-5890-4916

According to our database, Joonghyuk Hahn authored at least 14 papers between 2021 and 2025.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2025
AmpleHate: Amplifying the Attention for Versatile Implicit Hate Detection.
CoRR, May, 2025

URECA: The Chain of Two Minimum Set Cover Problems exists behind Adaptation to Shifts in Semantic Code Search.
CoRR, February, 2025

Advanced code time complexity prediction approach using contrastive learning.
Eng. Appl. Artif. Intell., 2025

TCProF: Time-Complexity Prediction SSL Framework.
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies, 2025

2024
On the Decidability of Infix Inclusion Problem.
Theory Comput. Syst., June, 2024

CodeComplex: A Time-Complexity Dataset for Bilingual Source Codes.
CoRR, 2024

Universal Rewriting Rules for the Parikh Matrix Injectivity Problem.
Proceedings of the Developments in Language Theory - 28th International Conference, 2024

SuperST: Superficial Self-Training for Few-Shot Text Classification.
Proceedings of the 2024 Joint International Conference on Computational Linguistics, 2024

2023
M-equivalence of Parikh Matrix over a Ternary Alphabet.
Proceedings of the Implementation and Application of Automata, 2023

ATHENA: Mathematical Reasoning with Thought Expansion.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

GDA: Grammar-based Data Augmentation for Text Classification using Slot Information.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2023, 2023

2022
Boosting Code Summarization by Embedding Code Structures.
Proceedings of the 29th International Conference on Computational Linguistics, 2022

2021
Self-Training using Rules of Grammar for Few-Shot NLU.
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2021, 2021

Most Pseudo-copy Languages Are Not Context-Free.
Proceedings of the Computing and Combinatorics - 27th International Conference, 2021
