Aditya Gupta

Orcid: 0000-0002-3730-8944

Affiliations:
  • xAI, Palo Alto, CA, USA
  • Essential AI, San Francisco, CA, USA (former)
  • Google Inc., Mountain View, CA, USA (former)
  • University of Texas at Austin, TX, USA (former)
  • Indian Institute of Technology Guwahati, Department of Mathematics, Guwahati, India (former)


According to our database, Aditya Gupta authored at least 15 papers between 2017 and 2024.

Bibliography

2024
AutoMix: Automatically Mixing Language Models.
Proceedings of the Advances in Neural Information Processing Systems 38: Annual Conference on Neural Information Processing Systems 2024, 2024

2023
Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models.
Trans. Mach. Learn. Res., 2023

How FaR Are Large Language Models From Agents with Theory-of-Mind?
CoRR, 2023

PRESTO: A Multilingual Dataset for Parsing Realistic Task-Oriented Dialogs.
CoRR, 2023

Can Sequence-to-Sequence Transformers Naturally Understand Sequential Instructions?
Proceedings of the 12th Joint Conference on Lexical and Computational Semantics, 2023

PRESTO: A Multilingual Dataset for Parsing Realistic Task-Oriented Dialogs.
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023

Efficient Encoders for Streaming Sequence Tagging.
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, 2023

2022
Improving Top-K Decoding for Non-Autoregressive Semantic Parsing via Intent Conditioning.
Proceedings of the 29th International Conference on Computational Linguistics, 2022

TableFormer: Robust Transformer Modeling for Table-Text Encoding.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022

2021
TIMEDIAL: Temporal Commonsense Reasoning in Dialog.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

Disfl-QA: A Benchmark Dataset for Understanding Disfluencies in Question Answering.
Proceedings of the Findings of the Association for Computational Linguistics: ACL/IJCNLP 2021, 2021

2019
Effective Use of Transformer Networks for Entity Tracking.
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019

Tracking Discrete and Continuous Entity State for Process Understanding.
Proceedings of the Third Workshop on Structured Prediction for NLP@NAACL-HLT 2019, 2019

2018
Uncertain fuzzy self-organization based clustering: interval type-2 fuzzy approach to adaptive resonance theory.
Inf. Sci., 2018

2017
Principal component analysis approach in selecting type-1 and type-2 fuzzy membership functions for high-dimensional data.
Proceedings of the Joint 17th World Congress of International Fuzzy Systems Association and 9th International Conference on Soft Computing and Intelligent Systems, 2017
