Joseph Jennings

ORCID: 0009-0009-2370-2834

According to our database, Joseph Jennings authored at least 11 papers between 2024 and 2025.

Bibliography

2025
NVIDIA Nemotron Nano 2: An Accurate and Efficient Hybrid Mamba-Transformer Reasoning Model.
CoRR, August, 2025

Llama-Nemotron: Efficient Reasoning Models.
CoRR, May, 2025

Nemotron-H: A Family of Accurate and Efficient Hybrid Mamba-Transformer Models.
CoRR, April, 2025

Training Video Foundation Models with NVIDIA NeMo.
CoRR, March, 2025

Éclair - Extracting Content and Layout with Integrated Reading Order for Documents.
CoRR, February, 2025

Enhanced Soups for Graph Neural Networks.
Proceedings of the 2025 IEEE International Parallel and Distributed Processing Symposium, 2025

Nemotron-CC: Transforming Common Crawl into a Refined Long-Horizon Pretraining Dataset.
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2025

2024
Data, Data Everywhere: A Guide for Pretraining Dataset Construction.
CoRR, 2024

Nemotron-4 340B Technical Report.
CoRR, 2024

Nemotron-4 15B Technical Report.
CoRR, 2024

Data, Data Everywhere: A Guide for Pretraining Dataset Construction.
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024
