Arnaud Joly

Affiliations:
  • Université de Liège, Belgium


According to our database, Arnaud Joly authored at least 18 papers between 2012 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.


Bibliography

2024
BASE TTS: Lessons from building a billion-parameter Text-to-Speech model on 100K hours of data.
CoRR, 2024

2023
Controllable Emphasis with zero data for text-to-speech.
CoRR, 2023

2022
Simple and Effective Multi-sentence TTS with Expressive and Coherent Prosody.
CoRR, 2022

Simple and Effective Multi-sentence TTS with Expressive and Coherent Prosody.
Proceedings of the Interspeech 2022, 2022

Distribution Augmentation for Low-Resource Expressive Text-To-Speech.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022

2021
Multi-Scale Spectrogram Modelling for Neural Text-to-Speech.
CoRR, 2021

A Learned Conditional Prior for the VAE Acoustic Space of a TTS System.
Proceedings of the Interspeech 2021, 2021

Prosodic Representation Learning and Contextual Sampling for Neural Text-to-Speech.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021

CAMP: A Two-Stage Approach to Modelling Prosody in Context.
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021

2020
CopyCat: Many-to-Many Fine-Grained Prosody Transfer for Neural Text-to-Speech.
Proceedings of the Interspeech 2020, 2020

2019
Gradient tree boosting with random output projections for multi-label classification and multi-output regression.
CoRR, 2019

2017
Exploiting random projections and sparsity with random forests and gradient boosting methods - Application to multi-label and multi-output learning, random forest model compression and leveraging input sparsity.
PhD thesis, 2017

Exploiting random projections and sparsity with random forests and gradient boosting methods - Application to multi-label and multi-output learning, random forest model compression and leveraging input sparsity.
CoRR, 2017

Globally Induced Forest: A Prepruning Compression Scheme.
Proceedings of the 34th International Conference on Machine Learning, 2017

2014
Random Forests with Random Projections of the Output Space for High Dimensional Multi-label Classification.
Proceedings of the Machine Learning and Knowledge Discovery in Databases, 2014

Simple Connectome Inference from Partial Correlation Statistics in Calcium Imaging.
Proceedings of the Neural Connectomics Workshop at ECML 2014, 2014

2013
API design for machine learning software: experiences from the scikit-learn project.
CoRR, 2013

2012
L1-based compression of random forest models.
Proceedings of the 20th European Symposium on Artificial Neural Networks, 2012
