David Holzmüller

ORCID: 0000-0002-9443-0049

According to our database, David Holzmüller authored at least 19 papers between 2017 and 2025.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.


Bibliography

2025
Beyond ReLU: How Activations Affect Neural Kernels and Random Wide Networks.
CoRR, June 2025

TabArena: A Living Benchmark for Machine Learning on Tabular Data.
CoRR, June 2025

LOGLO-FNO: Efficient Learning of Local and Global Features in Fourier Neural Operators.
CoRR, April 2025

TabICL: A Tabular Foundation Model for In-Context Learning on Large Data.
CoRR, February 2025

Rethinking Early Stopping: Refine, Then Calibrate.
CoRR, January 2025

Active Learning for Neural PDE Solvers.
Proceedings of the Thirteenth International Conference on Learning Representations, 2025

2024
Better by default: Strong pre-tuned MLPs and boosted trees on tabular data.
Advances in Neural Information Processing Systems 38 (NeurIPS 2024), 2024

2023
Code and Data for: A Framework and Benchmark for Deep Batch Active Learning for Regression [arXiv v3].
Dataset, April 2023

Regression from linear models to neural networks: double descent, active learning, and sampling.
PhD thesis, 2023

A Framework and Benchmark for Deep Batch Active Learning for Regression.
J. Mach. Learn. Res., 2023

Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation.
CoRR, 2023

Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension.
Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023

2022
Code for: Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent.
Dataset, June 2022

Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent.
J. Mach. Learn. Res., 2022

2021
Code for: Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments.
Dataset, October 2021

On the Universality of the Double Descent Peak in Ridgeless Regression.
Proceedings of the 9th International Conference on Learning Representations, 2021

2020
Muscles Reduce Neuronal Information Load: Quantification of Control Effort in Biological vs. Robotic Pointing and Walking.
Frontiers Robotics AI, 2020

2017
Improved Approximation Schemes for the Restricted Shortest Path Problem.
CoRR, 2017

Efficient Neighbor-Finding on Space-Filling Curves.
CoRR, 2017
