David J. Schwab

Affiliations:
  • Northwestern University, Evanston, Department of Physics


According to our database, David J. Schwab authored at least 25 papers between 2014 and 2023.

Collaborative distances:
  • Dijkstra number of four.
  • Erdős number of four.

Bibliography

2023
Random-Energy Secret Sharing via Extreme Synergy.
CoRR, 2023

Generalized Information Bottleneck for Gaussian Variables.
CoRR, 2023

Don't forget the nullspace! Nullspace occupancy as a mechanism for out of distribution failure.
Proceedings of the Eleventh International Conference on Learning Representations, 2023

2022
Information bottleneck theory of high-dimensional regression: relevancy, efficiency and optimality.
Proceedings of the Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, 2022

2021
Leveraging background augmentations to encourage semantic focus in self-supervised contrastive learning.
CoRR, 2021

An Empirical Investigation of Domain Generalization with Empirical Risk Minimizers.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Perturbation Theory for the Information Bottleneck.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs.
Proceedings of the 9th International Conference on Learning Representations, 2021

2020
Nonequilibrium Statistical Mechanics of Continuous Attractors.
Neural Comput., 2020

Are all negatives created equal in contrastive instance discrimination?
CoRR, 2020

Theory of gating in recurrent neural networks.
CoRR, 2020

Learning Optimal Representations with the Decodable Information Bottleneck.
Proceedings of the Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, 2020

Gating creates slow modes and controls phase-space complexity in GRUs and LSTMs.
Proceedings of Mathematical and Scientific Machine Learning, 2020

The Early Phase of Neural Network Training.
Proceedings of the 8th International Conference on Learning Representations, 2020

2019
The Information Bottleneck and Geometric Clustering.
Neural Comput., 2019

How noise affects the Hessian spectrum in overparameterized neural networks.
CoRR, 2019

Mean-field Analysis of Batch Normalization.
CoRR, 2019

2018
A high-bias, low-variance introduction to Machine Learning for physicists.
CoRR, 2018

Learning to Share and Hide Intentions using Information Regularization.
Proceedings of the Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, 2018

2017
The Deterministic Information Bottleneck.
Neural Comput., 2017

2016
Supervised Learning with Quantum-Inspired Tensor Networks.
CoRR, 2016

Comment on "Why does deep and cheap learning work so well?" [arXiv: 1608.08225].
CoRR, 2016

Supervised Learning with Tensor Networks.
Proceedings of the Advances in Neural Information Processing Systems 29: Annual Conference on Neural Information Processing Systems 2016, 2016

2014
Quantifying the Role of Population Subdivision in Evolution on Rugged Fitness Landscapes.
PLoS Comput. Biol., 2014

An exact mapping between the Variational Renormalization Group and Deep Learning.
CoRR, 2014
