Andreas Kirsch

ORCID: 0000-0001-8244-7700

Affiliations:
  • University of Oxford, UK


According to our database, Andreas Kirsch authored at least 22 papers between 2017 and 2024.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2024
Advancing Deep Active Learning & Data Subset Selection: Unifying Principles with Information-Theory Intuitions.
CoRR, 2024

2023
Does "Deep Learning on a Data Diet" reproduce? Overall yes, but GraNd at Initialization does not.
CoRR, 2023

Black-Box Batch Active Learning for Regression.
CoRR, 2023

Speeding Up BatchBALD: A k-BALD Family of Approximations for Active Learning.
CoRR, 2023

Deep Deterministic Uncertainty: A New Simple Baseline.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023

Prediction-Oriented Bayesian Active Learning.
Proceedings of the International Conference on Artificial Intelligence and Statistics, 2023

2022
Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities.
Trans. Mach. Learn. Res., 2022

A Note on "Assessing Generalization of SGD via Disagreement".
Trans. Mach. Learn. Res., 2022

Unifying Approaches in Data Subset Selection via Fisher Information and Information-Theoretic Quantities.
CoRR, 2022

Plex: Towards Reliability using Pretrained Large Model Extensions.
CoRR, 2022

Marginal and Joint Cross-Entropies & Predictives for Online Bayesian Inference, Active Learning, and Active Sampling.
CoRR, 2022

Prioritized Training on Points that are Learnable, Worth Learning, and not yet Learnt.
Proceedings of the International Conference on Machine Learning, 2022

2021
Prioritized training on points that are learnable, worth learning, and not yet learned.
CoRR, 2021

A Practical & Unified Notation for Information-Theoretic Quantities in ML.
CoRR, 2021

A Simple Baseline for Batch Active Learning with Stochastic Acquisition Functions.
CoRR, 2021

Active Learning under Pool Set Distribution Shift and Noisy Data.
CoRR, 2021

Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty.
CoRR, 2021

PowerEvaluationBALD: Efficient Evaluation-Oriented Deep (Bayesian) Active Learning with Stochastic Acquisition Functions.
CoRR, 2021

Causal-BALD: Deep Bayesian Active Learning of Outcomes to Infer Treatment-Effects from Observational Data.
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021

2020
Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning.
CoRR, 2020

2019
BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning.
Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019

2017
MDP environments for the OpenAI Gym.
CoRR, 2017
