James Melbourne

ORCID: 0000-0002-1263-0961

According to our database, James Melbourne authored at least 35 papers between 2016 and 2024.

Bibliography

2024
On a Conjecture of Feige for Discrete Log-Concave Distributions.
SIAM J. Discret. Math., March, 2024

Geometric and Functional Inequalities for Log-Concave Probability Sequences.
Discret. Comput. Geom., March, 2024

2023
Causal Structure Recovery of Linear Dynamical Systems: An FFT based Approach.
CoRR, 2023

Minimum entropy of a log-concave variable for fixed variance.
CoRR, 2023

2022
The Differential Entropy of Mixtures: New Bounds and Applications.
IEEE Trans. Inf. Theory, 2022

Concentration functions and entropy bounds for discrete log-concave distributions.
Comb. Probab. Comput., 2022

Moments, Concentration, and Entropy of Log-Concave Distributions.
CoRR, 2022

Change detection using an iterative algorithm with guarantees.
Autom., 2022

2021
Reversal of Rényi Entropy Inequalities Under Log-Concavity.
IEEE Trans. Inf. Theory, 2021

A discrete complement of Lyapunov's inequality and its information theoretic consequences.
CoRR, 2021

Quantitative form of Ball's Cube slicing in R^n and equality cases in the min-entropy power inequality.
CoRR, 2021

Bernoulli sums and Rényi entropy inequalities.
CoRR, 2021

2020
Strongly Convex Divergences.
Entropy, 2020

Entropy and Information Inequalities.
Entropy, 2020

Convex Decreasing Algorithms: Distributed Synthesis and Finite-time Termination in Higher Dimension.
CoRR, 2020

Reversals of Rényi Entropy Inequalities under Log-Concavity.
CoRR, 2020

On the Rényi Entropy of Log-Concave Sequences.
Proceedings of the IEEE International Symposium on Information Theory, 2020

2019
On the Entropy Power Inequality for the Rényi Entropy of Order [0, 1].
IEEE Trans. Inf. Theory, 2019

Rényi Entropy Power Inequalities for s-concave Densities.
Proceedings of the IEEE International Symposium on Information Theory, 2019

Entropic Central Limit Theorem for Rényi Entropy.
Proceedings of the IEEE International Symposium on Information Theory, 2019

Relationships between certain f-divergences.
Proceedings of the 57th Annual Allerton Conference on Communication, Control, and Computing, 2019

2018
Analysis of Heat Dissipation and Reliability in Information Erasure: A Gaussian Mixture Approach.
Entropy, 2018

An Exact Upper Bound on the L^p Lebesgue Constant and the ∞-Rényi Entropy Power Inequality for Integer Valued Random Variables.
CoRR, 2018

Rearrangement and Prékopa-Leindler type inequalities.
CoRR, 2018

The deficit in an entropic inequality.
CoRR, 2018

Error Bounds on a Mixed Entropy Inequality.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

A Rényi Entropy Power Inequality for Log-Concave Vectors and Parameters in [0, 1].
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

Further Investigations of the Maximum Entropy of the Sum of Two Dependent Random Variables.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

Realizing Information Erasure in Finite Time.
Proceedings of the 57th IEEE Conference on Decision and Control, 2018

Learning and Estimation of Single Molecule Behavior.
Proceedings of the 2018 Annual American Control Conference, 2018

Rearrangements and information theoretic inequalities.
Proceedings of the 56th Annual Allerton Conference on Communication, Control, and Computing, 2018

2017
Infinity-Rényi entropy power inequalities.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

A min-entropy power inequality for groups.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

2016
Forward and Reverse Entropy Power Inequalities in Convex Geometry.
CoRR, 2016

Reverse entropy power inequalities for s-concave densities.
Proceedings of the IEEE International Symposium on Information Theory, 2016
