Yaniv Plan

Orcid: 0000-0002-9930-0980

According to our database, Yaniv Plan authored at least 35 papers between 2009 and 2023.

Bibliography

2023
Model-adapted Fourier sampling for generative compressed sensing.
CoRR, 2023

2022
Sparsity-Free Compressed Sensing With Applications to Generative Priors.
IEEE J. Sel. Areas Inf. Theory, September, 2022

A Coherence Parameter Characterizing Generative Compressed Sensing With Fourier Measurements.
IEEE J. Sel. Areas Inf. Theory, September, 2022

NBIHT: An Efficient Algorithm for 1-Bit Compressed Sensing With Optimal Error Decay Rate.
IEEE Trans. Inf. Theory, 2022

On the Best Choice of Lasso Program Given Data Parameters.
IEEE Trans. Inf. Theory, 2022

2021
Weighted Matrix Completion From Non-Random, Non-Uniform Sampling Patterns.
IEEE Trans. Inf. Theory, 2021

Beyond Independent Measurements: General Compressed Sensing with GNN Application.
CoRR, 2021

PLUGIn: A simple algorithm for inverting generative models with recovery guarantees.
Proceedings of the Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, 2021

2020
Near-optimal Sample Complexity Bounds for Robust Learning of Gaussian Mixtures via Compression Schemes.
J. ACM, 2020

Sub-Gaussian Matrices on Sets: Optimal Tail Dependence and Applications.
CoRR, 2020

2019
Learning Tensors From Partial Binary Measurements.
IEEE Trans. Signal Process., 2019

Tight analyses for non-smooth stochastic gradient descent.
Proceedings of the Conference on Learning Theory, 2019

2018
Optimizing Quantization for Lasso Recovery.
IEEE Signal Process. Lett., 2018

Parameter instability regimes for sparse proximal denoising programs.
CoRR, 2018

Nearly tight sample complexity bounds for learning mixtures of Gaussians via sample compression schemes.
Proceedings of the Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, 2018

2017
Exponential Decay of Reconstruction Error From Binary Measurements of Sparse Signals.
IEEE Trans. Inf. Theory, 2017

Near-optimal sample complexity for convex tensor completion.
CoRR, 2017

2016
The Generalized Lasso With Non-Linear Observations.
IEEE Trans. Inf. Theory, 2016

A simple tool for bounding the deviation of random matrices on geometric sets.
CoRR, 2016

One-Bit Compressive Sensing of Dictionary-Sparse Signals.
CoRR, 2016

Average-case hardness of RIP certification.
Proceedings of the Advances in Neural Information Processing Systems 29: Annual Conference on Neural Information Processing Systems 2016, 2016

2015
On the Effective Measure of Dimension in the Analysis Cosparse Model.
IEEE Trans. Inf. Theory, 2015

Random mappings designed for commercial search engines.
CoRR, 2015

2014
Dimension Reduction by Random Hyperplane Tessellations.
Discret. Comput. Geom., 2014

2013
Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach.
IEEE Trans. Inf. Theory, 2013

Lower bounds for quantized matrix completion.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

2012
1-Bit Matrix Completion.
CoRR, 2012

One-bit compressed sensing with non-Gaussian measurements.
CoRR, 2012

2011
A Probabilistic and RIPless Theory of Compressed Sensing.
IEEE Trans. Inf. Theory, 2011

Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements.
IEEE Trans. Inf. Theory, 2011

One-bit compressed sensing by linear programming.
CoRR, 2011

Unicity conditions for low-rank matrix recovery.
CoRR, 2011

2010
Matrix Completion With Noise.
Proc. IEEE, 2010

Tight oracle bounds for low-rank matrix recovery from a minimal number of random measurements.
CoRR, 2010

2009
Accurate low-rank matrix recovery from a small number of linear measurements.
Proceedings of the 47th Annual Allerton Conference on Communication, Control, and Computing, 2009
