Tsachy Weissman

According to our database, Tsachy Weissman authored at least 228 papers between 1999 and 2020.

Awards

IEEE Fellow (2013), "For contributions to information theory and its application in signal processing".

Bibliography

2020
Learning to Bid Optimally and Efficiently in Adversarial First-price Auctions.
CoRR, 2020

Optimal No-regret Learning in Repeated First-price Auctions.
CoRR, 2020

Overcoming High Nanopore Basecaller Error Rates for DNA Storage via Basecaller-Decoder Integration and Convolutional Codes.
Proceedings of the 2020 IEEE International Conference on Acoustics, Speech and Signal Processing, 2020

LFZip: Lossy Compression of Multivariate Floating-Point Time Series Data via Improved Prediction.
Proceedings of the Data Compression Conference, 2020

2019
Estimating the Fundamental Limits is Easier Than Achieving the Fundamental Limits.
IEEE Trans. Inf. Theory, 2019

Approximate Profile Maximum Likelihood.
J. Mach. Learn. Res., 2019

Minimum Power to Maintain a Nonequilibrium Distribution of a Markov Chain.
CoRR, 2019

Optimal Communication Rates for Zero-Error Distributed Simulation under Blackboard Communication Protocols.
CoRR, 2019

SPRING: a next-generation compressor for FASTQ data.
Bioinform., 2019

Neural Joint Source-Channel Coding.
Proceedings of the 36th International Conference on Machine Learning, 2019

Humans are Still the Best Lossy Image Compressors.
Proceedings of the Data Compression Conference, 2019

Improved read/write cost tradeoff in DNA-based data storage using LDPC codes.
Proceedings of the 57th Annual Allerton Conference on Communication, Control, and Computing, 2019

2018
Mutual Information, Relative Entropy and Estimation Error in Semi-Martingale Channels.
IEEE Trans. Inf. Theory, 2018

Minimax Estimation of the L1 Distance.
IEEE Trans. Inf. Theory, 2018

NECST: Neural Joint Source-Channel Coding.
CoRR, 2018

Concentration Inequalities for the Empirical Distribution.
CoRR, 2018

Compression of genomic sequencing reads via hash-based reordering: algorithm and analysis.
Bioinform., 2018

Entropy Rate Estimation for Markov Chains with Large State Space.
Advances in Neural Information Processing Systems 31 (NeurIPS 2018), 2018

Minimax Redundancy for Markov Chains with Large State Space.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

On Universal Compression with Constant Random Access.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

Distributed Statistical Estimation of High-Dimensional and Nonparametric Distributions.
Proceedings of the 2018 IEEE International Symposium on Information Theory, 2018

Geometric Lower Bounds for Distributed Parameter Estimation under Communication Constraints.
Proceedings of the Conference On Learning Theory, 2018

Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance.
Proceedings of the Conference On Learning Theory, 2018

2017
When is Noisy State Information at the Encoder as Useless as No Information or as Good as Noise-Free State?
IEEE Trans. Inf. Theory, 2017

Relations Between Information and Estimation in Discrete-Time Lévy Channels.
IEEE Trans. Inf. Theory, 2017

Maximum Likelihood Estimation of Functionals of Discrete Distributions.
IEEE Trans. Inf. Theory, 2017

Principles and Applications of Science of Information.
Proc. IEEE, 2017

Optimal rates of entropy estimation over Lipschitz balls.
CoRR, 2017

On Estimation of L_r-Norms in Gaussian White Noise Models.
CoRR, 2017

Bias Correction with Jackknife, Bootstrap, and Taylor Series.
CoRR, 2017

Minimax Estimation of the L1 Distance.
CoRR, 2017

Effect of lossy compression of quality scores on variant calling.
Briefings Bioinform., 2017

Dependence measures bounding the exploration bias for general measurements.
Proceedings of the 2017 IEEE International Symposium on Information Theory, 2017

Compressing Tabular Data via Pairwise Dependencies.
Proceedings of the 2017 Data Compression Conference, 2017

GeneComp, a New Reference-Based Compressor for SAM Files.
Proceedings of the 2017 Data Compression Conference, 2017

2016
Information, Estimation, and Lookahead in the Gaussian Channel.
IEEE Trans. Signal Process., 2016

Compression for Quadratic Similarity Queries: Finite Blocklength and Practical Schemes.
IEEE Trans. Inf. Theory, 2016

Rateless Lossy Compression via the Extremes.
IEEE Trans. Inf. Theory, 2016

Strong Successive Refinability and Rate-Distortion-Complexity Tradeoff.
IEEE Trans. Inf. Theory, 2016

Secure Source Coding With a Public Helper.
IEEE Trans. Inf. Theory, 2016

Distortion Rate Function of Sub-Nyquist Sampled Gaussian Sources.
IEEE Trans. Inf. Theory, 2016

Demystifying ResNet.
CoRR, 2016

Minimax Estimation of KL Divergence between Discrete Distributions.
CoRR, 2016

smallWig: parallel compression of RNA-seq WIG files.
Bioinform., 2016

GTRAC: fast retrieval from compressed collections of genomic variants.
Bioinform., 2016

Comment on: 'ERGC: an efficient referential genome compression algorithm'.
Bioinform., 2016

CROMqs: an infinitesimal successive refinement lossy compressor for the quality scores.
Proceedings of the 2016 IEEE Information Theory Workshop, 2016

Minimax rate-optimal estimation of KL divergence between discrete distributions.
Proceedings of the 2016 International Symposium on Information Theory and Its Applications, 2016

Chained Kullback-Leibler divergences.
Proceedings of the IEEE International Symposium on Information Theory, 2016

Minimax estimation of the L1 distance.
Proceedings of the IEEE International Symposium on Information Theory, 2016

Denoising of Quality Scores for Boosted Inference and Reduced Storage.
Proceedings of the 2016 Data Compression Conference, 2016

A Cluster-Based Approach to Compression of Quality Scores.
Proceedings of the 2016 Data Compression Conference, 2016

Beyond maximum likelihood: Boosting the Chow-Liu algorithm for large alphabets.
Proceedings of the 50th Asilomar Conference on Signals, Systems and Computers, 2016

2015
Minimax Estimation of Functionals of Discrete Distributions.
IEEE Trans. Inf. Theory, 2015

Justification of Logarithmic Loss via the Benefit of Side Information.
IEEE Trans. Inf. Theory, 2015

Compression for Quadratic Similarity Queries.
IEEE Trans. Inf. Theory, 2015

Minimax Estimation of Discrete Distributions Under ℓ1 Loss.
IEEE Trans. Inf. Theory, 2015

Comparison of the Achievable Rates in OFDM and Single Carrier Modulation with I.I.D. Inputs.
IEEE Trans. Inf. Theory, 2015

Network Compression: Worst Case Analysis.
IEEE Trans. Inf. Theory, 2015

DUDE-Seq: Fast Universal Denoising of Nucleotide Sequences.
CoRR, 2015

iDoComp: a compression scheme for assembled genomes.
Bioinform., 2015

QVZ: lossy compression of quality values.
Bioinform., 2015

Universality of logarithmic loss in lossy compression.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Minimax estimation of information measures.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Maximum Likelihood Estimation of information measures.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Minimax estimation of discrete distributions.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Adaptive estimation of Shannon entropy.
Proceedings of the IEEE International Symposium on Information Theory, 2015

Does Dirichlet prior smoothing solve the Shannon entropy estimation problem?
Proceedings of the IEEE International Symposium on Information Theory, 2015

Compression for Similarity Identification: Computing the Error Exponent.
Proceedings of the 2015 Data Compression Conference, 2015

2014
Compression With Actions.
IEEE Trans. Inf. Theory, 2014

Capacity of a POST Channel With and Without Feedback.
IEEE Trans. Inf. Theory, 2014

Minimax Filtering Regret via Relations Between Information and Estimation.
IEEE Trans. Inf. Theory, 2014

The Porosity of Additive Noise Channels.
IEEE Trans. Inf. Theory, 2014

Information Measures: The Curious Case of the Binary Alphabet.
IEEE Trans. Inf. Theory, 2014

Multiterminal Source Coding Under Logarithmic Loss.
IEEE Trans. Inf. Theory, 2014

To Feed or Not to Feedback.
IEEE Trans. Inf. Theory, 2014

Aligned genomic data compression via improved modeling.
J. Bioinform. Comput. Biol., 2014

Maximum Likelihood Estimation of Functionals of Discrete Distributions.
CoRR, 2014

Order-Optimal Estimation of Functionals of Discrete Distributions.
CoRR, 2014

Beyond Maximum Likelihood: from Theory to Practice.
CoRR, 2014

Strong successive refinability: Sufficient conditions.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

Relations between information and estimation in scalar Lévy channels.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

Information divergences and the curious case of the binary alphabet.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

Compression for similarity identification: Fundamental limits.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

Compression for quadratic similarity queries via shape-gain quantizers.
Proceedings of the 2014 IEEE International Symposium on Information Theory, 2014

Compression Schemes for Similarity Queries.
Proceedings of the Data Compression Conference, 2014

2013
Directed Information, Causal Estimation, and Communication in Continuous Time.
IEEE Trans. Inf. Theory, 2013

Achievable Error Exponents in the Gaussian Channel With Rate-Limited Feedback.
IEEE Trans. Inf. Theory, 2013

Universal Estimation of Directed Information.
IEEE Trans. Inf. Theory, 2013

Estimation With a Helper Who Knows the Interference.
IEEE Trans. Inf. Theory, 2013

Multiterminal Source Coding With Action-Dependent Side Information.
IEEE Trans. Inf. Theory, 2013

Real-Time Coding With Limited Lookahead.
IEEE Trans. Inf. Theory, 2013

Successive Refinement With Decoder Cooperation and Its Channel Coding Duals.
IEEE Trans. Inf. Theory, 2013

The Minimal Compression Rate for Similarity Identification.
CoRR, 2013

QualComp: a new lossy compressor for quality scores based on rate distortion theory.
BMC Bioinform., 2013

The human genome contracts again.
Bioinform., 2013

Operational extremality of Gaussianity in network compression, communication, and coding.
Proceedings of the 2013 IEEE Information Theory Workshop, 2013

The role of lookahead in estimation under Gaussian noise.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Unsupervised learning and universal communication.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Pointwise relations between information and estimation in the Poisson channel.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Compression for exact match identification.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Reliable uncoded communication in the SIMO MAC via low-complexity decoding.
Proceedings of the 2013 IEEE International Symposium on Information Theory, 2013

Quadratic Similarity Queries on Compressed Data.
Proceedings of the 2013 Data Compression Conference, 2013

Efficient similarity queries via lossy compression.
Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, 2013

Complexity and rate-distortion tradeoff via successive refinement.
Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, 2013

Distortion rate function of sub-Nyquist sampled Gaussian sources corrupted by noise.
Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, 2013

Reliable uncoded communication in the underdetermined SIMO MAC with low-complexity decoding.
Proceedings of the 51st Annual Allerton Conference on Communication, Control, and Computing, 2013

2012
Denoising via MCMC-Based Lossy Compression.
IEEE Trans. Signal Process., 2012

An MCMC Approach to Universal Lossy Compression of Analog Sources.
IEEE Trans. Signal Process., 2012

Pointwise Relations Between Information and Estimation in Gaussian Noise.
IEEE Trans. Inf. Theory, 2012

Cascade and Triangular Source Coding With Side Information at the First Two Nodes.
IEEE Trans. Inf. Theory, 2012

Lossy Compression of Discrete Sources via the Viterbi Algorithm.
IEEE Trans. Inf. Theory, 2012

Cascade, Triangular, and Two-Way Source Coding With Degraded Side Information at the Second User.
IEEE Trans. Inf. Theory, 2012

Mutual Information, Relative Entropy, and Estimation in the Poisson Channel.
IEEE Trans. Inf. Theory, 2012

Block and Sliding-Block Lossy Compression via MCMC.
IEEE Trans. Commun., 2012

Achievable complexity-performance tradeoffs in lossy compression.
Probl. Inf. Transm., 2012

Lossy Compression of Quality Values via Rate Distortion Theory
CoRR, 2012

Worst-case source for distributed compression with quadratic distortion.
Proceedings of the 2012 IEEE Information Theory Workshop, 2012

Reference based genome compression.
Proceedings of the 2012 IEEE Information Theory Workshop, 2012

The degraded broadcast channel with action-dependent states.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

Joint source-channel coding of one random variable over the Poisson channel.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

The porosity of additive noise sequences.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

Universal estimation of directed information via sequential probability assignments.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

Successive refinement with cribbing decoders and its channel coding duals.
Proceedings of the 2012 IEEE International Symposium on Information Theory, 2012

On information, estimation and lookahead.
Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing, 2012

Uncoded transmission in MAC channels achieves arbitrarily small error probability.
Proceedings of the 50th Annual Allerton Conference on Communication, Control, and Computing, 2012

2011
Source Coding With a Side Information "Vending Machine".
IEEE Trans. Inf. Theory, 2011

Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing.
IEEE Trans. Inf. Theory, 2011

Error Exponents for the Gaussian Channel With Active Noisy Feedback.
IEEE Trans. Inf. Theory, 2011

Probing Capacity.
IEEE Trans. Inf. Theory, 2011

Pointwise Relations between Information and Estimation
CoRR, 2011

Continuous-time directed information and its role in communication.
Proceedings of the 2011 IEEE Information Theory Workshop, 2011

Discrete denoising of heterogeneous two-dimensional data.
Proceedings of the 2011 IEEE International Symposium on Information Theory, 2011

Cascade and Triangular source coding with causal side information.
Proceedings of the 2011 IEEE International Symposium on Information Theory, 2011

Multi-terminal source coding with action dependent side information.
Proceedings of the 2011 IEEE International Symposium on Information Theory, 2011

To feed or not to feed back.
Proceedings of the 2011 IEEE International Symposium on Information Theory, 2011

2010
Capacity of Channels With Action-Dependent States.
IEEE Trans. Inf. Theory, 2010

The relationship between causal and noncausal mismatched estimation in continuous-time AWGN channels.
IEEE Trans. Inf. Theory, 2010

Two-way source coding with a helper.
IEEE Trans. Inf. Theory, 2010

A universal scheme for Wyner-Ziv coding of discrete sources.
IEEE Trans. Inf. Theory, 2010

Universal reinforcement learning.
IEEE Trans. Inf. Theory, 2010

Tighter bounds on the capacity of finite-state channels via Markov set-chains.
IEEE Trans. Inf. Theory, 2010

Discrete denoising of heterogenous two-dimensional data
CoRR, 2010

Universal estimation of directed information.
Proceedings of the IEEE International Symposium on Information Theory, 2010

Universal lossless compression-based denoising.
Proceedings of the IEEE International Symposium on Information Theory, 2010

An MCMC Approach to Lossy Compression of Continuous Sources.
Proceedings of the 2010 Data Compression Conference (DCC 2010), 2010

2009
A context quantization approach to universal denoising.
IEEE Trans. Signal Process., 2009

Universal FIR MMSE Filtering.
IEEE Trans. Signal Process., 2009

Finite State Channels With Time-Invariant Deterministic Feedback.
IEEE Trans. Inf. Theory, 2009

Capacity region of the finite-state multiple-access channel with and without feedback.
IEEE Trans. Inf. Theory, 2009

Discrete denoising with shifts.
IEEE Trans. Inf. Theory, 2009

Multiple Description Coding of Discrete Ergodic Sources
CoRR, 2009

Where is the action in information theory?
Proceedings of the 7th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks, 2009

Problems we can solve with a helper.
Proceedings of the 2009 IEEE Information Theory Workshop, 2009

An iterative scheme for near optimal and universal lossy compression.
Proceedings of the 2009 IEEE Information Theory Workshop, 2009

Two-way source coding with a common helper.
Proceedings of the IEEE International Symposium on Information Theory, 2009

Source coding with a side information 'vending machine' at the decoder.
Proceedings of the IEEE International Symposium on Information Theory, 2009

Directed information and causal estimation in continuous time.
Proceedings of the IEEE International Symposium on Information Theory, 2009

An outer bound for side-information scalable source coding with partially cooperating decoders.
Proceedings of the IEEE International Symposium on Information Theory, 2009

An Implementable Scheme for Universal Lossy Compression of Discrete Markov Sources.
Proceedings of the 2009 Data Compression Conference (DCC 2009), 2009

2008
How to Filter an "Individual Sequence With Feedback".
IEEE Trans. Inf. Theory, 2008

The Information Lost in Erasures.
IEEE Trans. Inf. Theory, 2008

Universal Denoising of Discrete-Time Continuous-Amplitude Signals.
IEEE Trans. Inf. Theory, 2008

Capacity of the Trapdoor Channel With Feedback.
IEEE Trans. Inf. Theory, 2008

Universal Filtering Via Hidden Markov Modeling.
IEEE Trans. Inf. Theory, 2008

Coding for Additive White Noise Channels With Feedback Corrupted by Quantization or Bounded Noise.
IEEE Trans. Inf. Theory, 2008

Scanning and Sequential Decision Making for Multidimensional Data - Part II: The Noisy Case.
IEEE Trans. Inf. Theory, 2008

Rate-Distortion with a Limited-Rate Helper to the Encoder and Decoder
CoRR, 2008

New bounds for the capacity region of the Finite-State Multiple Access Channel.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

On directed information and gambling.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

Rate-distortion via Markov chain Monte Carlo.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

Rate-distortion in near-linear time.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

On the capacity of finite-state channels.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

On successive refinement for the Wyner-Ziv problem with partially cooperating decoders.
Proceedings of the 2008 IEEE International Symposium on Information Theory, 2008

Near optimal lossy source coding and compression-based denoising via Markov chain Monte Carlo.
Proceedings of the 42nd Annual Conference on Information Sciences and Systems, 2008

2007
Universal Filtering Via Prediction.
IEEE Trans. Inf. Theory, 2007

Denoising and Filtering Under the Probability of Excess Loss Criterion.
IEEE Trans. Inf. Theory, 2007

Scanning and Sequential Decision Making for Multidimensional Data-Part I: The Noiseless Case.
IEEE Trans. Inf. Theory, 2007

Capacity Region of the Finite-State Multiple Access Channel with and without Feedback
CoRR, 2007

Scanning and Sequential Decision Making for Multi-Dimensional Data - Part II: the Noisy Case
CoRR, 2007

Capacity and Zero-Error Capacity of the Chemical Channel with Feedback.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Competitive On-line Linear FIR MMSE Filtering.
Proceedings of the IEEE International Symposium on Information Theory, 2007

New Bounds on the Rate-Distortion Function of a Binary Markov Source.
Proceedings of the IEEE International Symposium on Information Theory, 2007

A Universal Wyner-Ziv Scheme for Discrete Sources.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Scanning, Filtering and Prediction for Random Fields Corrupted by Gaussian Noise.
Proceedings of the IEEE International Symposium on Information Theory, 2007

The Gaussian Channel with Noisy Feedback.
Proceedings of the IEEE International Symposium on Information Theory, 2007

2006
Algorithms for discrete denoising under channel uncertainty.
IEEE Trans. Signal Process., 2006

Source Coding With Limited-Look-Ahead Side Information at the Decoder.
IEEE Trans. Inf. Theory, 2006

On the optimality of symbol-by-symbol filtering and denoising.
IEEE Trans. Inf. Theory, 2006

Coding for the Feedback Gel'fand-Pinsker Channel and the Feedforward Wyner-Ziv Source.
IEEE Trans. Inf. Theory, 2006

Universal Zero-Delay Joint Source-Channel Coding.
IEEE Trans. Inf. Theory, 2006

On the Entropy Rate of Pattern Processes.
IEEE Trans. Inf. Theory, 2006

Universal Minimax Discrete Denoising Under Channel Uncertainty.
IEEE Trans. Inf. Theory, 2006

Coding for Additive White Noise Channels with Feedback Corrupted by Uniform Quantization or Bounded Noise
CoRR, 2006

Scanning and Sequential Decision Making for Multi-Dimensional Data - Part I: the Noiseless Case
CoRR, 2006

Compound Sequential Decisions Against the Well-Informed Antagonist.
Proceedings of the 2006 IEEE Information Theory Workshop, 2006

Erasure Entropy.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

Capacity of Finite-State Channels with Time-Invariant Deterministic Feedback.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

Source Coding with Limited Side Information Lookahead at the Decoder.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

Universal Scanning and Sequential Decision Making for Multidimensional Data.
Proceedings of the 2006 IEEE International Symposium on Information Theory, 2006

Universal Denoising of Continuous Amplitude Signals with Applications to Images.
Proceedings of the International Conference on Image Processing, 2006

2005
Universal discrete denoising: known channel.
IEEE Trans. Inf. Theory, 2005

The empirical distribution of rate-constrained source codes.
IEEE Trans. Inf. Theory, 2005

On causal source codes with side information.
IEEE Trans. Inf. Theory, 2005

Universal denoising for the finite-input general-output channel.
IEEE Trans. Inf. Theory, 2005

Discrete denoising for channels with memory.
Commun. Inf. Syst., 2005

Multi-directional context sets with applications to universal denoising and compression.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

Approximations for the entropy rate of a hidden Markov process.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

Discrete universal filtering via hidden Markov modelling.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

On the relationship between process and pattern entropy rate.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

A universal scheme for learning.
Proceedings of the 2005 IEEE International Symposium on Information Theory, 2005

2004
Universally Attainable Error Exponents for Rate-Distortion Coding of Noisy Sources.
IEEE Trans. Inf. Theory, 2004

Efficient pruning of bi-directional context trees with applications to universal denoising and compression.
Proceedings of the 2004 IEEE Information Theory Workshop, 2004

New bounds on the entropy rate of hidden Markov processes.
Proceedings of the 2004 IEEE Information Theory Workshop, 2004

Channel decoding of systematically encoded unknown redundant sources.
Proceedings of the 2004 IEEE International Symposium on Information Theory, 2004

Universal minimax binary image denoising under channel uncertainty.
Proceedings of the 2004 International Conference on Image Processing, 2004

Discrete Universal Filtering Through Incremental Parsing.
Proceedings of the 2004 Data Compression Conference (DCC 2004), 2004

2003
On competitive prediction and its relation to rate-distortion theory.
IEEE Trans. Inf. Theory, 2003

Scanning and prediction in multidimensional data arrays.
IEEE Trans. Inf. Theory, 2003

The minimax distortion redundancy in noisy source coding.
IEEE Trans. Inf. Theory, 2003

A discrete universal denoiser and its application to binary images.
Proceedings of the 2003 International Conference on Image Processing, 2003

2002
On limited-delay lossy coding and filtering of individual sequences.
IEEE Trans. Inf. Theory, 2002

Tradeoffs between the excess-code-length exponent and the excess-distortion exponent in lossy source coding.
IEEE Trans. Inf. Theory, 2002

Universal discrete denoising.
Proceedings of the 2002 IEEE Information Theory Workshop, 2002

2001
Twofold universal prediction schemes for achieving the finite-state predictability of a noisy individual binary sequence.
IEEE Trans. Inf. Theory, 2001

Universal prediction of individual binary sequences in the presence of noise.
IEEE Trans. Inf. Theory, 2001

1999
On Prediction of Individual Sequences Relative to a Set of Experts in the Presence of Noise.
Proceedings of the Twelfth Annual Conference on Computational Learning Theory, 1999

