Lukasz Debowski

ORCID: 0000-0001-7136-5283

According to our database, Lukasz Debowski authored at least 44 papers between 2001 and 2023.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2023
Universal Densities Exist for Every Finite Reference Measure.
IEEE Trans. Inf. Theory, August, 2023

Corrections to "Universal Densities Exist for Every Finite Reference Measure".
CoRR, 2023

Corrections of Zipf's and Heaps' Laws Derived from Hapax Rate Models.
CoRR, 2023

Recurrence and repetition times in the case of a stretched exponential growth.
CoRR, 2023

A Simplistic Model of Neural Scaling Laws: Multiperiodic Santa Fe Processes.
CoRR, 2023

2022
There Are Fewer Facts Than Words: Communication With A Growing Complexity.
CoRR, 2022

Local Grammar-Based Coding Revisited.
CoRR, 2022

Universal Coding and Prediction on Ergodic Random Points.
Bull. Symb. Log., 2022

2021
A Refutation of Finite-State Language Models through Zipf's Law for Factual Knowledge.
Entropy, 2021

2020
Information Theory and Language.
Entropy, 2020

Approximating Information Measures for Fields.
Entropy, 2020

Bounds for Algorithmic Mutual Information and a Unifilar Order Estimator.
CoRR, 2020

Universal Coding and Prediction on Martin-Löf Random Points.
CoRR, 2020

On a Class of Markov Order Estimators Based on PPM and Other Universal Codes.
CoRR, 2020

2018
Maximal Repetition and Zero Entropy Rate.
IEEE Trans. Inf. Theory, 2018

Is Natural Language a Perigraphic Process? The Theorem about Facts and Words Revisited.
Entropy, 2018

2017
Regular Hilberg Processes: An Example of Processes With a Vanishing Entropy Rate.
IEEE Trans. Inf. Theory, 2017

Is Natural Language Strongly Nonergodic? A Stronger Theorem about Facts and Words.
CoRR, 2017

2016
Estimation of Entropy from Subword Complexity.
Proceedings of the Challenges in Computational Statistics and Data Mining, 2016

Entropy Rate Estimates for Natural Language - A New Extrapolation of Compressed Large-Scale Corpora.
Entropy, 2016

Consistency of the plug-in estimator of the entropy rate for ergodic processes.
Proceedings of the IEEE International Symposium on Information Theory, 2016

Upper Bound of Entropy Rate Revisited - A New Extrapolation of Compressed Large-Scale Corpora.
Proceedings of the Workshop on Computational Linguistics for Linguistic Complexity, 2016

2015
Hilberg Exponents: New Measures of Long Memory in the Process.
IEEE Trans. Inf. Theory, 2015

A Preadapted Universal Switch Distribution for Testing Hilberg's Conjecture.
IEEE Trans. Inf. Theory, 2015

The Relaxed Hilberg Conjecture: A Review and New Experimental Support.
J. Quant. Linguistics, 2015

Maximal Repetitions in Written Texts: Finite Energy Hypothesis vs. Strong Hilberg Conjecture.
Entropy, 2015

A New Universal Code Helps to Distinguish Natural Language from Random Texts.
Proceedings of the Recent Contributions to Quantitative Linguistics, 2015

2013
Constant conditional entropy and related hypotheses.
CoRR, 2013

Constant entropy rate and related hypotheses versus real language.
Proceedings of the 35th Annual Meeting of the Cognitive Science Society, 2013

2012
Mixing, Ergodic, and Nonergodic Processes With Rapidly Growing Information Between Blocks.
IEEE Trans. Inf. Theory, 2012

On Hidden Markov Processes with Infinite Excess Entropy.
CoRR, 2012

2011
On the Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts.
IEEE Trans. Inf. Theory, 2011

2010
A link between the number of set phrases in a text and the number of described facts.
Proceedings of the Text and Language. Structures - Functions - Interrelations, 2010

2009
Valence extraction using EM selection and co-occurrence matrices.
Lang. Resour. Evaluation, 2009

The Redundancy of a Computable Code on a Noncomputable Distribution.
CoRR, 2009

Computable Bayesian Compression for Uniformly Discretizable Statistical Models.
Proceedings of the Algorithmic Learning Theory, 20th International Conference, 2009

2007
On vocabulary size of grammar-based codes.
Proceedings of the IEEE International Symposium on Information Theory, 2007

Menzerath's law for the smallest grammars.
Proceedings of the Exact Methods in the Study of Language and Text, 2007

2006
On Hilberg's law and its links with Guiraud's law.
J. Quant. Linguistics, 2006

2004
A Search Tool for Corpora with Positional Tagsets and Ambiguities.
Proceedings of the Fourth International Conference on Language Resources and Evaluation, 2004

Trigram morphosyntactic tagger for Polish.
Proceedings of the Intelligent Information Processing and Web Mining, 2004

2002
Testing the Limits - Adding a New Language to an MT System.
Prague Bull. Math. Linguistics, 2002

Zipf's law against the text size: a half-rational model.
Glottometrics, 2002

2001
A Revision of Coding Theory for Learning from Language.
Proceedings of the joint meeting of the 6th Conference on Formal Grammar (FG) and the 7th Conference on Mathematics of Language (MOL), 2001
