Marten van Schijndel

ORCID: 0000-0002-9858-5881

Affiliations:
  • Cornell University, Ithaca, NY, USA


According to our database, Marten van Schijndel authored at least 29 papers between 2012 and 2023.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2023
Linguistic Compression in Single-Sentence Human-Written Summaries.
Findings of the Association for Computational Linguistics: EMNLP 2023, 2023

2022
Dual Mechanism Priming Effects in Hindi Word Order.
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing, 2022

Discourse Context Predictability Effects in Hindi Word Order.
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 2022

2021
Single-Stage Prediction Models Do Not Explain the Magnitude of Syntactic Disambiguation Difficulty.
Cogn. Sci., 2021

All Bark and No Bite: Rogue Dimensions in Transformer Language Models Obscure Representational Quality.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021

To Point or Not to Point: Understanding How Abstractive Summarizers Paraphrase Text.
Findings of the Association for Computational Linguistics: ACL/IJCNLP 2021, 2021

Uncovering Constraint-Based Behavior in Neural Models via Targeted Fine-Tuning.
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021

2020
Discourse structure interacts with reference but not syntax in neural language models.
Proceedings of the 24th Conference on Computational Natural Language Learning, 2020

Filler-gaps that neural networks fail to generalize.
Proceedings of the 24th Conference on Computational Natural Language Learning, 2020

Interaction with Context During Recurrent Neural Network Sentence Processing.
Proceedings of the 42nd Annual Meeting of the Cognitive Science Society, 2020

Recurrent Neural Network Language Models Always Learn English-Like Relative Clause Attachment.
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020

2019
Quantity doesn't buy quality syntax with neural language models.
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, 2019

Using Priming to Uncover the Organization of Syntactic Representations in Neural Language Models.
Proceedings of the 23rd Conference on Computational Natural Language Learning, 2019

2018
Can Entropy Explain Successor Surprisal Effects in Reading?
CoRR, 2018

A Neural Model of Adaptation in Reading.
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018

Modeling garden path effects without explicit hierarchical syntax.
Proceedings of the 40th Annual Meeting of the Cognitive Science Society, 2018

2017
Approximations of Predictive Entropy Correlate with Reading Times.
Proceedings of the 39th Annual Meeting of the Cognitive Science Society, 2017

2016
Memory access during incremental sentence processing causes reading time latency.
Proceedings of the Workshop on Computational Linguistics for Linguistic Complexity, 2016

Addressing surprisal deficiencies in reading time models.
Proceedings of the Workshop on Computational Linguistics for Linguistic Complexity, 2016

2015
AZMAT: Sentence Similarity Using Associative Matrices.
Proceedings of the 9th International Workshop on Semantic Evaluation, 2015

Hierarchic syntax improves reading time prediction.
Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2015

Evidence of syntactic working memory usage in MEG data.
Proceedings of the 6th Workshop on Cognitive Modeling and Computational Linguistics, 2015

2014
Frequency effects in the processing of unbounded dependencies.
Proceedings of the 36th Annual Meeting of the Cognitive Science Society, 2014

Bootstrapping into Filler-Gap: An Acquisition Story.
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics, 2014

2013
A Model of Language Processing as Hierarchic Sequential Prediction.
Top. Cogn. Sci., 2013

An Analysis of Frequency- and Memory-Based Processing Costs.
Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2013

An Analysis of Memory-based Processing Costs using Incremental Deep Syntactic Dependency Parsing.
Proceedings of the Fourth Annual Workshop on Cognitive Modeling and Computational Linguistics, 2013

2012
Accurate Unbounded Dependency Recovery using Generalized Categorial Grammars.
Proceedings of COLING 2012, 2012

Connectionist-Inspired Incremental PCFG Parsing.
Proceedings of the 3rd Workshop on Cognitive Modeling and Computational Linguistics, 2012
