
Analyzing Wrap-Up Effects through an Information-Theoretic Lens

Clara Meister, Tiago Pimentel, Thomas Clark, Ryan Cotterell, Roger Levy


Abstract
Numerous analyses of reading time (RT) data have been undertaken in the effort to learn more about the internal processes that occur during reading comprehension. However, data measured on words at the end of a sentence (or even clause) is often omitted due to the confounding factors introduced by so-called “wrap-up effects,” which manifest as a skewed distribution of RTs for these words. Consequently, the understanding of the cognitive processes that might be involved in these effects is limited. In this work, we attempt to learn more about these processes by looking for the existence (or absence) of a link between wrap-up effects and information-theoretic quantities, such as word and context information content. We find that the information distribution of prior context is often predictive of sentence- and clause-final RTs (while not of sentence-medial RTs), which lends support to several prior hypotheses about the processes involved in wrap-up effects.
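
As background for the information-theoretic quantities mentioned in the abstract: a word's information content (surprisal) is standardly estimated as the negative log-probability a language model assigns to the word given its preceding context. The sketch below is not the authors' code; the choice of GPT-2 and the token-level (rather than word-level) granularity are illustrative assumptions, shown only to make the quantity concrete.

# Minimal sketch: per-token surprisal under a language model (illustrative only).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_surprisals(text):
    """Return (token, surprisal in nats) for every token after the first."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits          # (1, seq_len, vocab_size)
    log_probs = torch.log_softmax(logits, dim=-1)
    results = []
    for t in range(1, ids.size(1)):
        # logits at position t-1 give the model's prediction for token t
        lp = log_probs[0, t - 1, ids[0, t]]
        results.append((tokenizer.decode([ids[0, t].item()]), -lp.item()))
    return results

print(token_surprisals("The cat sat on the mat."))

The paper then asks whether the distribution of such values over the prior sentence or clause context helps predict sentence- and clause-final reading times.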
Anthology ID:
2022.acl-short.3
Original:
2022.acl-short.3v1
Version 2:
2022.acl-short.3v2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
20–28
URL:
https://aclanthology.org/2022.acl-short.3
DOI:
10.18653/v1/2022.acl-short.3
Cite (ACL):
Clara Meister, Tiago Pimentel, Thomas Clark, Ryan Cotterell, and Roger Levy. 2022. Analyzing Wrap-Up Effects through an Information-Theoretic Lens. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 20–28, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Analyzing Wrap-Up Effects through an Information-Theoretic Lens (Meister et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.3.pdf
Video:
https://aclanthology.org/2022.acl-short.3.mp4