Ollie Sayeed
Department of Linguistics
University of Pennsylvania
Why most languages aren’t chaotic evil
Grammarians in the classical tradition (e.g. Priscian, c. 500 CE) thought of grammar as a set of analogical relationships between whole words organized into paradigms. More recent work in the framework of Word and Paradigm Morphology (e.g. Blevins 2016) has tried to describe constraints on possible paradigms and explain where they come from. In particular, Ackerman and Malouf (2013) propose that paradigms cross-linguistically tend to have low conditional entropy: a language learner who hears a single inflected form is left in a relatively low state of uncertainty about the rest of the paradigm. Languages with high conditional entropy should be harder to learn and more diachronically unstable. This experiment tests the Low Conditional Entropy Conjecture in a variant of an iterated learning setting, asking whether adult language learners in the lab are likely to mislearn high-entropy paradigms in the direction of lowering their entropy.
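To make the notion of paradigm conditional entropy concrete, here is a minimal sketch of how it can be computed for a toy paradigm. The inflection classes and exponents below are invented for illustration (not from the talk's materials), and all classes are assumed equiprobable, a common simplifying assumption; H(cell2 | cell1) measures how uncertain a learner remains about the second cell after hearing the first.

```python
from collections import defaultdict
from math import log2

# Hypothetical inflection classes (rows) realizing two paradigm cells
# (e.g. singular and plural). Invented exponents, for illustration only.
classes = [
    ("-a", "-i"),   # class 1
    ("-a", "-e"),   # class 2
    ("-o", "-i"),   # class 3
    ("-o", "-i"),   # class 4 (same exponents as class 3)
]

def conditional_entropy(pairs):
    """H(cell2 | cell1) in bits, assuming equiprobable classes."""
    n = len(pairs)
    by_first = defaultdict(list)
    for a, b in pairs:
        by_first[a].append(b)
    h = 0.0
    for a, bs in by_first.items():
        p_a = len(bs) / n
        # entropy over cell2 exponents given this cell1 exponent
        counts = defaultdict(int)
        for b in bs:
            counts[b] += 1
        h_given_a = -sum((c / len(bs)) * log2(c / len(bs))
                         for c in counts.values())
        h += p_a * h_given_a
    return h

print(conditional_entropy(classes))  # 0.5 bits: "-o" fully predicts "-i"
```

Here hearing "-o" leaves no uncertainty about the other cell, while "-a" leaves one bit, giving 0.5 bits on average; a "low-entropy" language in Ackerman and Malouf's sense is one where such averages stay small across all pairs of cells.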
Snacks provided!