An exploration of the formal learnability properties of repetition and optionality, with ongoing work on whether and how neural models generalise from limited examples of repetition and optionality.
Human languages include adjuncts, which are grammatically optional
elements. Some adjuncts can also be repeated indefinitely. I
consider four learnable classes of languages and ask whether
these classes include optional and repeated elements, and what
input a learner requires in order to generalise from finite to
indefinite repetition.
Keywords: adjunct, adjective, optionality, repetition, learnability, PDFA, 0-reversible, n-gram, substitutable context-free
In human language, not all repeatable elements are optional,
and not all optional elements are repeatable. However, this work
does not yet include learners for human-like languages; stay
tuned!
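The question of what input licenses generalisation from finite to indefinite repetition has a crisp answer for one of the language classes named in the keywords: n-gram languages. A bigram learner, for instance, accepts any string whose adjacent word pairs were all attested in training, so a single doubled adjunct in the input is enough to license unbounded repetition. The sketch below is illustrative only (the data and function names are hypothetical, not taken from the papers):

```python
def train_bigrams(corpus):
    """Collect the set of attested bigrams, using # as a start/end marker."""
    bigrams = set()
    for sentence in corpus:
        tokens = ["#"] + sentence.split() + ["#"]
        for a, b in zip(tokens, tokens[1:]):
            bigrams.add((a, b))
    return bigrams

def accepts(bigrams, sentence):
    """Accept iff every adjacent pair in the sentence was seen in training."""
    tokens = ["#"] + sentence.split() + ["#"]
    return all((a, b) in bigrams for a, b in zip(tokens, tokens[1:]))

corpus = ["the big cat", "the big big cat"]
model = train_bigrams(corpus)

# Seeing "big" doubled once yields the bigram (big, big), which now
# licenses indefinite repetition of the adjunct:
assert accepts(model, "the big big big big cat")
# But the same learner does not treat "big" as optional, because the
# bigram (the, cat) was never attested:
assert not accepts(model, "the cat")
```

This illustrates why repetition and optionality come apart for such learners: repetition generalises from one doubled token, while optionality requires independent evidence of the element's absence.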
@incollection{fowlie2014learning,
title={Learning Adjuncts},
author={Fowlie, Meaghan},
booktitle={Connectedness: papers by and for {S}arah {V}an{W}agenen},
series={UCLA Working Papers in Linguistics},
editor={Carson Sch{\"u}tze and Linnaea Stockall},
volume={18},
year={2014}
}
@phdthesis{fowlie2017slaying,
title={Slaying the Great Green Dragon: Learning and modelling iterable ordered optional adjuncts},
author={Fowlie, Meaghan},
year={2017},
school={UCLA}
}