Open Access. Powered by Scholars. Published by Universities.®

Computational Linguistics: Phonology
University of Massachusetts Amherst

Articles 1 - 5 of 5

Full-Text Articles in Social and Behavioral Sciences

Phonotactic Learning With Distributional Representations, Max A. Nelson Oct 2022

Doctoral Dissertations

This dissertation explores the possibility that the phonological grammar manipulates phone representations based on learned distributional class memberships rather than those based on substantive linguistic features. In doing so, this work makes three primary contributions. First, I propose three novel algorithms for learning a phonological class system from the distributional statistics of a language, all of which are based on partitioning graph representations of phone distributions. Second, I propose a new method for fitting Maximum Entropy phonotactic grammars, MaxEntGrams, which offers theoretical complexity improvements over the widely adopted approach taken by Hayes and Wilson [2008]. Third, I present a series of …
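The Maximum Entropy machinery the abstract refers to can be sketched in a few lines: a form's harmony is the weighted sum of its constraint violations, and its well-formedness score is exp(−harmony), as in Hayes and Wilson [2008]. The constraints and weights below are toy illustrations, not the dissertation's (or MaxEntGrams') actual system.

```python
import math
import re

# Toy markedness constraints as (pattern, weight) pairs -- illustrative only.
CONSTRAINTS = [
    (re.compile(r"^[ptk][ptk]"), 3.0),   # *#TT: no word-initial stop-stop cluster
    (re.compile(r"[mn][ptk]$"), 1.5),    # *NT#: no word-final nasal-stop cluster
]

def harmony(form):
    """Weighted sum of constraint violations (lower = better-formed)."""
    return sum(w * len(p.findall(form)) for p, w in CONSTRAINTS)

def maxent_score(form):
    """Unnormalized MaxEnt well-formedness: exp(-harmony)."""
    return math.exp(-harmony(form))

print(maxent_score("pat") > maxent_score("ptat"))  # cluster-initial form is penalized
```

Normalizing these scores over a candidate set yields a probability distribution; the fitting problem the dissertation addresses is finding the weights that maximize the likelihood of the observed lexicon.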


Learning Phonology With Sequence-To-Sequence Neural Networks, Brandon Prickett Jun 2021

Doctoral Dissertations

This dissertation tests sequence-to-sequence neural networks to see whether they can simulate human phonological learning and generalization in a number of artificial language experiments. These experiments and simulations are organized into three chapters: one on opaque interactions, one on computational complexity in phonology, and one on reduplication. The first chapter focuses on two biases involving interactions that have been proposed in the past: a bias for transparent patterns and a bias for patterns that maximally utilize all of the processes in a language. The second chapter looks at harmony patterns of varying complexity to see whether both Formal Language Theory …
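As a concrete picture of the kind of mapping the reduplication chapter's networks are trained on (the stimuli here are invented, not the experiments' actual items), total reduplication can be framed as string-to-string training pairs for a sequence-to-sequence learner:

```python
import random

def reduplicate(stem):
    """Total reduplication: copy the whole stem (e.g. 'badi' -> 'badibadi')."""
    return stem + stem

# Hypothetical CV inventory for generating toy stems.
CONSONANTS, VOWELS = "ptkbdgmns", "aeiou"

def random_stem(syllables=2):
    return "".join(random.choice(CONSONANTS) + random.choice(VOWELS)
                   for _ in range(syllables))

# Training pairs (input sequence, target sequence); the generalization test
# is whether the network applies the copying pattern to unseen stems.
pairs = [(s, reduplicate(s)) for s in (random_stem() for _ in range(5))]
```

The interest of the task is that copying requires tracking identity between input and output positions, which is beyond the reach of some formal pattern classes the dissertation discusses.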


Emergent Typological Effects Of Agent-Based Learning Models In Maximum Entropy Grammar, Coral Hughto Dec 2020

Doctoral Dissertations

This dissertation shows how a theory of grammatical representations and a theory of learning can be combined to generate gradient typological predictions in phonology, predicting not only which patterns are expected to exist, but also their relative frequencies: patterns which are learned more easily are predicted to be more typologically frequent than those which are more difficult. In Chapter 1 I motivate and describe the specific implementation of this methodology in this dissertation. Maximum Entropy grammar (Goldwater & Johnson 2003) is combined with two agent-based learning models, the iterated and the interactive learning model, each of which mimics a type …
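The iterated-learning setup can be caricatured in a few lines: each generation's learner fits MaxEnt weights to a teacher's sampled outputs with an error-driven update, then serves as the next generation's teacher. The two-constraint grammar, candidate set, and update rule below are illustrative assumptions, not the dissertation's actual models.

```python
import math
import random

# Two candidates with violation vectors over [Markedness, Faithfulness].
CANDIDATES = ["faithful", "repaired"]
VIOLATIONS = {"faithful": [1, 0],   # violates Markedness
              "repaired": [0, 1]}   # violates Faithfulness

def sample(weights):
    """Sample an output from the MaxEnt distribution P(c) ~ exp(-harmony(c))."""
    scores = [math.exp(-sum(w * v for w, v in zip(weights, VIOLATIONS[c])))
              for c in CANDIDATES]
    r = random.random() * sum(scores)
    for cand, score in zip(CANDIDATES, scores):
        r -= score
        if r <= 0:
            return cand
    return CANDIDATES[-1]

def learn(teacher_weights, n_data=500, rate=0.1):
    """Error-driven learner: on a mismatch, shift weights toward the datum."""
    w = [0.0, 0.0]
    for _ in range(n_data):
        datum, guess = sample(teacher_weights), sample(w)
        if guess != datum:
            for i in range(len(w)):
                w[i] += rate * (VIOLATIONS[guess][i] - VIOLATIONS[datum][i])
    return w

# Iterated learning: each learner's grammar becomes the next teacher's.
weights = [2.0, 0.0]          # generation-0 teacher prefers "repaired"
for _ in range(3):
    weights = learn(weights)
```

Running chains like this many times and tabulating the grammars they converge on is what yields the gradient typological predictions: patterns that survive transmission more reliably come out as more frequent.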


Extending Hidden Structure Learning: Features, Opacity, And Exceptions, Aleksei I. Nazarov Nov 2016

Doctoral Dissertations

This dissertation explores new perspectives in phonological hidden structure learning (inferring structure not present in the speech signal that is necessary for phonological analysis; Tesar 1998, Jarosz 2013a, Boersma and Pater 2016), and extends this type of learning towards the domain of phonological features, towards derivations in Stratal OT (Bermúdez-Otero 1999), and towards exceptionality indices in probabilistic OT. Two more specific themes also emerge: the possibility of inducing instead of pre-specifying the space of possible hidden structures, and the importance of cues in the data for triggering the use of hidden structure. In chapters 2 and 4, phonological features …
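To make the hidden-structure problem concrete (metrical feet are my toy example here; the dissertation's own cases are features, Stratal OT derivations, and exceptionality indices): an overt stress pattern is compatible with more than one structural parse, and one way for a learner to choose is to pick the parse its current grammar prefers, in the spirit of Tesar (1998)'s interpretive parsing. The constraints and violation counts below are invented for illustration.

```python
# Overt form: three syllables with medial stress -- foot structure is hidden.
OVERT = "σ ˈσ σ"

# Candidate parses consistent with the overt form, with toy violation counts
# for two foot-form constraints.
PARSES = {
    "(σ ˈσ) σ": {"Iamb": 0, "Trochee": 1},   # iambic foot + unparsed syllable
    "σ (ˈσ σ)": {"Iamb": 1, "Trochee": 0},   # trochaic foot
}

def best_parse(weights):
    """Interpretive parsing: the compatible parse with the lowest harmony."""
    return min(PARSES, key=lambda p: sum(weights[c] * v
                                         for c, v in PARSES[p].items()))

# A grammar that penalizes Iamb violations heavily infers the iambic parse.
print(best_parse({"Iamb": 3.0, "Trochee": 1.0}))  # -> "(σ ˈσ) σ"
```

The learner then updates its grammar as if the chosen parse were the observed datum, which is why the choice of hidden structure and the grammar learned are mutually dependent.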


Computational Modeling Of Learning Biases In Stress Typology, Robert D. Staubs Nov 2014

Doctoral Dissertations

This dissertation demonstrates a strong connection between the frequency of stress patterns and their relative learnability under a wide class of learning algorithms. These frequency results follow from hypotheses about the learner's available representations and the distribution of input data. Such hypotheses are combined with a model of learning to derive distinctions between classes of stress patterns, addressing frequency biases not modeled by traditional generative theory. I present a series of results for error-driven learners of constraint-based grammars. These results are shown both for single learners and learners in an iterated learning model. First, I show that with general n …
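The core error-driven step for a constraint-based learner can be shown in isolation: produce an output under the current weights, compare it with the observed datum, and on a mismatch shift each weight by the violation difference. The stress candidates, alignment-style constraints, and Harmonic Grammar update below are toy illustrations, not the dissertation's actual system.

```python
# Violation vectors over [Align-L, Align-R] for two toy stress candidates.
CANDIDATES = {
    "ˈσσσ": (0, 2),   # initial stress
    "σσˈσ": (2, 0),   # final stress
}

def produce(weights):
    """Deterministic production: the candidate with the lowest harmony."""
    return min(CANDIDATES,
               key=lambda c: sum(w * v for w, v in zip(weights, CANDIDATES[c])))

def update(weights, observed, rate=1.0):
    """Perceptron-style error-driven update toward the observed datum."""
    guess = produce(weights)
    if guess == observed:
        return weights
    g, o = CANDIDATES[guess], CANDIDATES[observed]
    return [w + rate * (gv - ov) for w, gv, ov in zip(weights, g, o)]

w = update([1.0, 2.0], observed="ˈσσσ")   # learner initially prefers final stress
print(produce(w))                          # -> "ˈσσσ"
```

Counting how many such updates different target patterns require, across many runs and under iterated transmission, is one way learnability differences translate into predicted frequency differences.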