Open Access. Powered by Scholars. Published by Universities.®

Syntax Commons

243 Full-Text Articles 195 Authors 51,692 Downloads 42 Institutions

All Articles in Syntax

243 full-text articles. Page 1 of 6.

Lulling Waters: A Poetry Reading For Real-Time Music Generation Through Emotion Mapping, Ashley Muniz, Toshihisa Tsuruoka 2020 New York University

Electronic Literature Organization Conference 2020

Through a poetic narrative, “Lulling Waters” tells the story of a whale overcoming the loss of his mother, who passed away from ingesting plastic, as he attempts to escape from the polluted oceanic world. The live performance of this poem utilizes a software system called Soundwriter, which was developed with the goal of enriching the oral storytelling experience through music. This video demonstrates how Soundwriter’s real-time hybrid system was able to analyze “Lulling Waters” through its lexical and auditory features. Emotionally salient words were given ratings based on arousal, valence, and dominance while the emotionally charged prosodic features of ...


Doing Away With Defaults: Motivation For A Gradient Parameter Space, Katherine Howitt 2020 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

In this thesis, I propose a reconceptualization of the traditional syntactic parameter space of the principles and parameters framework (Chomsky, 1981). In lieu of binary parameter settings, parameter values exist on a gradient plane where a learner’s knowledge of their language is encoded in their confidence that a particular parametric target value, and thus grammatical construction of an encountered sentence, is likely to be licensed by their target grammar. First, I discuss other learnability models in the classic parameter space which lack either psychological plausibility, theoretical consistency, or some combination of the two. Then, I argue for the Gradient ...
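The gradient-confidence idea can be illustrated with a toy reward-penalty update, a deliberately simplified sketch in the spirit of variational learning, not the model the thesis actually proposes. The parameter, feature names, and learning rate are invented for illustration.

```python
# Hypothetical sketch of a gradient parameter learner. The learner keeps a
# confidence value in [0, 1] for one binary parameter (say, head-final = 1)
# and nudges it toward whichever setting successfully parses the input.

def update_confidence(confidence, parses_with_target, rate=0.05):
    """Linear reward-penalty update over a single parameter."""
    if parses_with_target:
        return confidence + rate * (1.0 - confidence)  # move toward 1
    return confidence * (1.0 - rate)                   # move toward 0

# Toy run: mostly head-final-compatible input drives confidence upward.
c = 0.5
for success in [True, True, False, True, True]:
    c = update_confidence(c, success)
print(round(c, 3))
```

On this picture the learner's state is never a categorical setting, only a degree of confidence that the target grammar licenses a given value, which is the intuition the abstract describes.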


A Generative Approach To Oscan Syntax: Towards An Analysis Of The Conditional Construction, Jennifer McLish 2020 Washington University in St. Louis

Senior Honors Papers / Undergraduate Theses

This thesis presents an analysis of some aspects of the syntax of Oscan, a dead language from the Italic family, with a focus on the conditional construction. Drawing on modern approaches to the syntax of the related language Latin, I show that deviation from the default SOV word order of Oscan can be described in terms of discourse-marking focus and topic movement. Due to the frequent appearance of imperatives in conditional constructions, I address the syntax of imperatives in some detail. Applying current generative theories of the imperative to the Oscan consequent clause, I conclude that the Oscan imperative is ...


Optimal Linearization: Prosodic Displacement In Khoekhoegowab And Beyond, Leland Kusmer 2020 University of Massachusetts Amherst

Doctoral Dissertations

Understanding the relationship between syntactic structures and linear strings is a challenge for modern syntactic theories. The most complete and widely accepted models — namely, the Headedness Parameter and the Linear Correspondence Axiom (Kayne, 1994) — each capture aspects of this relationship, but are either too permissive or too restrictive: A Headedness Parameter relativized to individual categories permits nearly any linear order which keeps phrases contiguous, even those that violate the Final-Over-Final Constraint (Sheehan et al. 2017); by contrast, the Linear Correspondence Axiom is well-known for ruling out head-final configurations generally. Subsequent models of linearization have typically been modifications of one of ...


Copulative Compounds Formed By Prepositional Interfixes In Persian Language, Nuriddinov Nodir Mr 2020 Tashkent state institute of oriental studies

Scientific Bulletin of Namangan State University

Copulative compounds are compound words formed by lexicalization, whose components are semantically equal. The article also discusses the term "copulative compound" and its treatment in the scientific works of Russian and Iranian linguists. Drawing on examples collected from sources, a structural-semantic analysis of copulative compounds formed by prepositional interfixes is carried out.


Computational Approaches To The Syntax–Prosody Interface: Using Prosody To Improve Parsing, Hussein M. Ghaly 2020 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

Prosody has strong ties with syntax, since prosody can be used to resolve some syntactic ambiguities. Syntactic ambiguities have been shown to negatively impact automatic syntactic parsing, hence there is reason to believe that prosodic information can help improve parsing. This dissertation considers a number of approaches that aim to computationally examine the relationship between prosody and syntax of natural languages, while also addressing the role of syntactic phrase length, with the ultimate goal of using prosody to improve parsing.

Chapter 2 examines the effect of syntactic phrase length on prosody in double center embedded sentences in French. Data collected ...


Knowledge Of The Present Perfect By Albanian/English Bilinguals, Erjon Xholi 2020 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

This paper concerns the acquisition process of a specific part of English grammar by native speakers of Albanian. The focus is the English present perfect, and the similarities and differences that it bears to the Albanian Compound Perfective. The two constructions are made from similar parts, but they crucially differ in the aspectual nature of their participles. While the Albanian participle is perfective, the English one is underspecified. We argue that the process of the acquisition of the PP by Albanian bilinguals is one where input, analogy, and direct grammar teaching do not suffice. We apply Generative Grammar logic to the ...


Mg Parsing As A Model Of Gradient Acceptability In Syntactic Islands, Aniello De Santo 2020 Stony Brook University

Proceedings of the Society for Computation in Linguistics

It is well-known that the acceptability judgments at the core of current syntactic theories are continuous. However, an open debate is whether the source of such gradience is situated in the grammar itself, or can be derived from extra-grammatical factors. In this paper, we propose the use of a top-down parser for Minimalist grammars (Stabler, 2013; Kobele et al., 2013; Graf et al., 2017), as a formal model of how gradient acceptability can arise from categorical grammars. As a test case, we target the acceptability judgments for island effects collected by Sprouse et al. (2012a).


Assessing The Ability Of Transformer-Based Neural Models To Represent Structurally Unbounded Dependencies, Jillian K. Da Costa, Rui P. Chaves 2020 University at Buffalo

Proceedings of the Society for Computation in Linguistics

Filler-gap dependencies are among the most challenging syntactic constructions for computational models at large. Recently, Wilcox et al. (2018) and Wilcox et al. (2019b) provide some evidence suggesting that large-scale general-purpose LSTM RNNs have learned such long-distance filler-gap dependencies. In the present work we provide evidence that such models learn filler-gap dependencies only very imperfectly, despite being trained on massive amounts of data. Finally, we compare the LSTM RNN models with more modern state-of-the-art Transformer models, and find that these have poor-to-mixed degrees of success, despite their sheer size and low perplexity.
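The standard probe in this line of work (Wilcox et al.) is a "wh-licensing interaction": surprisal at the gap region is compared across a 2x2 design crossing the presence of a wh-filler with the presence of a gap. A minimal sketch of the arithmetic, with invented surprisal values rather than any model's actual outputs:

```python
# Sketch of the wh-licensing interaction used to probe filler-gap
# knowledge in language models. A model that has learned the dependency
# should show a positive (super-additive) interaction: a gap is less
# surprising when a wh-filler is present to license it.

import math

def surprisal(p):
    """Surprisal in bits of an event with probability p."""
    return -math.log2(p)

def licensing_interaction(s_nofill_gap, s_nofill_nogap,
                          s_fill_gap, s_fill_nogap):
    """(gap cost without a filler) minus (gap cost with a filler)."""
    return (s_nofill_gap - s_nofill_nogap) - (s_fill_gap - s_fill_nogap)

# Toy numbers: the gap region is much cheaper when a wh-filler is present.
effect = licensing_interaction(9.0, 4.0, 5.0, 4.5)
print(effect)  # (9.0 - 4.0) - (5.0 - 4.5) = 4.5
```

The abstract's claim is that under closer inspection this interaction pattern is present only imperfectly in both LSTM and Transformer models.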


Modeling The Learning Of The Person Case Constraint, Adam Liter, Naomi H. Feldman 2020 University of Maryland, College Park

Proceedings of the Society for Computation in Linguistics

Many domains of linguistic research posit feature bundles as an explanation for various phenomena. Such hypotheses are often evaluated on their simplicity (or parsimony). We take a complementary approach. Specifically, we evaluate different hypotheses about the representation of person features in syntax on the basis of their implications for learning the Person Case Constraint (PCC). The PCC refers to a phenomenon where certain combinations of clitics (pronominal bound morphemes) are disallowed with ditransitive verbs. We compare a simple theory of the PCC, where person features are represented as atomic units, to a feature-based theory of the PCC, where person features ...


The Role Of Information Theory In Gap-Filler Dependencies, Gregory Kobele, Linyang He, Ming Xiang 2020 University of Leipzig

Proceedings of the Society for Computation in Linguistics

The current study examines how formal grammar and information-theoretic complexity metrics can combine to account for processing cost incurred during incremental sentence comprehension. To this end, we modeled the eye-movement reading measures obtained from an experiment on the wh-in-situ construction in Mandarin Chinese. Framing our syntactic analysis in minimalist grammars, we obtained estimates of grammatical choice-point probabilities from the Penn Chinese Tree Bank, and derived values for two different complexity metrics, surprisal and entropy reduction, at each word of the target sentences. Both metrics accounted for a small but significant amount of the eye-movement data.
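The two metrics the abstract names have standard information-theoretic definitions, sketched below on toy next-step probability distributions (not the paper's treebank-derived estimates):

```python
# Surprisal and entropy reduction, the two word-by-word complexity
# metrics named in the abstract, computed from toy distributions.

import math

def surprisal(p_word):
    """Surprisal in bits: -log2 P(word | prefix)."""
    return -math.log2(p_word)

def entropy(dist):
    """Shannon entropy in bits of a distribution over continuations."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def entropy_reduction(prev_dist, next_dist):
    """Drop in uncertainty after reading a word, floored at zero."""
    return max(0.0, entropy(prev_dist) - entropy(next_dist))

before = [0.25, 0.25, 0.25, 0.25]   # 2 bits of uncertainty
after = [0.5, 0.5]                  # 1 bit remains
print(surprisal(0.25), entropy_reduction(before, after))  # 2.0 1.0
```

Surprisal tracks how unexpected the current word is; entropy reduction tracks how much the word narrows down the remaining parse choices.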


A Closer Look At The Performance Of Neural Language Models On Reflexive Anaphor Licensing, Jennifer Hu, Sherry Y. Chen, Roger P. Levy 2020 Massachusetts Institute of Technology

Proceedings of the Society for Computation in Linguistics

An emerging line of work uses psycholinguistic methods to evaluate the syntactic generalizations acquired by neural language models (NLMs). While this approach has shown NLMs to be capable of learning a wide range of linguistic knowledge, confounds in the design of previous experiments may have obscured the potential of NLMs to learn certain grammatical phenomena. Here we re-evaluate the performance of a range of NLMs on reflexive anaphor licensing. Under our paradigm, the models consistently show stronger evidence of learning than reported in previous work. Our approach demonstrates the value of well-controlled psycholinguistic methods in gaining a fine-grained understanding of ...


What Don't Rnn Language Models Learn About Filler-Gap Dependencies?, Rui P. Chaves 2020 University at Buffalo

Proceedings of the Society for Computation in Linguistics

In a series of experiments, Wilcox et al. (2019,2019) provide evidence suggesting that general-purpose state-of-the-art LSTM RNN language models have not only learned English filler-gap dependencies, but also some of their associated 'island' constraints (Ross 1967). In the present paper, I cast doubt on such claims, and argue that upon closer inspection filler-gap dependencies are learned only very imperfectly, including their associated island constraints. I conjecture that the LSTM RNN models in question have more likely learned some surface statistical regularities in the dataset rather than higher-level abstract generalizations about the linguistic mechanisms underlying filler-gap constructions.


Binding And Coreference In Vietnamese, Thuy Bui 2019 University of Massachusetts Amherst

Doctoral Dissertations

This dissertation investigates the real-time comprehension and final interpretation of object pronouns in Vietnamese, a language in which reflexive and non-reflexive pronominal forms have overlapping meanings. It addresses the questions of whether and how Principle B is applied as a structural constraint to determine the appropriate antecedent for pronouns in the language. The central argument is that Vietnamese speakers rely on two distinct mechanisms to resolve anaphoric relations: Within a pronoun's local domain, even though coreference is highly permissive, binding is strictly prohibited. Results from three two-alternative forced choice and three self-paced reading experiments show consistent profiles for both ...


Computing Agreement In A Mixed System, Sakshi Bhatia 2019 University of Massachusetts Amherst

Doctoral Dissertations

This dissertation develops a comprehensive response to the question of how agreement is computed in Hindi-Urdu – a language with a mixed agreement system where the verb can agree with a subject or an object depending on the structural context. This dissertation covers new empirical and theoretical ground in two domains. First, I identify three kinds of atypical agreement patterns which are not accounted for under traditional approaches to Hindi-Urdu agreement -- verb agreement with the nominal component of Noun-Verb complex predicates, long distance agreement of embedding Adjective-Verb predicates with embedded infinitive clause objects, and copular agreement in identity copula structures. Second, I ...


Spanish Nominalizations And Case Assignment, Dr. Jeff Renaud, Tania Leal 2019 Augustana College, Rock Island Illinois

Celebration of Learning

Nominalizations are syntactic structures wherein verbal roots co-occur with verbal and nominal properties, classifying them as verbal (VN) (El andar el niño tan tarde) or nominal (NN) (El andar errabundo del niño). While NNs mark agents genitive (del niño), VNs require nominative agents (el niño). NNs co-occur with adjectives (errabundo), whereas VNs co-occur with adverbs (tan tarde). Alexiadou et al. (2011) posit separate syntactic structures for the two. In this study, we investigate via a self-paced reading task the types of case available in each structure, providing evidence of the processing of Spanish nominalizations and testing Alexiadou et al.'s (2011 ...


Cue-Based Reflexive Reference Resolution: Evidence From Korean Reflexive Caki, Namseok Yong 2019 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

This dissertation aims to reveal cognitive mechanisms and factors that underlie reflexive dependency formation. In recent years, a lot of attention has been paid to the question of how our mind works in building linguistic dependencies (including an antecedent-reflexive dependency) because relevant research has proved promising and illuminating in regard to the properties (e.g., system architecture, computational algorithms, etc.) of the human language processor and its close connection with other cognitive functions such as memory (Lewis & Vasishth, 2005; Lewis, Vasishth, & Van Dyke, 2006; McElree, 2000; McElree, Foraker, & Dyer, 2003; Van Dyke & Johns, 2012; Wagers, Lau, & Phillips, 2009). Building upon this line of research, the present dissertation provides empirical evidence to show that the parser can directly access potential antecedents (stored in memory) in forming an antecedent-reflexive dependency, using various linguistic cues and contextual knowledge available at the reflexive.

In order to make this claim, this dissertation examines the Korean mono-morphemic reflexive caki ‘self’ (also known as a long-distance anaphor), using acceptability judgment and self-paced reading methodologies, and asks (i) what linguistic factors guide its reference resolution and (ii) how they are applied to cognitive processes for memory retrieval and phrase structure building.

A series of acceptability judgment experiments (Experiments 1 through 5) show that caki has a very robust referential bias: it strongly prefers a subject antecedent. Moreover, it is established that syntactic constraints (e.g., binding constraints) are not the only available source of information during caki’s reference resolution. Indeed, various non-syntactic sources of information (or cues) can also determine caki’s reference resolution. Three self-paced reading experiments (Experiments 6 through 8) provide evidence compatible with the direct-access content-addressable memory retrieval model (Lewis & Vasishth, 2005; Lewis et al., 2006; McElree, 2000; Van Dyke & McElree, 2011).

Based on these experimental findings, I present an explanation of why caki preferentially forms a dependency with a subject antecedent. I argue that caki’s subject antecedent bias is driven both externally (i.e., syntactic prominence of a grammatical subject and first-mention advantage) and internally (i.e., frequency-based prediction on caki-subject dependency relation). Finally, I showcase how a referential dependency between caki and a potential antecedent can be constructed by the cue-based retrieval parser (Lewis et al., 2006; Van Dyke & Lewis, 2003).
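The core idea of cue-based, content-addressable retrieval can be sketched in a few lines: candidate antecedents held in memory are scored in parallel against the retrieval cues projected at the reflexive, and the best match is accessed directly rather than found by serial search. The feature names and items below are invented for illustration, not taken from the dissertation's materials.

```python
# Illustrative sketch of cue-based (content-addressable) retrieval:
# each memory item is scored by how many retrieval cues it matches.

def retrieve(cues, memory):
    """Return the item whose features match the most cues."""
    def score(item):
        return sum(1 for k, v in cues.items() if item.get(k) == v)
    return max(memory, key=score)

memory = [
    {"word": "teacher", "subject": True,  "animate": True},
    {"word": "book",    "subject": False, "animate": False},
]
cues = {"subject": True, "animate": True}  # cues projected at the reflexive
print(retrieve(cues, memory)["word"])  # teacher
```

A subject-antecedent bias like caki's falls out naturally if subjecthood is a heavily weighted retrieval cue, which is one way to read the dissertation's externally driven explanation.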


Heritage Speaker And Late Bilingual L2 Relative Clause Processing And Language Dominance Effects, LeeAnn S. Stevens 2019 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

Traditionally, heritage speakers are recognized as a heterogeneous group whose skills in their heritage language are unlike those of monolinguals or L2 learners of that language. Indeed, much evidence confirms the cognitive and linguistic uniqueness of this population. However, highly proficient heritage speakers may pattern more similarly to another bilingual population than typically assumed: first-generation late bilinguals.

The present study examines group-level processing differences between Spanish heritage speakers and Spanish-English late bilinguals in English, the second-learned and current societal majority language of these populations. Dominance is also analyzed as a possible effect of group processing differences, since traditionally and definitionally ...


Prepositional Phrase Attachment Ambiguities In Declarative And Interrogative Contexts: Oral Reading Data, Tyler J. Peckenpaugh 2019 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

Certain English sentences containing multiple prepositional phrases (e.g., She had planned to cram the paperwork in the drawer into her briefcase) have been reported to be prone to mis-parsing of a kind that is standardly called a “garden path.” The mis-parse stems from the temporary ambiguity of the first prepositional phrase (PP1: in the drawer), which tends to be interpreted initially as the goal argument of the verb cram. If the sentence ended there, that would be correct. But that analysis is overridden when the second prepositional phrase (PP2: into her briefcase) is encountered, since the into phrase can ...


Do It Like A Syntactician: Using Binary Gramaticality Judgements To Train Sentence Encoders And Assess Their Sensitivity To Syntactic Structure, Pablo Gonzalez Martinez 2019 The Graduate Center, City University of New York

All Dissertations, Theses, and Capstone Projects

The binary nature of grammaticality judgments and their use to access the structure of syntax are a staple of modern linguistics. However, computational models of natural language rarely make use of grammaticality in their training or application. Furthermore, developments in modern neural NLP have produced a myriad of methods that push the baselines in many complex tasks, but those methods are typically not evaluated from a linguistic perspective. In this dissertation I use grammaticality judgements with artificially generated ungrammatical sentences to assess the performance of several neural encoders and propose them as a suitable training target to make models learn ...

