This series aims to publish high-quality books that contribute to a better understanding of phenomena in morphology, syntax, and the interplay between the two. Since the recent history of linguistics has shown that basic assumptions may change rather quickly, the ideal submission to this series is one that separates the data discussion, which results in pre-theoretic generalizations, from the analysis. Such a separation ensures that published books will retain value even if the linguistic framework is radically changed or becomes irrelevant one day. Separating the description of the phenomenon from the analysis also ensures that readers who are not experts in more formal frameworks will find the books useful. Such a separation is not a strict precondition for acceptance in this series, but rather a suggestion. In any case, the work should reflect a good understanding of the data and provide new insights.
The data discussion should be theoretically informed: positive and negative examples appearing in the literature on the phenomena discussed should be included, regardless of the specific framework used in the publication. Controversial examples should ideally be backed up by experimental or corpus data.
We invite submissions from all frameworks and theoretical schools. Thorough formalization of the kind known, for example, from work in HPSG, LFG, or TAG is not required but is highly welcome. Work that is backed up by a computer-processable implementation is also eligible to be published in the subseries Implemented Grammars.
The volume discusses the breadth of applications for an extended notion of paradigm. Paradigms in this sense are not only tools of morphological description but constitute the inherent structure of grammar. Grammatical paradigms are structural sets forming holistic, semiotic structures with an informational value of their own. We argue that as such, paradigms are a part of speaker knowledge and provide necessary structuring for grammaticalization processes. The papers discuss theoretical as well as conceptual questions and explore different domains of grammatical phenomena, ranging from grammaticalization, morphology, and cognitive semantics to modality, aiming to illustrate what the concept of grammatical paradigms can and cannot (yet) explain.
Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).
This book addresses the complexity of the Russian verbal prefixation system, which has been extensively studied but not yet fully explained. Traditionally, different meanings have been investigated and listed in dictionaries and grammars; more recently, linguists have attempted to unify various prefix usages under more general descriptions. The existing semantic approaches, however, do not aim to use semantic representations to account for the problems of prefix stacking and aspect determination. This task has so far been undertaken by syntactic approaches to prefixation, which divide verbal prefixes into classes and limit complex verb formation by restricting the structural positions available to the members of each class. I show that these approaches have two major drawbacks: the implicit prediction of the non-existence of complex biaspectual verbs and the absence of uniformly accepted formal criteria for the underlying prefix classification. In this book the reader will find an implementable formal semantic approach to prefixation that covers five prefixes: za-, na-, po-, pere-, and do-. It is shown how to predict the existence, semantics, and aspect of a given complex verb by combining an LTAG with frame semantics. The task of identifying the possible affix combinations is distributed across three modules: syntax, which is kept simple (only basic structural assumptions); frame semantics, which ensures that the constraints are respected; and pragmatics, which rules out some prefixed verbs and restricts the range of available interpretations. To evaluate the theory, an implementation of the proposed analysis for a grammar fragment using a metagrammar description is provided. It is shown that the proposed analysis delivers more accurate and complete predictions with respect to the existence of complex verbs than the most precise syntactic account.
The standard view of the form-meaning interfaces, as embraced by the great majority of contemporary grammatical frameworks, consists in the assumption that meaning can be associated with grammatical form in a one-to-one correspondence. Under this view, composition is quite straightforward, involving concatenation of form, paired with functional application in meaning. In this book, we discuss linguistic phenomena across several grammatical sub-modules (morphology, syntax, semantics) that apparently pose a problem to the standard view, mapping out the potential for deviation from the ideal of one-to-one correspondences, and develop formal accounts of the range of phenomena. We argue that a constraint-based perspective is particularly apt to accommodate deviations from one-to-one correspondences, as it allows us to impose constraints on full structures (such as a complete word or the interpretation of a full sentence) instead of deriving such structures step by step.
Most of the papers in this volume are formulated in a particular constraint-based grammar framework, Head-driven Phrase Structure Grammar. The contributions investigate how the lexical and constructional aspects of this theory can be combined to provide an answer to this question across different linguistic sub-theories.
As in many other languages of the world, the definite article in German developed out of an adnominally used demonstrative. The present work reconstructs this functional change, which took place mainly in the Old High German period (750–1050 AD), for the first time computationally and with corpus-linguistic methods, on the basis of the five largest Old High German textual monuments from the Referenzkorpus Altdeutsch. The development of the definite article is understood as a constructionalization of the structure [dër + N]: the original demonstrative dër loses its deictic meaning and opens up new contexts of use in which the unambiguous identifiability of the referent is guaranteed even independently of the speech situation. The work shows that this context expansion is decisively influenced by the cognitive-linguistic category of animacy.
The organization of the lexicon, and especially the relations between groups of lexemes, is a strongly debated topic in linguistics. Some authors have insisted on the lack of any structure in the lexicon. In this vein, Di Sciullo & Williams (1987: 3) claim that “[t]he lexicon is like a prison – it contains only the lawless, and the only thing that its inmates have in common is lawlessness”. In the alternative view, the lexicon is assumed to have a rich structure that captures all regularities and partial regularities that exist between lexical entries. Two very different schools of linguistics have insisted on the organization of the lexicon. On the one hand, for theories like HPSG (Pollard & Sag 1994), but also some versions of construction grammar (Fillmore & Kay 1995), the lexicon is assumed to have a very rich structure which captures common grammatical properties between its members. In this approach, a type hierarchy organizes the lexicon according to common properties between items. For example, Koenig (1999: 4, among others), working from an HPSG perspective, claims that the lexicon “provides a unified model for partial regularities, medium-size generalizations, and truly productive processes”. On the other hand, from the perspective of usage-based linguistics, several authors have drawn attention to the fact that lexemes which share morphological or syntactic properties tend to be organized in clusters of surface (phonological or semantic) similarity (Bybee & Slobin 1982; Skousen 1989; Eddington 1996). This approach, often called analogical, has developed highly accurate computational and non-computational models that can predict the classes to which lexemes belong. Like the organization of lexemes in type hierarchies, analogical relations between items help speakers to make sense of intricate systems and reduce apparent complexity (Köpcke & Zubin 1984).
Despite this core commonality, and despite the fact that most linguists seem to agree that analogy plays an important role in language, there has been remarkably little work on bringing together these two approaches. Formal grammar traditions have been very successful in capturing grammatical behaviour, but, in the process, have downplayed the role analogy plays in linguistics (Anderson 2015). In this work, I aim to change this state of affairs. First, by providing an explicit formalization of how analogy interacts with grammar, and second, by showing that analogical effects and relations closely mirror the structures in the lexicon. I will show that both formal grammar approaches, and usage-based analogical models, capture mutually compatible relations in the lexicon.
After being dominant for about a century following its invention by Baudouin de Courtenay at the end of the nineteenth century, the morpheme is increasingly being replaced by the lexeme in contemporary descriptive and theoretical morphology.
The notion of a lexeme is usually associated with the work of P. H. Matthews (1972, 1974), who characterizes it as a lexical entity abstracting over individual inflected words. Over the last three decades, the lexeme has become a cornerstone of much work in both inflectional morphology and word formation (or, as it is increasingly being called, lexeme formation). The papers in the present volume take stock of the descriptive and theoretical usefulness of the lexeme, but also address many of the challenges met by classical lexeme-based theories of morphology.
On Looking into Words is a wide-ranging volume spanning current research into word structure and morphology, with a focus on historical linguistics and linguistic theory. The papers are offered as a tribute to Stephen R. Anderson, the Dorothy R. Diebold Professor of Linguistics at Yale, who is retiring at the end of the 2016-2017 academic year. The contributors are friends, colleagues, and former students of Professor Anderson, all important contributors to linguistics in their own right. As is typical for such volumes, the contributions span a variety of topics relating to the interests of the honorand. In this case, the central contributions that Anderson has made to so many areas of linguistics and cognitive science, drawing on synchronic and diachronic phenomena in diverse linguistic systems, are represented through the papers in the volume.
The 26 papers that constitute this volume are unified by their discussion of the interplay between synchrony and diachrony, theory and empirical results, and the role of diachronic evidence in understanding the nature of language. Central concerns of the volume include morphological gaps, learnability, increases and declines in productivity, and the interaction of different components of the grammar. The papers deal with a range of linked synchronic and diachronic topics in phonology, morphology, and syntax (in particular, cliticization), and their implications for linguistic theory.
In this book, I propose a grammar fragment which accounts for the main properties of two elliptical constructions, both lacking a verbal head: gapping (1a) and verbless relative adjuncts (henceforth VRA) (1b).
(1) a. Jean aime les pommes [et Marie les bananes]. 'Jean likes apples and Marie bananas'
b. Plusieurs personnes sont venues cette semaine, [dont Marie (hier)]. 'Several people have come this week, among them Marie (yesterday)'
We argue that elliptical clauses in gapping and VRA constructions do not behave like regular verbal clauses. Their syntactic and semantic properties do not provide evidence for deriving this kind of elliptical clause from a complete clause; an analysis in terms of syntactic reconstruction of the missing material is therefore inadequate. Thus, the elliptical clause in both constructions (gapping and VRA) has a specific syntactic behaviour and must be assigned an independent status in the grammar, more precisely the status of a fragmentary clause, that is, a syntactic unit having a propositional content of message type but an incomplete syntax. This dissertation gives new arguments in favor of a semantic reconstruction with parallelism constraints, cf. Ginzburg & Sag (2000), Culicover & Jackendoff (2005).
This work investigates the relationship between syntactic models and lexical valency properties, using the family of Tree Adjoining Grammars (TAG) and the phenomenal domains of coherence and ellipsis. Like most prominent syntactic models, TAG amalgamates syntax and valency, which often leads to idealizations of realization. It is shown, however, that TAG avoids certain realization idealizations and can directly represent discontinuity in coherent constructions; that TAG, despite this and despite its considerably restricted expressive power compared to GB, LFG, and HPSG, can nevertheless be used for a linguistically meaningful analysis of coherent constructions; and that the TAG derivation tree provides a sufficiently informative point of reference for the indirect modelling of gapping. Finally, for the direct representation of gapping structures, a tree-based syntactic model, STUG, is proposed, in which syntax and valency are separated but linked.