A Theory of Individual-Level Predicates Based on Blind Mandatory Implicatures: Constraint Promotion for Optimality Theory

Magri, Giorgio, 2009


Part I of this dissertation proposes an implicature-based theory of individual-level predicates. The idea is that we cannot say '#John is sometimes tall' because the sentence triggers the scalar implicature that the alternative 'John is always tall' is false, and this implicature mismatches with the piece of common knowledge that tallness is a permanent property. Chapter 1 presents the idea informally. The idea faces two challenges. First, this scalar implicature must be mandatory and, furthermore, blind to common knowledge. Second, it is not clear how the idea extends to other properties of individual-level predicates. Chapter 2 makes sense of the surprising nature of these special mismatching implicatures within the recent grammatical framework for scalar implicatures of Chierchia (2004) and Fox (2007a). Chapter 3 shows how this implicature-based account extends to other properties of individual-level predicates, such as restrictions on their bare plural subjects, on German word order and extraction, and on Q-adverbs.

Part II of this dissertation develops a theory of update rules for the OT on-line algorithm that perform constraint promotion in addition to demotion. Chapter 4 explains why constraint promotion is needed, by arguing that demotion-only update rules are unable to model Hayes' (2004) early stage of the acquisition of phonology. Chapter 5 shows how to obtain constraint promotion by means of two different techniques. One technique shares the combinatorial flavor of Tesar and Smolensky's analysis of demotion-only update rules; the other adapts to OT results from the theory of on-line algorithms for linear classification. The latter technique has various consequences of independent interest, explored in Chapter 8. Chapters 6 and 7 begin the investigation of the properties of update rules that perform promotion as well, concentrating on the characterization of the final vector and on the number of updates.
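The kind of error-driven update rule at issue in Part II can be illustrated with a minimal sketch, assuming numeric ranking values for constraints. The function name, data layout, and the particular calibration of the promotion amount below are illustrative assumptions, not the dissertation's own formulation:

```python
def update(ranking, winner_preferring, loser_preferring, rate=1.0):
    """One error-driven OT update with both promotion and demotion.

    ranking           -- dict: constraint name -> numeric ranking value
    winner_preferring -- constraints assigning fewer violations to the
                         intended winner than to the learner's wrong guess
    loser_preferring  -- constraints assigning fewer violations to the
                         learner's wrong guess than to the intended winner
    """
    # Demote every loser-preferring constraint, as demotion-only
    # update rules already do.
    for c in loser_preferring:
        ranking[c] -= rate
    # Additionally promote every winner-preferring constraint.
    # Dividing the promotion amount by the number of promoted
    # constraints is one illustrative way of keeping the promotion
    # small relative to the demotion.
    if winner_preferring:
        step = rate / len(winner_preferring)
        for c in winner_preferring:
            ranking[c] += step
    return ranking


# Hypothetical example: constraints A and B prefer the winner,
# constraint C prefers the loser.
ranking = {"A": 0.0, "B": 0.0, "C": 0.0}
update(ranking, winner_preferring=["A", "B"], loser_preferring=["C"])
```

After this single update, C drops by the full rate while A and B each rise by half of it, so the relative ranking of winner-preferring over loser-preferring constraints increases from both directions, which is exactly what a demotion-only rule cannot do.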