THE RELATIONSHIP BETWEEN GRAMMAR AND COGNITION

by

RON CARTER
B.A., The University of British Columbia, 1991

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (Department of English)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
April 1995
© Ron Carter, 1995

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of English
The University of British Columbia
Vancouver, Canada

ABSTRACT

This thesis is an attempt to link basic cognitive processes to attested historical developments in the English language, and, in so doing, to arrive at a plausible, natural theory of grammar that accounts for the form of the language at any stage in its history. The main argument is that relational morphemes such as case inflections and prepositions always derive their meaning from concrete object schemas that develop pre-linguistically from our experience with the world in relation to our bodies and our intentional states. Evidence is drawn from linguistic investigations into case that have served as the catalyst for the discussion about how pre-linguistic categorization affects language structure, cognitive (Langacker) and experientialist (Lakoff) orientations to grammatical structure that take the insights of case grammar and reconcile them with research in cognitive psychology (Rosch), and artificial intelligence research (Parallel Distributed Processing) involving the computer modeling of neural functions. The conclusion is that pre-linguistic relational schemas, and therefore spatial cognitive function, provide a template for grammatical relationships, and that the computer modeling of neural function supports such a conclusion.

TABLE OF CONTENTS

Abstract
Table of Contents
List of Figures
Acknowledgement
Chapter 1: Introduction
Chapter 2: Case Grammar
  2.1 Gruber
  2.2 Fillmore
  2.3 Anderson
Chapter 3: Cognitive Grammar and Categorization
  3.1 Case Grammar and Cognitive Grammar
  3.2 Prototypes and Schemas
  3.3 Child Language Acquisition, Prototypes, and Grammaticalization
  3.4 Grammar and Concrete Experience
  3.5 Object as a Concept that Informs Grammar
  3.6 Case Relations Follow Concrete Schemas
  3.7 The Figure/Ground Schema
  3.8 Prepositions and Conventionalized Object Schemas
  3.9 Grammatical Relations and Spatial Relations
Chapter 4: Parallel Distributed Processing
  4.1 The Micro-Structural Account
  4.2 Connectionist PDP Account of Categorization
  4.3 Learning
  4.4 Networks and Language
Chapter 5: Conclusion
References
Appendix 1: Glossary of Terms

LIST OF FIGURES

Figure 1: Tree Diagram
Figure 2: Network
Figure 3: Fractal Diagram

ACKNOWLEDGEMENT

I would like to thank Leslie Arnovick for encouraging me to go on, Laurel Brinton, my supervisor, for her patience, advice, and theoretical insights in the area of grammaticalization, and John Cooper for agreeing to read.
Most of all, I would like to thank my wife Susan Henderson for not allowing me to give up, and for typing, proofing, reading and being enthusiastic about everything I have ever written (good or bad).

CHAPTER 1
Introduction

This is a paper about the relationship between physical objects, space, and time. However, it is not a companion to the general theory of relativity; rather, it is an investigation into the contribution that cognitive function in concert with bodily experience makes to our conception of the world, and, specifically, the contribution that visuo-spatial cognitive abilities make to the structure of English grammar. In this paper I will take the position that grammar is a development from, and an abstraction from, the experiences we have as embodied cognizers acting in a concrete physical setting (Johnson, 1987). The organization of language should reflect both the cognitive imperative to categorize experience (Rosch, 1978) and the practical imperative to organize information in such a way that it can be successfully communicated. Therefore, grammatical morphemes should be those linguistic elements that organize the nameable things of the world into relational patterns that can be used over and over again within the communicative contexts in which speakers and hearers find themselves.

I will draw heavily upon insights gained in some of the less well known areas of investigation into cognition, including cognitive grammar, cognitive psychology and the computer modelling of neural functions. Explanation in these areas of research depends upon the idea that human beings function in the world by clumping together hitherto undifferentiated experiences as symbols which can be used to represent those aspects of the environment that have direct significance for human needs, either social or physical, and that the primary experiences of all people are visual-kinetic spatial experiences.

Although there is a body of opinion that questions whether such a thing as grammar exists as an entity in language separable from situational factors and the communicative goals of speakers (Hopper 1979, 1982), most writers tend to place grammatical items within a continuum of meaning that ranges from fully lexical content words to grammatical morphemes, the meanings of which are limited to restricting in some way the meaning or associative potential of content words. Between these two poles there exists a range of uses or meanings that can be more or less "grammatical" depending on the use to which items are put. From the cognitive grammar perspective, for instance, Langacker claims that "the grammar of a language simply provides its speakers with an inventory of symbolic resources . . . describable by means of symbolic units alone, with lexicon, morphology, and syntax forming a continuum of symbolic structures" (1988:5). Rather than being considered a separate module of language (Chomsky, 1965), grammar is thought of as just one of several kinds of symbolizing strategies that work together according to the communicative purposes of the speaker and hearer. Talmy, in a similar vein, proposes that language consists of two interconnected subsystems, lexicon and grammar, each having a complementary semantic function with respect to the other: "The grammatical specifications in a sentence . . . provide a conceptual framework, or, imagistically, a skeletal structure or scaffolding, for the conceptual material that is lexically specified" (1988:165).
The grammatical elements of a language are meaningful because they provide the signs that coordinate the concepts represented by the content words of the language. Grammatical elements form a "cross-linguistically select set of... concepts" (1988:166). They tend to be topological, or relativistic, and to exclude absolute measures either of size, shape, or distance. The analogy here to a physical concept such as "relative placement of objects" will be especially relevant later.

I said that this paper is about objects, space, and time; but these things cannot be approached directly, because they are not readily separable from our speech that encodes the concepts themselves. The very use of the words object, space, and time lends to the concepts an a priori status that belies their nature. Each word, in its turn, represents a concept derived from the complex interaction of the human perceptual system and the environment. In other words, these words do not gain their meaning from a dictionary definition, which gives the impression that concepts and facts exist separately in the world, but from their use within contexts (Peirce, 1931). We first need a question, then, based upon an observed phenomenon in language, the answer to which may provide some insight into the larger question of how objects, space, and time are related to one another in the domain that we experience as language.

Frequently it has been asserted in the literature on the history of English that Old English originally had a fully operational system of case inflections to mark grammatical relations among nouns and noun-like phrases (nominals). Although there is a certain degree of uncertainty attached to the statement that such a condition existed in Old English, because written evidence does not exist prior to about 600 A.D., the state of the inflectional system in extant Old English and late Old English allows us to conjecture that pre-historic Old English, the period prior to the date of our surviving manuscripts, had a fully inflected case system. The Old English that we have access to, though, has many case syncretisms, or amalgamations of cases, that formerly would have had separate morphological expression. Comrie (1991:45) explains that, where case syncretism occurs, the cases that historically were formally or morphologically distinct can be identified if there remains some formal marking in some of the case paradigms. If, for instance, nominative and accusative are indistinguishable in one paradigm, but are formally separate in another, such as in the pronouns, he claims that fact in itself is sufficient evidence to ascribe a "distributional" case distinction despite the lack of formal distinction within one or more case paradigms.

During the Old English period of case syncretization, prepositional phrases and word order became more common as a means of distinguishing formerly inflectional grammatical cases. Historically, in European languages, prepositions have been recruited from the word stock, or lexicon, by a process that has been recognized at least since 1816, when Franz Bopp noted that lexical items developed into "auxiliaries, affixes and finally inflections" (Heine & Claudi, 1991); this process of change from less to more grammatical in function is now called grammaticalization. It turns out, however, that only certain kinds of lexical items are recruited to serve the grammatical function of disambiguating what formerly were distinct morphological cases.
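Comrie's criterion can be put in procedural terms: two cases count as distributionally distinct in a language so long as at least one paradigm still marks them with different forms. The sketch below is only a toy illustration of that test; the mini-paradigms are schematic Old English-style forms chosen for the example rather than data discussed in this thesis.

```python
# Toy version of Comrie's criterion for recognizing syncretized cases:
# nominative and accusative are "distributionally" distinct if any paradigm
# still keeps a formal (morphological) contrast between them.
PARADIGMS = {
    "strong_noun": {"nominative": "stan", "accusative": "stan"},   # syncretized
    "pronoun_3sg": {"nominative": "he",   "accusative": "hine"},   # still distinct
}

def distributionally_distinct(case_a: str, case_b: str) -> bool:
    """True if at least one paradigm marks the two cases with different forms."""
    return any(p[case_a] != p[case_b] for p in PARADIGMS.values())

print(distributionally_distinct("nominative", "accusative"))  # True: the pronouns keep the contrast
```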
Prepositions began in late Old English to take over much of the functional weight formerly borne by case markers in early Old English. But the cases that began to take prepositional marking in addition to their syncretized case inflection tended to be ones that had already used prepositions as intensifiers or double markers of case. Cases such as "oblique", encompassing the semantic roles of instrument, cause, accompaniment, location and agent, the genitive, and the dative, encompassing beneficiary and goal, were all candidates for prepositional marking (Heine & Claudi, 1991:61). What are commonly known as the central cases in the information structure of the clause, nominative and accusative, did not assume prepositional marking concomitant with syncretism or loss of case marking but, instead, took on as a grammatical marker the already highly frequent practice of a strict linear order: nominative, verb, accusative (S V O). Luraghi (1991) believes that because nominative and accusative are the central arguments in any clause their higher frequency of use has diminished their dependence on overt morphological case markers to indicate their grammatical status within the clause. The nominative and accusative are also "governed" by the verb more directly than are the oblique, dative, or genitive. Since the non-core cases are peripheral to the central arguments of the clause, they need to be governed by some kind of "verb-like" marker in order to establish their relationship with other nominals within the clause; the prepositional adposition, it is proposed, was therefore "recruited" from the lexicon for this role.

The fundamental question that must be raised is this: Why have prepositions deriving from adverbs of space been recruited to mark case-like notions in English? Is it an arbitrary selection, or do the meanings of these prepositions have a direct bearing on their selection and thus the conceptual organization of the language? I will argue that "spatial expressions are linguistically more basic ... in that they serve as structural templates, as it were, for other expressions" (Lyons, 1977:718) and therefore that spatial cognition is the precursor to linguistically relative or relational concepts such as case, tense, deictics and anaphora. Relativizing grammatical constructions are spatial in three senses. First, the setting in which events take place is a multidimensional space; second, the participants in an event are construed as dimensional objects that traverse paths in accordance with the force dynamics indicated by the verb; and third, the perspective of the observer involves the spatialization of the event in temporal terms. Concomitantly, spatial expressions do not lose their semantic content but continue to exert their primary spatial meanings while organizing discourse (Sweetser, 1990).

Much of the recent work in the area of "cognitive" linguistics that directly addresses the question of the relationship between grammar and cognition owes a debt to the research of Eleanor Rosch (1973-88) concerning human cognition and categorization. Her investigations into the structuring of categories by means of prototypes, and derivations or extensions from those prototypes, have formed the basis for a great deal of speculation about the processes that animate the underlying fundamental organization of grammatical elements.
An interest in the processes that segment the continuum of lived sensory experience and aggregate like elements into cognitively manipulable elements that can be transferred to a linguistic domain lies at the heart of much current research. This research includes that of Langacker, Givon, and Talmy in cognitive grammar, Heine and Claudi, Traugott inter alia in the study of grammaticalization, and Lakoff and Johnson, Haiman, Givon, Traugott, Hopper and Thompson, inter alia in the study of diagrammatic iconicity in grammar. The diagrammatic iconicity hypothesis is of particular interest because it asserts "that linguistic forms are frequently the way they are because, like diagrams, they resemble the conceptual structures they are used to convey; or, that linguistic structures resemble each other because the different conceptual domains they represent are thought of in the same way" (Haiman, 1985:1).

Another idea amenable to Rosch's theory is called connectionism, or parallel distributed processing. This area of cognitive psychology and artificial intelligence research seeks a micro-structural or neural-functional explanation for the kinds of phenomena that Rosch and the cognitive linguists have observed at the macro-level. Rumelhart believes, for example, that a "parallel distributed processing" system (PDP) is better suited to simulate brain function than the more common symbol processing computer models used so far in cognitive psychology. The PDP system acts within a relational network where multiple layers of interconnected neurons are either "on" or "off". The distribution of "on" and "off" neurons (the distributed pattern) is determined either by an experience in the world or by excitation from neighbouring processing areas in the brain. Unique experiences become represented in the neural network by a pattern of activation, and the retrieval of an experience "is assumed to occur when this previously active pattern is reinstated ..." (1992:72). The more often the unique pattern is reinstated by a similar experience, the greater will become the strength of the memory and the ability to recognize the pattern. The connection with prototype theory is fairly obvious; prototypes develop from the kinds of repeated experience that are represented as activation patterns in the parallel distributed network. Pattern matching, reasoning by mental simulation or "model building", and the "ability to manipulate the environment, so that it comes to represent something" (1992:72) are all fundamental human abilities that can be reconciled to the PDP model; these primary cognitive functions are uncannily similar to the instantiation of prototypes and their derivations in language.

From the preceding propositions it follows that primary cognitive functions with respect to language remain constant and that only the means of expressing the organization of content changes diachronically. Therefore, cognitive routines have a primary role in the grammaticalization process and so direct historical developments in predictable ways (Heine & Claudi 1986; Svorou, 1988; Sweetser, 1990). Even from this brief introduction we can see that the description of the mental representation of grammar can be approached from different perspectives.
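The pattern-reinstatement idea can be made concrete with a toy model. The following is a minimal, illustrative sketch rather than the PDP architecture Rumelhart describes: the unit count, the Hopfield-style Hebbian update, and the example patterns are all invented for demonstration. It shows only that repeated presentation of an activation pattern entrenches it in the connection weights, so that a degraded cue later settles back toward the stored pattern.

```python
import numpy as np

class PatternMemory:
    """Toy distributed memory: repeated patterns become entrenched as connection weights."""

    def __init__(self, n_units: int):
        self.n = n_units
        self.w = np.zeros((n_units, n_units))  # symmetric connection strengths

    def experience(self, pattern):
        """Hebbian update: units that are active together strengthen their mutual links."""
        p = np.where(np.asarray(pattern) > 0, 1, -1)  # encode on/off as +1/-1
        self.w += np.outer(p, p)
        np.fill_diagonal(self.w, 0)  # no self-connections

    def reinstate(self, cue, steps: int = 5):
        """Settle from a partial or degraded cue toward the entrenched pattern."""
        s = np.where(np.asarray(cue) > 0, 1, -1)
        for _ in range(steps):
            s = np.where(self.w @ s >= 0, 1, -1)
        return (s > 0).astype(int)

mem = PatternMemory(8)
experience_pattern = [1, 1, 0, 0, 1, 0, 1, 0]   # hypothetical activation for one kind of experience
for _ in range(3):                              # repetition entrenches the pattern
    mem.experience(experience_pattern)
noisy_cue = [1, 1, 0, 0, 0, 0, 1, 0]            # degraded re-encounter
print(mem.reinstate(noisy_cue))                 # settles back toward the stored pattern
```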
What I would like to propose is that the construction of prototypical patterns and their reconstruction to facilitate the mental simulation of actions in the world, and the transference of patterns from one conceptual domain to another, is a cognitive fundamental discernible at all levels of investigation from the macro to the micro-structural level. Spatial perception in itself, it can be argued, is the outcome of pattern recognition-reconstruction and manipulation. Grammatical organization, I will argue, follows the principles of spatial cognition as a special case of pattern recognition and so can be treated as a variety of general cognitive function (Lakoff, 1988). The consequences of spatial cognition being an intrinsic force within grammatical systems are discernible from the limited number of relational patterns or paradigmatic associations that nominals may hold in a relational space, in the tense-aspect systems that place the speaker in a spatially relational attitude with respect to events spoken about, in deictic specifications and anaphora. Certainly the variety of studies that have been undertaken reflects the multi-stratal nature of the phenomenon in question, and I would like, therefore, to look more closely at each level of analysis beginning from the "top" or end product of the cognitive chain, that being the linguistic evidence, and to work backwards to the micro-structural accounts based upon cognitive psychology. However, there is no point at which explanations from the cognitive perspective are not applicable to the linguistic perspective.

In the next sections I would like to gather together some of the more important theoretical insights that have contributed to the emerging theories clustered around the belief that general cognitive abilities contribute to the patterns found in language. This emerging theoretical perspective is in substantial disagreement with the generative paradigm, which separates linguistic behaviour from other skills, and, with this in mind, I would first like to explore some of the proposals made by linguists who quite early were dissatisfied with the generativist use of superficial rules to attach surface case markers to lexical strings. Later in the development of the study of the relationship between semantics and grammar, however, cognitive linguists questioned generative semanticists' attempts to posit a semantic base while continuing to adhere to the generativist insistence upon rule-governed grammatical patterns. The generativist desire to reduce language to a system of rules comes as a direct result of a particular way of thinking about language; that is, there is the belief that at some point a language may be characterized by a set of rules that have been formed during language acquisition that describe all and only the sentences of the language, and that describe a "language module" separate from general cognition. Such formalism denies a role to judgments of prototypicality in grammatical patterns. Grammatical patterns based upon conventionalized prototypical instances can accommodate gradations of grammaticality in a way unavailable to the rule-based formalism of transformational theory. We shall look at these matters more closely after this initial digression involving some precursors to our overall theoretical position.
In their way, these early protestations against linguistic modularity point to the later formulations of the Spatialization of Form hypothesis by Lakoff and the Prototype-Symbol-Form characterization of language offered by Langacker.

CHAPTER 2
Case Grammar

2.1 Gruber

One of the first instances of a dissatisfaction with the emerging hegemony of transformational grammar occurred in the mid-1960s. Jeffrey Gruber's 1965 MIT doctoral dissertation, Studies in Lexical Relations, updated and extended in 1976 as Lexical Structures in Syntax and Semantics, was near the beginning of a movement in theoretical linguistics that aimed to provide a corrective to Chomsky's relegation of semantics to a separate module of grammar to be dealt with only after the rules of syntax had been fully described. Gruber's ideas are an important first step towards an appreciation of the link between cognition, a theory of grammatical derivation from prototypes, and linguistic behaviour. Gruber is in many ways an early exponent of what later became known as "case grammar" and "generative semantics". He proposes that the basis for syntax lies in a universal set of "pre-lexical" structures that generate lexical items by means of various permutations of prelexical categorial "structures" that combine and become realized as lexical units. Gruber hopes that in the pre-lexical component we will find a "device" that structures meaning in a manner that works across languages. He says:

that which is generated in the prelexical base will have validity beyond the language which is being studied in English. This is especially so because of its presumed significance to be the formalization of meaning beyond syntactic form. A particular syntactic form of a given language would be regarded as a reflection and representation of the underlying structure of meaning, which is universal. (1976:4)

The preceding statement runs counter to Chomsky's assertion that syntactic well formedness rules exist which are independent of semantic content. Indeed, in the generative model of grammar, syntactic well formedness conditions appear to precede the "filling" of constituent categories with meaning; structure thus precedes meaning. In the framework outlined by Gruber, and subsequently followed by others, the process of relating pre-lexical "structures", which are of a very different order than the "syntactic structures" of generative grammar, precedes the ordering of constituents or the attachment of grammatical markers. The process that relates meaningful pre-lexical concepts in a well formed grammatical string thus controls the grammatical forms that appear as linguistic output: this is an assertion that the process of relating meaning is prior to structure rather than structure prior to meaning.

Actual evidence for a pre-lexical structure is somewhat more difficult to come by, since all linguistic argumentation derives perforce from either "found" or "fabricated" examples from individual languages. Gruber restricts himself to examples from English and relies almost exclusively upon the idea that a "representational paraphrase may be revealing as to the structure of the base" (1976:272). Consider the following sentences.

1) John strikes Bill as odd.
2) John is regarded as odd by Bill.

Each is seen as a syntactically distinct realization of an approximately similar pre-lexical structure. Non-paraphrase relationships he attributes to simple variations in the pre-lexical structure such as in the following.
3) John rolled the ball down the hill. (causative)
4) The ball rolled down the hill. (non-causative)

Here the same lexical verb "roll" has a different function depending upon the lexical-syntactic context in which it is found.

The main points that Gruber elaborates in his work concern (a) the nature of the verb in its relationship with the nouns of a sentence, (b) the "poly-categorial attachment" of pre-lexical concepts, and, particularly, verbal notions incorporating prepositions to lexical items, and (c) the relationship among nominals and the verb to explain the different syntactic roles that can be occupied by semantic roles such as agent, instrument, theme, etc.

Gruber's first argument in favour of a pre-lexical associative process to generate lexical categories concerns the nature of the verb. Like other case grammarians he presents an analysis of the verb, considering it to be central to the semantic structure of the clause. He proposes that lexical verbs have a prepositional content that becomes "incorporated" into the verb as a single lexical item. The underlying prepositional component is important because it orients the action of the verb in relation to surfaces and space. In such a conception of clause semantics, nominal objects move relative to surfaces and space, and space is realized as a volume confined by surfaces. Gruber gives the example of the verb pierce, which he says cannot be understood without also having at one's disposal, first, the concept of a surface, and second, the notion of movement represented by the preposition "through". It is only by means of the incorporation of the prepositional-motional concept "through" and the object concept of "surface" that the lexical entry "pierce" can be said to have any meaning. The pre-lexical categories of motion ("through") and spatial orientation ("surface") control the meaning of the verb and, as a result, control the syntactic-lexical contexts in which the particular verb may appear. For example, we cannot say the following.

5) *He crossed around the world.

"Crossed" contains the pre-lexical "schema" (from Rosch's vocabulary) of "on", realized as "a-" and indicating a position relative to a surface, and "cross", which decomposes as from one point to another point. Across thus contains the pre-lexical categorially restricted sense of going from one point to another on a (schematically at least) flat surface. Or, consider these sentences.

6) He went across the room.
7) *?He crossed through the room.

In "He crossed through the room", crossed describes the process of going over a surface, and through denotes passing within the volume of the room (space). Once again, through indicates a relation to surfaces incommensurate with the prepositional relationships incorporated into crossed. This process of lexically incorporating pre-lexical "structures" or schemas explains the meaning of the verb in a way unavailable to the transformational grammarians. In the transformational explanation of the relationship between pierce and through an ad hoc rule relating two apparently arbitrary concepts must be adduced.
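One way to picture how an incorporated schema constrains a verb's syntactic-lexical contexts, as in examples 5-7, is to let each lexical entry record the kind of region its incorporated path relates to and to check any overt preposition against that record. The sketch below is a rough illustration under invented labels ("surface" versus "volume"); it is not Gruber's own notation, and the small lexicon is made up for the example.

```python
# Hypothetical rendering of the incorporation idea: a verb's lexical entry
# carries the prepositional schema it has absorbed, including the kind of
# region its path relates to; an overt path preposition clashes when it
# diagrams a different kind of region.
VERBS = {
    "cross": {"incorporated_path": "across", "region": "surface"},
    "go":    {"incorporated_path": None,     "region": None},  # nothing incorporated
}
PREPOSITIONS = {
    "across":  "surface",   # movement from one point to another over a surface
    "through": "volume",    # movement within a space bounded by surfaces
}

def consistent(verb: str, prep: str) -> bool:
    """False when the overt preposition's region clashes with the region
    already fixed by the verb's incorporated schema."""
    region = VERBS[verb]["region"]
    return region is None or PREPOSITIONS[prep] == region

print(consistent("go", "across"))      # True  ->  He went across the room.
print(consistent("cross", "through"))  # False -> *?He crossed through the room.
```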
For Gruber, "the meaning of a word is to be characterized by the pre-lexical categorial structures which it may be attached to" (1976:20) and in this respect resembles other theories that reject transformations to "capture perceived relationships among different sentence types, either by mapping one "surface" structure directly onto another structure or by deriving related sentences from identical or similar abstract deep structures" (Starosta, 1988:59). From the lexicase perspective, for instance, "all grammatical rules are generalizations about the lexicon" (1988:59). There is thus no need for transformational rules to apply to syntactic nodes in a hierarchical "tree" described in terms of constituent relations. The notion of the polycategorial incorporation of pre-lexical structures mapping onto a lexical item and thereby constraining its possible syntactic environments is also consistent with the non-transformational nature of Mel'cuk's (1988) dependency grammar. In both instances adjacent words are dependent and dependencies cannot cross categories, as exemplified in the following.

8) He climbed up the tree. (prepositional phrase)
9) He climbed the tree up. (adverb)
10) He climbed the tree. (climb + up: two prelexical items incorporated in the lexical item climb)

In (9) the object the tree stands between the verb and the prep-adv up, which cannot be perceived as being incorporated into the verb because of the intervening category. Restriction on the co-occurrence of certain Prep + NP's with specific verbs is handled easily if we see them as lexicalizations of pre-lexical structures. Consider the following:

11) *Mary received a book to Bill.
12) *Mary bought a book to Bill.

If we think of receive and buy as lexicalizations of get from, then we can see that a preposition such as to is blocked by its diagramming a schema inconsistent with the preposition "from" incorporated into the verb. The preposition, as Gruber conceives it, can be used for diagramming concrete or abstract situations. The "motion" described by verbs may be "concrete" or "abstract" without there being any effect upon prepositional incorporation. The point that I will make later is that, from a cognitive standpoint, there is no material difference between concrete and abstract processing, and further, that the processing of concrete information from sensory stimuli is anterior to abstract thought and determines the nature of abstract thought.

It is a very special kind of semantics, then, that underlies syntax. The pre-lexical categories alluded to by Gruber derive from basic conceptualizations such as "objects", "movement", and "space". Semantics, conceived here as the relationships between pre-lexical units, underlies the meaning of words and the co-occurrence restrictions in syntax. We may summarize as follows:

i) Polycategorial attachment of pre-lexical components to individual lexical items obviates the transformationalist demand of one morpheme, one semantic interpretation. Unlike the transformationalist system, where lexical symbols have a real world counterpart, words in Gruber's system are not meaningful as individual lexicon entries but, rather, as representations of underlying categorizations that can take many "superficial" lexical forms.

ii) The relationships obtaining among pre-lexical categories affect the combinatory possibilities of lexical items. The allowable structural pattern of the "surface" lexical string is predictable by uncovering the co-occurrence patterns in the pre-lexical structure.
iii) The verb phrase can be decomposed into a motional and a prepositional concept engendering a specific relationship to surfaces and volumes; e.g., up, down, over, through, from (1976:249).

iv) Apparent transformations are the result of the lexical attachment of pre-lexical categories in such a way that the underlying prepositional nature of the verb and dependent noun phrase is either suppressed or overt depending upon the voice of the sentence, as in the following examples.

13) Hans bought a book from Moira.
14) A book was bought by Hans from Moira.
15) Moira sold a book to Hans.

The main focus of each clause is the obligatory undergoer of the verbal motion, which Gruber calls the theme. In 13-15 the theme is book. If the theme is not topicalized to subject, then the subject will be a lexicalized underlying category that would normally be the object of the preposition incorporated into the verb, as in the following examples.

16) Ian (su: source) sent a book (obj: theme) to Moira (i.o.: goal).
17) Moira (su: goal) received a book (obj: theme) from Ian (i.o.: source).
18) The book (su: theme) went from Ian (i.o.: source) to Moira (i.o.: goal).

In 16-18 the localistic notions source and goal are introduced, which indicate that the suppressed prepositional phrases that occur when the theme is not the subject have their origins in a prelexical structure conceived in terms of a physical schema.

2.2 Fillmore

Charles Fillmore concurs with Gruber that semantics underlies syntax and that the verb holds the central position in the "information structure" of the clause, but Fillmore along with Anderson, Jackendoff and Chafe is also concerned with providing transformational theory with a semantic foundation in an effort to explain in transformational terms the difference between the constant relationships that hold among "deep" semantic cases (the agent, theme, location) and the changeable "surface" syntax. In his now classic paper "The Case for Case" Fillmore explains how the relations that hold among the constituents of a clause may be founded upon cross-linguistically constant relationships among constituents in the deep structure of every language. The problem, as Fillmore explains it, is that surface case markers are neither consistently expressed within single languages with respect to semantic meaning (nominative may express agent in an active sentence but patient in the passive equivalent of the same sentence), nor do all languages have case systems for signaling relations among constituents.
Comparing languages that do have case systems, one language may have only a few distinctive case forms while another may have ten or more distinctive forms possible in the nominals. To add to the confusion in the study of what Fillmore and other generative grammarians call surface case, the number of distinctive case forms in a language can change over the course of history. These diachronic variations in surface forms provide evidence for the argument that surface forms do not adequately reflect the putatively cross-linguistically consistent underlying nominal-verbal relationships among clause constituents. Fillmore notes that changes in morpheme attachment to mark case-like notions in the history of a language "may well have been entirely in the economies of bringing to the surface underlying structural features which themselves underwent no change whatever" (1968:14).

Surface cases, then, are not constant either cross-linguistically or diachronically; but neither does Fillmore agree with the transformationalist view of case that it is merely "an inflectional realization of particular syntactic relationships" (1968:14). He proposes that transformational grammar be altered to include the "conceptual framework" interpretation of case systems, similar to Gruber, in which underlying semantic relations furnish the motivation for "surface" forms but which are not perfectly mirrored by surface forms. If we adopt the case grammarian's point of view for the time being and accept that "surface" forms do not provide an uncomplicated transmission of information about the relations that hold among nominals and their controlling verbs, we must then ask what these cross-linguistically valid semantic roles and relations are and how they become manifest in the syntax that we use in the production of speech. Fillmore proposes that the clause consists of its tense-aspect condition and the "proposition" inscribed by the verb and its dependent nouns. "Case notions comprise a set of universal, presumably innate, concepts which identify certain types of judgments human beings are capable of making about events ... who did it, who it happened to, what got changed" (1968:24). The proposition decomposes into the cases to which nominals are assigned, and the verb.

Although we do not have to agree with Fillmore that concepts are themselves innate, there does appear to be significant agreement from both linguistics and cognitive psychology that the non-stop flow of sensory information is segmented, identified taxonomically, and slotted into pre-existing categories. Wallace Chafe, in Meaning and the Structure of Language, states that, for the purposes of an investigation into semantics and its effect upon syntax, "the total human conceptual universe is dichotomized initially into two major areas. One, the area of the verb, embraces states (conditions, qualities) and events; the other, the area of the noun, embraces "things" (both physical objects and reified abstractions)" (Chafe, 1970:96). From the two initial divisions all other taxonomies may be derived. What is notable, however, is that while the number of items in a language that refer to things is theoretically infinite, the number of classifications to which things can be attached is quite limited. The semantic cases agent and patient, Chafe claims, are most closely associated with the verb and are central to the meaning of a sentence. Some other semantic cases are as follows.

19) (Experiencer) Tom likes watermelon.
20) (Beneficiary) Tom has the tickets.
21) (Instrument) Tom cut the cake with a knife.
22) (Complement) Mary sang a song.
23) (Location) The cat is on the mat.

Instrument and location are "peripheral" to the main proposition, which consists of predicator (action or state verb) and argument (agent or patient noun). According to Chafe, seven cases account for all nominal concepts in English, and, while the exact number of cases is not agreed upon in the literature, all the linguists mentioned so far agree upon the basic notion that a pre-linguistic segmenting of sensory experience occurs which later determines the shape of syntax. In the 1968 version of case grammar Fillmore says that the nominals in a clause determine the type of verb that may be present, what he refers to as the "case frame". Such an idea presupposes a particular serial order to the conceptualization of events.
The nominal participants are identified prior to the verbal notion that sets the nominals in motion relative to each other. In "The Case For Case Reopened" (1977) Fillmore revises his thinking in line with Gruber, Chafe, Anderson, and Jackendoff, who agree that the verb dictates the case notions associated with the nominals. Fillmore states in his revised theory that the case frame "indicates the case notions conceptually present in a sentence" (1977:64). This verbal context in turn provides a relational structure for the nominals. The cases proposed to exist in the deep structure in the 1968 paper are agent, instrument, dative, factitive, locative and objective, although this list is tentative and Fillmore is unable to say whether there are more or fewer cases. Cases in the "deep" structure do not necessarily have morphological realizations; they may be deleted and they may occur in different surface forms according to the dictates of the verb. Fillmore suggests that "deep (semantic) structures" become realized in the "surface" structure in ways that maintain the universality of the semantic relations of the base while accommodating cross-linguistic and diachronic variation. "Suppletion, affixation, addition of prepositions, registration of particular elements in the verb, subjectivization (topicalization), objectivization, sequential ordering, and nominalization" all account for the variation in overt forms (1968:48-49).

We need not go much further into Fillmore's case grammar except to note that the main thrust of his argument, like that of Gruber and Chafe, makes syntax a result of, and a reflection of, the processes that occur prior to specific lexical and grammatical attachments taking place. He ends by doubting the need for transformational grammar's "deep syntax", which he says is an "artificial intermediate level between the empirically discoverable "semantic deep structure" and the observationally accessible surface structure" (1968:98) and is as much an artifact of the transformationalist concentration on syntax and exclusion of semantics as it is of any postulate necessary to a theory of universal grammar. However, we should note that Fillmore adumbrates the next phase in the search for a universal basis of grammar as set out by John Anderson in The Grammar of Case by his revival of an observation apparently first made by Bernhardi in 1805 (1968:15) and echoed by Gruber that there is a connection between prepositions and cases:

Prepositions in English - or the absence of prepositions before a noun phrase, which may be treated as corresponding to a zero or unmarked case affix - are selected on the basis of several types of structural features, and in ways that are exactly analogous to those which determine particular case forms in a language like Latin. (1968:15)

As has already been observed in the work of Gruber, prepositions orient nominals with respect to surfaces and volumes, and have a role in the semantic attributes of the verb. Chafe carries on the idea that prepositions play a role in the semantics of the verb and adds that both prepositions and adjectives can be thought of as verb roots in instances where the verb be is used only to carry tense and other inflectional information that cannot be carried by either prepositions or adjectives. "The knife is in the box" demonstrates that the preposition carries the transitive information concerning the location of the knife.
Chafe goes on to say that non-stative verbs may be locative only if they co-exist with a "verb root" preposition. In such a way, the non-stative "throw" in "Tom threw the knife into the box" (1970:162) becomes a locative verb. The point here is, first, that prepositions have verb-like attributes in their relations with nominals, and second, that prepositions are always locative in nature both in their derivation and in their activities with respect to nominals. When Fillmore demonstrates that prepositional attachment to nominals changes depending upon which nominal is "promoted" to subject, he also demonstrates the locative nature of the dative (to John). Here is Fillmore's example of a sentence containing modality (tense) and the cases locative and agentive.

[Figure 1: Tree Diagram. Adapted from Fillmore (1968:68); not reproduced here.]

Each nominal is governed by a case marking preposition which may be deleted depending upon the focus of the sentence produced from the basic conceptual pattern containing verb, agent, theme, and patient, as in the following.

24) Mary pinched John on the nose.
25) John was pinched on the nose by Mary.
26) John's nose was pinched by Mary.

Fillmore puts the dative case under the locative heading even though the movement described is abstract in nature. In the surface manifestations of the underlying structure, "to John" does not show up at all unless we accept a paraphrase sentence such as "Mary gave a pinch on the nose to John", in which "pinch" is nominalized.

2.3 Anderson

John Anderson in The Grammar of Case presents the localist argument in favour of there being an underlying semantic basis to grammar based upon the semantic primitives of movement and location. He seeks to identify the attributes of the underlying semantic composition of language that causes there to be "a complex relationship between the underlying semantic (case) relations and their superficial markers (case inflections, prepositions and postpositions) . . . " (1971:8) in which common principles underlie spatial, abstract, and syntactic cases. Following Gruber's and Fillmore's lead, Anderson proposes that the sentence decomposes into a verb, and one or more nominals depending on the "valency" of the verb. In other words, the verb requires specific nominal actors depending upon the relational attributes of the verb. The verb is conceived of as the force that relates nominals to each other with respect to location and direction. As with Gruber's system, there is no difference between concrete and abstract location or between concrete and abstract direction. Anderson states that:

Even in such (for the most part) non-localist discussions as Kurylowicz's (1964:ch.8) concerning the Indo-European case-system, the intricate superficial and historical relationships between the representation of "concrete" and "abstract" uses are well illustrated - and demand an explanation. A localist conception of case inflexions (and prepositions) and case functions provides in principle an explanation for such, as well as . . . for various other synchronic and diachronic semantic and syntactic phenomena. (1971:10)

"Surface" syntax, he suggests, may be a "neutralization" of the basic underlying cases in that "surface" cases may represent combinations of pre-lexical conceptual structures.
It is important in this context to understand that structures are not co-extensive with rules, nor are they a ready-made scaffolding to which conceptual bricks and mortar may be added. Pre-lexical conceptual structures come to be only by means of repeated successful attempts to delimit experience. To delimit experience means to identify similar experiences and to group like with like, to categorize and to sub-categorize according to a discrimination process that causes a taxonomic hierarchy to become available over time. The grammar that we use to express meaning by way of the linear ordering of lexical items is not a syntactic structure but a reflection and an instantiation of the conceptual organization that has taken place in order to produce linguistic behaviour. Anderson sets out:

to show that sentences involving various non-spatial relations can plausibly be considered to involve (semantically and syntactically) locative or directional structures, and that they differ from "concrete" locatives not with regard to the basic case relations involved but in the character of the nouns and (particularly) the verbs that contract the relations. (1971:11)

The ultimate goal of Anderson's work within the localist framework is to provide evidence that "not only are there common principles underlying spatial and non-spatial cases, but also (as is implied by the preceding remarks) the spatial variant has ontological (and perhaps chronological - both short and long term) priority" (1971:12-13). Anderson attempts to substantiate his localist hypothesis within a dependency grammar where the verb governs the nominals of a clause and where the "nominative" (usually called the object or accusative) is a required component in every case frame.

As in Gruber, a single lexical item may be a "lexicalization" of a more complex set of concepts. Anderson gives the example of "walk", in which an underlying adverbial specifies the particular mode of travel incorporated into the verb. The paraphrase "travel on foot" contains the general category of movement "travel" and the specific mode "on foot" (1971:16). "Travel on foot" and "walk" thus would have a common source, which supports the assertion that the semantic component precedes the selection of the lexical items appropriate to the syntactic environment. For instance, if "traveled" is used to indicate walking, then "traveled" has to be accompanied by "on foot". If "walked" is used then "on foot" is absorbed into the meaning of the verb. Thus "He traveled on foot" and "He walked" have comparable semantic precursors or, in Rosch's nomenclature, they are derivable from the same central prototype.

In typical case grammar fashion Anderson states in his outline of grammar that any noun phrase (NP) must be accompanied by a case category. Such a requirement by its nature imposes a categorization process upon each NP. The verb governs NP's directly without any other intervening non-terminal constituents. Instead of constituency, as in transformational grammar, categories are governed by the meaning of the verbs. A clause must be semantically well formed in order to be grammatical, and semantic well formedness depends entirely upon the correct alignment of prelexical categories. The relational notions attached to case are carried via the terminal category (preposition) (1971:29) even though the verb determines what kind of case may be present in the clause.
NP's thus are not constituents situated below case in a hierarchy; rather, they are the "thing like" entities that are given spatial and locative direction via the "function" effect of the case as it is represented by the preposition. Anderson summarizes that:

Pre-terminal categories have been eliminated and in place of the constituency relationship, the categories are "hierarchized" with respect to dependency. Loc. and nom. are dependents of V (which thus governs them); and they each have dependent on them (i.e.: they govern) a N. Thus the case elements can be interpreted quite naturally as expressing the relation contracted between their dependent NP and the governing V. . . . (1971:30)

In the final chapters of The Grammar of Case Anderson presents the "extreme" localist hypothesis that all semantic case notions are extensions of the locative. He distinguishes between what he calls the "information structure" of the clause, and the "cognitive structure" that creates the conditions anterior to the attachment of lexical and grammatical elements that symbolize the cognitive structure. Hitherto, the semantic cases had been four: nominative, ergative, locative and ablative. Nominative we can equate with the "theme" in Gruber or "object" in Fillmore. The nominative is the receiver (goal) of the action of the verb and can be superficially either subject or object; ergative (similar to agent) is the initiator or source of the action and can be subject or ablative, as in the following.

27) He read the book. [He: su-erg; the book: obj-nom]
28) The book was read by him. [The book: su-nom; by him: abl-erg]

Locative and ablative indicate either concrete or abstract location or movement, as in the following.

29) The apples are in the box. (loc, concrete)
30) The truth is known to many people. (loc, abstract)
31) He came from church. (abl, concrete)
32) He bought a book from Mary. (abl, abstract)

These four semantic cases are employed in various ways depending upon the requirements of the verb to relate the nominal elements to each other in a manner that conveys the information content of the clause. Anderson proposes that the more "syntactic" cases, that is, the two cases that are most closely associated with the meaning of the verb, the nominative and the ergative, are, respectively, extensions from the locative and the ablative, which specify spatial relation and location rather than direct participation in the verbal state or action. The evidence that nominative is derived from location and ergative from direction involves the idea that there are notional parallels between these cases. Nominative is a goal while ergative is a source, as in the following example.

33) Mary helped anyone who asked. [Mary: erg-su; anyone who asked: nom-obj]

If we nominalize the verb and thus separate out the prepositional content of the verb we can re-write the sentence as follows.

34) Help was given to anyone who asked by Mary. [to anyone who asked: nom-loc; by Mary: erg-abl]

Mary is the source of help and anyone is the recipient and therefore the location of that help. Compare also the Old English version of he helped him:

35) He him geholpen ('he to him helped')

The case marker clearly shows a locative-dative notion. In addition, ablative, and by extension ergative, can be conceived of as locative by comparing the following sentence.

36) He came by way of Seattle.

The prepositional phrase by way of Seattle shades from a purely locative to an instrumental meaning. Or consider the following.

37) The book was bought by Mary.

Here Mary is a facilitator of the transaction: instrument and agent, and a recipient location.
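Anderson's derivation of the grammatical cases from the spatial ones, as illustrated in (27)-(37), can be summarized as a small lookup table in which each "syntactic" case names the concrete case it extends. The mapping below is simply a paraphrase of the discussion as a data structure, with informal glosses; it is not Anderson's own formalism.

```python
# Paraphrase of the localist claim as a lookup table: the two "syntactic"
# cases are abstract extensions of the two spatial ones.
DERIVES_FROM = {
    "nominative": ("locative", "goal / location of the action"),
    "ergative":   ("ablative", "source of the action"),
    "locative":   (None, "concrete place"),
    "ablative":   (None, "concrete source or direction from"),
}

def spatial_model(case: str) -> str:
    """Name the concrete spatial case that a grammatical case extends, if any."""
    parent, gloss = DERIVES_FROM[case]
    return f"{case}: {gloss}" + (f" (extension of {parent})" if parent else "")

for c in ("nominative", "ergative", "locative", "ablative"):
    print(spatial_model(c))
```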
Anderson concludes that, ultimately, the manipulation of locative either as +locative or -locative can account for the attributes of nominative, ergative and ablative.

CHAPTER 3
Cognitive Grammar and Categorization

3.1 Case Grammar and Cognitive Grammar

I have belaboured the points raised by "case" grammar first because of the impetus it has given to the notion that semantics informs syntax, and that semantics and syntax form a continuum with no clear demarcation. And second, because the localist theory, in particular, presents one way of explaining the predominance of prepositional case marking in English in which abstract nominals are marked by "concrete" spatially motivated prepositions. Third, the localist view of case marking, wherein all cases have their origin in spatial notions of direction and location, conjoins particularly well with the grammaticalization hypothesis presented by Heine and Claudi, among others, stating that grammatical markers in general derive from physical object schemas which are progressively abstracted by means of "conversational implicatures" to express the abstract concepts pervasive in language. I will argue further that abstract concepts are processed in the same manner as the concrete physical object schemas from which they are derived. The provisional conclusion to be drawn from this position is that "abstract" relations such as those we discern in case relations, while they may be extensions derived metaphorically (Lakoff) or metonymically (Brinton) from concrete prototype/schema structures, are, first of all, analogues of concrete physical experience of the world. The abstract does not exist independently of its concrete antecedents. Although non-physical experiences can be named and so talked about, they can only be brought into linguistic existence by means of the patterns established by concrete spatial and kinetic experiences.

In the following section I will present the major themes in cognitive grammar that set it apart from the generative paradigm and which, I believe, go a long way to justifying Anderson's and Lyons's claim that concrete spatial relations are anterior to and models for "abstract" grammatical relations. Here is a summary of the ideas to be developed in the following sections.

i) The experience of being an actor in a physical environment that can be differentiated from one's self precedes abstract thought.

ii) Interaction with the physical environment is the first instance in a chain of inferences wherein concrete experience is used as a model for the manipulation of abstract concepts.

iii) Categories created from concrete experience, especially tactile and visual experience, form the basis for the image-schematic precursors to conceptual thought.

iv) Repeated physical experiences become entrenched as patterns of neural activation.

v) Entrenched patterns of neural activation come to represent significant patterns in the world. The instantiation of a pattern depends both on the nature of the world and on the nature of the perceiver.

vi) In conceptual thought, image schemas created from concrete experience are represented by symbols.

vii) Grammatical elements symbolize, linguistically, the categories of experience that develop from, and are metaphoric extensions of, physical experience.

viii) The point at which physical experience becomes entrenched as a pattern of neural activation marks the beginning of abstraction, because physical experience and neural activation are different in kind.
But physical experiences, as they are registered in the mind, and abstract concepts are, neurologically speaking, made out of the same "stuff" and, therefore, abstract concepts can be treated as if they were physical objects while physical object schemas can be treated as if they were abstract concepts.

3.2 Prototypes and Schemas

The major claim of cognitive linguistics is that language is the outward evidence of categorizing going on at the pre-linguistic level of mental processing, a claim that takes us back to the theories of Gruber and pre-lexical categorial attachment. Langacker states that "grammar consists of patterns for combining simpler symbolic expressions to form progressively larger ones" (1988:148). The elements of language, however, are symbolic only to the extent that they represent the categorizing functions that occur prior to language production. The symbols of language are not processed as symbols but as instructions for the specification of a restricted set of schemas, or paradigmatic relationships, that may occur. Langacker says that:

The schema describing a pattern of composition is not itself responsible for actually constructing an expression. Instead it serves a categorizing function: it furnishes the minimal specifications an expression must observe to be categorized as a valid instantiation of the pattern it embodies. (1988:132)

Schemas are built up from the experience of repeated specific instances and are abstractions from such repetitive experience. "Schema", as used by cognitive linguists, and "prototype", as it has been described by Eleanor Rosch (1978), are often used interchangeably, and there is a good deal of confusion concerning a standard nomenclature for their use. Perhaps the easiest way to reconcile the two is to think of the schema as a "special instance of prototype construction" (Langacker, 1988:134). A schema can be thought of as representing stereotypical relations among basic level categories.

Rosch identifies different levels of abstraction in the categorization process. Basic level categories account for the most commonly encountered natural discontinuities that we use to create an internal representation of our environment. Perceptual information is not randomly assigned to a category but, rather, according to functional utility. Commonly encountered items such as chairs, tables, and lamps are identified, and prototypical examples established, prior to a super-ordinate category such as "furniture" being established. The common object, of "average" (for a person) size, that is interacted with constantly, and can be identified both visually and from kinesthetic interactions such as sitting on, leaning on, eating off, was most easily identified as a category type by the subjects in Rosch's experiments (1978:27-48). An entity may be considered a central or peripheral member of a category depending upon how closely it is judged to resemble a prototypical instance of that category; there is no theoretical outward limit to what might be included in a category. Taxonomies of greater inclusiveness, such as furniture, which includes tables, chairs, and lamps, etc., or animal, which includes dogs, cats, rabbits, mosquitoes, etc., did not generate prototypical images for Rosch's subjects as quickly or as easily as did the "basic level" categories. Images of a prototypical chair and a prototypical table are easier to imagine than an image of prototypical furniture.
Furniture is an extension from the basic category and is a level of abstraction away from the most commonly experienced versions of furniture. In the opposite direction, more specific divisions of a basic level category can be made. The basic level category of dog, for example, can be further subdivided into Pekineses, golden labs, pugs, etc.; since specific breeds are encountered before the prototypical dog is 32 constructed as a visual-kinesthetic image, the basic level category must be more abstract than the specific breed which is encountered in concrete experience. However, the discontinuities between pugs and Pekes and labs do not appear to be functionally significant and so are not perceived as significant differences when the prototypical dog is constructed. The perception of ever finer gradations in taxa is always possible of course, up to the point where every instance of a category is classified individually, for example, labs with reddish-brown fur, white fur, three legs, a thorn in one paw, in two paws, etc. But as Rosch points out: one purpose of categorization is to reduce the infinite differences among stimuli to behaviourally and cognitively usable proportions. It is to the organism's advantage not to differentiate one stimulus from others when that differentiation is irrelevant to the purposes at hand . . . What attributes will be perceived, given the ability to perceive them, is undoubtedly determined by many factors having to do with the functional needs of the knower interacting with the physical and social environment. (1978: 29) The prototypical instance of a category such as dog finds its defining characteristics in a viewer's interaction with dogs: four feet, wet nose, sociability, tail wagging, inability to follow a path without frequent detours; all these aspects of viewer interaction with dogs contribute to the construction of the image of a prototypical dog. Prototype systems contrast with the objectivist belief that categories can be described by criterial features where either an item is a member of the category or it is not. Membership in a prototypical category rests critically on the judgment of the perceiver. According to Rosch, category systems are a cognitively natural way to perceive the world. A categorizing perceptual system has the advantage of providing a maximum amount of information concerning 33 aspects of the environment that are necessary to know about, while requiring a minimum amount of discrimination and storage effort. Categorization provides predictability and uniformity in information about the world by structuring categories of objects and events according to criteria that are meaningful to the perceiver of objects and events. For categories to be maximally useful and meaningful, the category must be an adequate reflection of the way the world is actually perceived, though not the way the world may be outside of the perspective of the viewer. Categories must be useful to a viewer; it is not "objective reality" that is being mapped but the experience of reality. Categorization is an activity that encompasses both objects and their relations. Basic level objects consist of a set of members that are more or less consistent with a prototypical central image of that category. Relations among prototypical objects organize into patterns that become habitual through kinesthetic and visual experience. 
It is therefore possible to speak of both object level and relational level prototypes where prototypical objects engage in prototypical relations. Categorization of basic experiences thus enables the continuous flow of sensory information to be segmented according to its functional similarity to other experiences to be related in restricted ways to other experiences which have also been categorized. Such restrictions entail the segmenting of experience into stereotypical scripts or scenes (Mandler, 1984). In summary, the use of categories reduces the amount of perceptual information that needs to be attended to by separating perceptual information into functionally useful and functionally irrelevant information. The prototypical center of a category represents the most salient aspects of functionally relevant perceptual information while the inclusion of peripheral instances reduces the number of categories necessary to make an inventory of the perceptual and social environment. Category prototypes seem to form in order to maximize the 34 apparent "chunkiness" or discreteness of the environment and to affect functional decisions about the environment according to the manner in which perceptual information is segmented. 3.3 Child Language Acquisition, Prototypes, and Grammaticalization Rosch speculates that, "in the evolution of languages, one would expect names to evolve first for basic level objects, spreading both upward and downward as taxonomies increased in depth" (1978:35). This ontogenesis of language is reflected in the way children learn language: because perception, motor movements, functions, and iconic images would all lead to the same level of categorization . . . basic objects should be the first categorizations of concrete objects made by children. (1978:34) Studies cited by Miller and Johnson-Laird (1976) support the idea: that the notion of hierarchical class-inclusion may itself be derived from the psychologically more primitive concept of locative predicates . . . Correlatively, language development must wait upon the development of the requisite perceptual routines, which may require movement within and action on the material world. (Boden, 1982:132) In child language acquisition basic category names for objects and actions within the immediate physical environment do seem to be acquired first. Dore (1985) suggests that a child's first words are not "words" in the sense of being denotive symbols within a system of symbols, rather, they are indexical signs "applied to some aspect of immediate context" (1985:33-34). Similarly, Piaget states that first words are entirely context induced and "function as part of the activity" (1945:237). A child cannot, at first, report what someone else has said. Words are not conceived as objects that can be manipulated within a linguistic context; they are instead perceived first as integral parts of concrete and social contexts. In a study of how children develop the ability to concatenate words as symbols Sinclair states: 35 Our data suggest that at first, words are not dissociated from the elements of this complex: The child's communicative intention, his own inner state, the situation he talks about, the moment of speaking, and whatever response he hopes for from his listener are not differentiated. 
When such differentiation does occur: Such differentiations can be seen as similar to those between acting subject, object acted upon and action which takes place during the sensori-motor period and which, according to Piaget, culminated in a differentiation between the object and the subject's action, leading to the permanence of objects, and the differentiation between the subject and his action, leading to a view of one's self as an agent among many other possible agents. (1989:10)

Heine, Claudi, and Hunnemeyer call the basic level categories and the schematizations relating basic level categories the "source concepts" for grammaticalization. Child language acquisition and the adoption of an existing lexical item to serve a grammatical function have in common the utilization of these source concepts, and it is at the point where concrete experience becomes "semiotic currency" that we find child language acquisition and grammaticalization have their similarities. Both require symbolic resources that can reconstitute the conceptual structures used to generate a functional and stable mental model of the external concrete-social world. Beginning from the individual words learned by the child in specific concrete-social contexts, an organization of lexical items is built up that treats lexical items as objects within a textual, or discourse, domain. Grammatical structures, from this perspective, are characterized as the "frozen result of conceptual manipulation and conversational implicatures" (1991:221).

3.4 Grammar and Concrete Experience

While the preceding has been a somewhat abstract characterization of prototypes and schemas in which I have attempted to give some general idea of where they come from and what they contribute to the conceptual structuring of the world, and to intimate their importance for the structure of language, in the following section I would like to be more specific and show what kinds of concrete experiences with objects generate the restricted set of schemas that function as relational concepts within language. From there it is possible to show that the "cline of grammaticalization" from lexical preposition to case marking function is an instance of prototype-motivated polysemy (multiple related meanings) in which a target domain is structured in terms of its source domain, which involves the structuring of abstract entities and relations by means of more concrete image-schematic percepts. According to Lakoff and Johnson (1980:162), categories have "natural dimensions", that is, they become discrete entities as opposed to undifferentiated perceptual input because an observer or participant with a unique perspective within a situation acts upon the perceptual input. They state that the perceptual, motor-functional, functional, and purposive nature of human interaction with the environment forms an interactional "gestalt" that defines properties, not in themselves, but as they relate to the observer interacting with the environment. Drawing on the work of C.S. Peirce, Lakoff and Johnson say that to categorize something is to focus on some aspect, while hiding or diminishing other aspects. We focus on whatever property is most salient at the time. For this reason, the perspective of the participant or observer plays a critical role in the characterization of scenes and the categorization of people, objects, and events. A cognitive routine does not emerge ex nihilo.
It represents the experiences that an observer has undergone previously, and it represents the categorizing judgments, or automated categorizing operations, that the observer has applied to those experiences to make them conform to a "theory" of the world in which discrete entities engage in repetitive interactions. Johnson (1987:21) illustrates how basic experiences with our own bodies can be abstracted to form the basic cognitive structures used to supply situations with meaning. For example, our bodies are, arguably, the first things that we experience as containers with discrete physical boundaries existing in three-dimensional space. We put things into our bodies, our bodies hold things, and our bodies are themselves contained within other containers such as rooms, beds, baths, etc. We also become aware very early in our existence that we can manipulate objects and place them in containers. Johnson states that: In each of these cases there are repeatable spatial and temporal organizations. In other words, there are typical schemata for physical containment. If we look for common structure in our many experiences of being in something, or for locating something within another thing, we find recurring organization of structures: the experiential basis for in-out orientation is that of spatial boundedness. (1987:21) Experience of one- and two-dimensional enclosed space is also used to compose meaning structures or schemata; being in a circle or in a line are examples of two- and one-dimensional containment respectively. However, as Johnson points out, they are restricted to ideas of separation and differentiation more than containment. Lakoff and Johnson (1980) give considerable importance to the consequences that flow from the experience of what, in effect, are basic level concrete experiences. Johnson elaborates, saying that "It is a matter of great significance, as I argue later, that patterns such as these, which exist pre-conceptually in our experience, can give rise to rational entailments (which we describe propositionally)" (1987:22). The container schema is just one of several abstract schematic structures that are the result of basic experiences. These schemata include (1) the basic object-container schema, entailing object as a category and container as an extension of the object category, (2) the directed path, (3) link, (4) part-whole, (5) source-goal, and (6) force. Each of these schemata derives from experience with the manipulation of objects. Such experience occurs, by necessity, in a spatial domain where both kinesthetic and visual perceptual input are used to produce an image-schematic representation of the participants and the field of actions. Both visual and kinesthetic input can form abstract schemata on their own, and Johnson cites a study in which blind subjects performed operations using abstract schemata in a manner similar to sighted subjects. The visual and the kinesthetic modes of perception seem to be processed in a similar fashion, and the information from one area appears to be readily available to the other area, so that subjects in experiments by Shepard and Metzler (1987:25) were able to perform operations of "mental rotation" on two-dimensional visual images as if they were three-dimensional objects. Johnson regards "these phenomena as evidence for the thesis that mental operations on image schemata are abstract analogues of physical processes or operations ... It is as though we have a "mental space" in which we perform image-schematic operations ..."
(1987:25).

3.5 Object as a Concept that Informs Grammar

In each of the schemata proposed to encompass a major portion of our conceptual world the object is a fundamental component. We begin our lives not able to differentiate ourselves from our environment; then, we discover that we are objects different from other objects in our perceptual field. In the first instance, the container schema, an object is a necessary part of the complex concept of containment. Containment requires an object to be contained, a container with sufficiently resistant boundaries to contain the object, and a force of some kind that causes the object to be kept in or put into the container. In the directed-path schema, an object is propelled along a line by a force towards a goal. If the object is being propelled by the person who is also perceiving the situation, then he connects the application of force with the intention to apply force. The intention to apply a force is linked to causation. The object is caused to move along a path from a beginning, or source, to an end, or goal, by an agent who possesses intentionality. The prototypical agent, then, is the source of force and of intention to apply force in a particular direction or vector. The agent, as the source of the force to cause objects to move along paths, is at the centre of a centre-periphery schema that develops from the ego-centric nature of perception. The centre-periphery is, therefore, the basic orientational schema and is, as Heine, Claudi and Hunnemeyer explain, the source for orientational spatial concepts expressed lexically. The first instances of lexically expressed spatial notions, according to Heine et al., are either landmarks such as sky, mountain, etc. or, more commonly, ego-deictic, self-specifying nominals. With examples drawn from the African language Ewe they argue that language structure follows from cognitive prototypes that are extended in an order relative to the "distance" from a concrete prototype: ego-deictic person > object > process > space > time > quality. For example, back is originally a nominal specifying a body part. From being an ego-deictic specification of a location on the body it is abstracted to become a location on any object. From being a location on any object it derives a motional connotation where an object moves toward another object against the direction of normal prototypical forward motion. In a sentence such as "He backed against the wall" there are nominal and verbal aspects in the underlying conceptual complex. For back to be a lexical element denoting both a motion and a concrete location in a spatial domain, the concrete location must first be conceived as an entity that can be scanned within a field of view and separated from other body parts; and it must be conceived relative to some other body part used as a landmark. The separation of one body part from another initiates the process by which space is hypostatized; that is, a real identity is attributed to a concept that derives its existence from the pragmatic segmenting of non-discrete perceptual phenomena. Space is created by the identification of objects in "perceptual" space. Heine et al. observe that, from their research in African languages, prepositions denoting "static" relations, or reference points, such as on, in, at, [at] back, [at] front are almost always derived from body part nominals (Svorou, 1988).
Motional concepts such as to (goal, benefactive, dative), toward, along (path), from (source, cause) (Heine, et al, 1991:140) more often derive from verbs; but it is doubtful that the motional concepts exist without the concept of concrete objects in motion being conceived first. To return to the main point here, the relationships that prototypical objects may engage in are schematically structured from basic experiences with objects both visually and kinesthetically. Basic relational schemas clump many different experiences of objects in motion together in the same way that basic object categories clump members of a class together depending on their perceptible similarity to a prototypical central member. The categorizing function is also a generalizing strategy by which the unknown can be subsumed within the known. In the development of linguistic ability, the known is the concrete, named world and the relationships that can occur between objects in 41 the known world. The unknown includes much of the socially experienced world and the relationships that hold among namable but non-concrete entities. The schemas of physical experience become meaningful by means of the projection of the intentionality of the acting subject into the objective patterns seen in the world. For example, as already noted briefly, if an object traverses a path, an observer projects himself, as the possible agent or initiator of that movement, into the physical schema and so imbues it with the concept of causation; someone causes things to happen. Forces are applied by conscious agents in an attempt to achieve a result. A result is the end point of the trajectory traveled by an object that has been caused to move by a conscious agent. A physical endpoint is by extension conceived in terms of the satisfaction of an intention. An intention is thus a physical force and a physical endpoint is thus an intentional, or abstract, goal. 3.6 Case Relations Follow Concrete Schemas The relationships that exist among the elements of a clause exist by fiat from the extensions from an ego-centric model made by the participant-observer of events in the visual or kinesthetic field of action. Johnson calls these extensions a metaphoric mapping of the structural relations of a source domain onto a target domain. He states that: The metaphors, or analogies, are not merely convenient economies for expressing our knowledge, rather, they are our knowledge and understanding of the particular phenomenon in question. (1987:112) While Johnson is interested primarily in the way that a concrete source domain can supply structural analogies for a less understood target domain, such as the "electricity is flowing water" metaphor, I believe that the same principle holds for the structuring of grammatical relations by means of the concrete schemas that 42 have been imbued with observer oriented meanings. These "grammatical domains" are as diverse as case, relative pronouns, deictics, anaphors, and temporal relations. For example, when a path is connected to a desire or intention at the start or "source" end, and the satisfaction of that desire at the other "goal" end, then any intervening events occurring "before" the "goal" is reached are construed as events in a temporal domain. Before and after are conceived first as physical points on a line, and, from the sequence encountered in physical experience, are given a temporal interpretation. 
Time expressions are always expressed according to some form of spatial metaphor such as, "before it happened", "after he arrived", "look into the future", "time is passing". In each spatial expression used to denote time, the observer (or speaker) assumes some kind of relativized position commensurate with his perspective on the situation. In "time is passing" the observer is standing in one place and time is either an object or a pervasive environmental presence going by the observer. In "look into the future" time is an object with discrete boundaries. In "let's face that when we come to it" time is a line with temporally demarcated objects on it that the observer encounters as he moves facing forward along the line. If time is conceived in terms of a unique observer perspective in space relative to other objects (space being a hypostatized entity extended from the experience of objects) then all temporal aspects of grammar are grounded in the spatial concepts that first license the use of temporal expressions. It is reasonable to conclude, as a result of both space and time being concepts of relativized observer perspective, that grammatical morphemes utilizing spatial and temporal concepts are symbolic representations of the basic conceptual model of the world that derives from primary, ego-centric experiences with objects. 43 I propose, therefore, that case relations are observer motivated spatial concepts imposed upon objects and events conceived as objects. The spatial motivation for grammatical markers provides not only an explanation for the widespread use of markers derived from spatially motivated nouns and verbs but also for the pervasive apparent polysemy among prepositions both synchronically and diachronically. Apparent polysemy turns out to be a case of prototype extension and completely motivated by the use of physical schemas for abstract grammatical constructions. In the versions of case grammar described in Chapter 2 there was a distinction made between "surface" case markers on the one hand, and deep semantic cases on the other. It was generally agreed that surface case markers were an unreliable guide to the "semantic" roles of agent, patient, beneficiary, experiencer, etc. because there is no one to one correspondence between semantic role and "surface" case features such as word order, prepositional marking or affixing. The cognitive grammar approach to case is quite different; in cognitive grammar there are no transformations and no "deep" semantic structure separable from the surface strings of lexical elements that are symbols for the pre-linguistic organization of concepts. The cognitive models experienced as prototypes and schemas that provide our conception of the world also provide the prototypical values of clausal elements. It is our conception of events that shapes clause structure and so the markers used to indicate clause structure. According to Langacker, language is a problem-solving activity. The problem for a speaker in a speaker-learner interaction is how to represent the conceptual gestalt that he has of a situation in the form of a symbolic concatenation. The phonological form of a symbol is an arbitrary or accidental sedimentation from the history of the language; but the conceptual relationships 44 that occur among forms are motivated by the pre-linguistic schemas that develop from experience with the material world. 
Individual schemas, which are sedimentations of complex experience, are, in turn, used to construct a unitary conceptual image of a situation which is referred to as a gestalt structure. Because linguistic convention cannot provide a fixed, unitary expression for every conceivable situation, it furnishes a limited inventory of fixed expressions, which are generally appropriate for coding only certain aspects of complex conceptualizations, together with a set of conventional patterns for combining the conventionalized expressions. A unified complex concept (1987:278) is dissociated into conventional units and integrated by conventional patterns of combination to yield a composite pattern in the mind of the listener that has the potential to reinstate the original complex unitary concept. Diagramatically this process would proceed as follows: unified conceptualization --> linguistic symbolization --> reintegrated conceptualization. Schemas restrict the number of construals a situation may have and, by restricting possible construals, allow for the schematization of typical or prototypical situations in terms of conversationally determined conventional imagery using conventionalized symbols. 3.7 The Figure/Ground Schema In order to have a mental model of the world, (Johnson-Laird, 1983) certain elements are picked out or highlighted as being the most salient for the person perceiving the situation at hand. The structure of the clause reflects the use of this schema for "picking out" salient features in the environment from the background. In each level of clause hierarchy the figure/ground schema is expressed in terms of trajector/landmark configurations 45 where the trajector is an especially salient feature at the head of the construction, and the following landmark is in some sense dependent upon the trajector. In a subject/predicate construction the subject stands out as a figure or focus of the construction. Langacker states: A figure is defined by contrast with its surroundings, and the likelihood of an entity being construed as figure is enhanced to the degree that the contrast is sharp and the entity discrete. (1987:236) In addition to the trajector/landmark profile, determinant/elaboration, and component/composite distinctions, all of which contribute to a fuller picture of the figure/ground schema, Langacker makes a distinction between conceptually dependent and conceptually autonomous symbolic structures. Nominals are those things, both concrete and abstract, that have an existence quite apart from the relations in which they might be involved. Langacker calls nominals autonomous concepts that must be conceived first in order for relations to exist. He argues that: any relational notion can be regarded as conceptually dependent, since it requires for its conceptualization some intrinsic reference - however schematic - to the entities that participate in the relation. . . . One cannot conceptualize the process of chasing without conceiving . . . the thing doing the chasing and the thing being chased. (1987: 299-300) The concept of object, then, seems to be anterior to the concept of object movement relative to other objects and so lends support to the view that only the experience of concrete objects is a necessary precondition to the notions of space, movement, and time. The subject is selected from an encyclopedic array of nominal concepts and, as discussed earlier, can be either a concrete or an abstract notion. 
In either case it is treated as if it were a concrete entity within the scene setting apparatus of the clause. The verb designates processes that occur 46 sequentially, and are "scanned" as if the process were a series of atemporal conditions occurring in time as in a series of locative configurations (1987:254 ; 1991:289). In the subject/predicate construction the subject, as the focused upon element, determines what predicates may occur with it. In this respect, Langacker disagrees with valence relation theory which states that only verbs determine what nominals may occur with them. In the verbal predicate structure, however, the verb is the figure and the rest of the predication is the ground. In such a structure the verb is the profile determinant (1987: 289) in that the remainder of the predicate elaborates the meaning of the verb, just as the predicate as a whole elaborates the meaning of the subject nominal. By such means, both nouns and verbs can be profile determinants in a clause. A clause is thus a composite structure made up of smaller hierarchically structured relations: nominals and their elaborations, verbs and their elaborations. 3.8 Prepositions and Conventionalized Object Schemas As we have seen, a grammatical structure has its foundations in the pre-conceptual figure/ground schema that is a member of a restricted set of object/movement schemas. Schemas are given meaning through their being linked to concrete purposive behaviour which is seen from an ego-centric perspective by an observer. From the knowledge that events have causes, forces, paths, sources, goals, actors and settings, a prototypical, or as Langacker calls it, a "canonical" event schema is devised that typifies the "normal" relationships that hold among the actors in an event, and the forces that are either applied by or brought to bear on the actors in the event. The canonical event schema becomes a model with the same prototypic core 47 instances as the prototypical nominal has in its relation to less central members of a category. Within the canonical event, each actor or relational element represents a "role archetype" that is a direct result of the way that events are conceived. "These role archetypes", says Langacker, "reflect our experience as mobile and sentient creatures and as manipulators of objects" (1991:284). The combination of the roles agent, patient, instrument, experiencer, and mover (from place to place) results in the "complex conceptualization" that is, by the definition referred to above, an experiential gestalt. The idea that a "force dynamics" among concrete objects informs our conception of how events occur in the world is treated by Talmy (1985) and taken up by Langacker and called the "billiard ball" model of events (1991:283). The billiard ball model entails the primary and derived notions that make up the way we see the world and in turn directly affects linguistic structures. "Physical objects and energetic interactions provide the respective prototypes for the noun and verb categories, which likewise represent a polar opposition among the basic grammatical classes" (1991:283). Being derivations from prototypes, grammatical classes do not have discrete set members, but instead are "scalar", or more or less one or the other. Concomitantly, the members of grammatical classes are also scalar; there are no absolute criteria for thematic roles, or semantic cases, any more than there can be absolute distinctions of "verbalness" or "nounness". 
Different construals of a situation lead to different codings. Construal entails temporal perspective, affecting tense and time, and adverbial choices among others. In pronoun choice, for example, first person always refers to the speaker no matter who the speaker may be; with deictics: this or that is chosen depending on a referent's perceived distance from the speaker. However, the area of construal that is important for the present is that which can be directly 48 related to the case notions Subject, Object, Indirect Object, and the Peripheral, Scene Setting Phrases. For example, in the following sentences the subject may be either the agent, the patient, or the instrument depending upon which aspect of the situation is deemed to be central to the observer. 38) Jean hit the ball with a bat. Focus=agent 39) The ball was hit by Jean with a bat. Focus=patient 40) The bat hit the ball. Focus=instrument A central feature can be thought of as the focus or most salient feature. Each of these notions are, in turn, dependent upon the spatialization of objects present in the centre/periphery schema. Subject and object, as the central cases in any prototypical clause, are determined, then, not by their semantic roles but by their salience relative to an observer. They may be termed grammatical rather than semantic case since they are not indicators of agent, patient, instrument, or other semantic role, but, rather, the first and second most salient members of the basic figure/ground schema that underpins the picking out of related elements in a force vector schema. In the canonical event, an animate and conscious (human) agent causes a transfer of energy to an inanimate patient (object) which undergoes a change of state as a result of the action (Langacker, 1991:285). An instrument is an object used by the agent to affect the patient. The primary actors in a canonical event are provided with a setting in which their actions and reactions may occur. Both the construal of the actions and the setting are determined by the 49 observer of the event, who also positions the event in a temporal domain so that an event is conceived as a discrete entity occurring in time. In other words, an event construed in language consists of objects having energetic interactions in a spatialized and temporalized setting. The event, in total, is treated as an object made-up of smaller objects that comprise its content, and, as Traugott has argued, (1986, 1987, 1993) even the speech situation is conceived in the same terms as a concrete situation, and denotative devices, such as demonstratives indicating the location of an object, are adopted to express a speaker's attitude to a situation or the availability of an abstract concept for discourse purposes. Deixis proceeds in the expected dine from concrete to abstract use in which the target domains of mental attitude and discourse elements are conceived in terms of the "licensing" physical domain. Sweetser agrees that "there is ... a general tendency to borrow concepts and vocabulary from the more accessible physical and social world to refer to the less accessible worlds of reasoning, emotion, and conversational structure" (1988:31). The use of concrete vocabulary to express abstract concepts is both pervasive and motivated. 3.9 Grammatical Relations and Spatial Relations I will now argue that grammatical relations that use preposition markers in English are structured as analogues of the physical domain concepts that supply their lexical source material. 
If such is the case, then we should expect to find that grammatical elements using spatial prepositions are derived from a concrete image-schematic domain and are processed in the same manner as visual-kinesthetic information. Grammatical structure, therefore, is motivated both by the priority of spatial-kinesthetic experience (and processing) over social-abstract experience and its structuring, and by the pervasive tendency to imbue situations with the intentional mental state of the observer. For each preposition used in the language, there is a central prototypical use based upon an image-schema with spatial locative meaning. When prepositions are adopted to "case" uses such as Indirect Object, Instrumental, Beneficiary, Accompaniment, etc., they retain their image-schematic spatial sense within the case sense. The prototypical sense of each preposition depends upon visual-kinesthetic percepts, while the case sense depends upon those same percepts plus the projected intentionality that an observer gives to objects and, by extension, to abstract concepts that are mentally manipulated as if they were objects. Lindner (1981), Smith (1989), Schulze (1989) and Sweetser (1984; 1988) have already investigated some of the instances where prototypical meanings derived from image-schematic thought have provided a structural template for grammatical uses such as case in German (Smith, 1989) and modality in English (Sweetser, 1988). We should expect to find, in both diachronic and synchronic analysis, that prepositions diagram, or profile, situations based upon their prototypical core and the ability of language users to extend uses from the core to peripheral abstract situations. When the expressive repertoire changes in language, as when a case marker is eroded from word endings by phonological loss, new lexical elements must be used that have the potential to express the same relationships as the lost case markers. In the history of English, as far as it is discernible from extant writing, the case notions that have assumed prepositional marking are restricted to clausal elements other than subject and object. The subject-object relationship, as an instance of the directed path schema where energy flows from an effector to a receiver, can be recovered simply by the SVO order of the prototypical clause. Here, word order is an analogue of energy transfer along a path, and the subject-object relation is revealed to be spatially motivated by the symbolic import of its linear sequencing. In other relations, however, it is the prepositions that have taken up the functional load of marking energetic interactions from the Old English affixes marking indirect object, experiencer-beneficiary, instrument, and possessor. In each instance a spatially motivated preposition has been used that, historically, was used to mark some aspect of the scene in which the event was taking place. Other scene-setting adverbials, or oblique cases, such as those for time, location, cause, manner, purpose, etc. were always marked by prepositions and so did not undergo any change when case markers were lost in late Old English. In each instance, however, we can trace a spatial antecedent. Adverbial phrases of cause, manner, reason, and time using the prepositions by, from, of, with, for, to, at, on, in, after, and before all have a spatial antecedent on which their derived senses depend.
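The argument of this section can be summarized, informally, as a mapping from each preposition's spatial core to its extended, case-like senses. The small Python sketch below is only a summary device: the particular pairings are illustrative assumptions compiled loosely from the discussion in this chapter, not an exhaustive or authoritative inventory.

```python
# Illustrative summary only: each preposition keeps a spatial core schema from which
# its grammatical, case-like uses are taken to be extensions. Entries are assumptions
# loosely compiled from this chapter's discussion, not an exhaustive inventory.
preposition_schemas = {
    "to":   {"spatial_core": "path toward a goal",
             "extended_uses": ["indirect object / dative", "benefactive"]},
    "from": {"spatial_core": "path away from a source",
             "extended_uses": ["source", "cause"]},
    "with": {"spatial_core": "proximity of one object to another",
             "extended_uses": ["accompaniment", "instrument"]},
    "by":   {"spatial_core": "location beside",
             "extended_uses": ["agent of a passive", "means"]},
    "for":  {"spatial_core": "position adjacent to, directed at",
             "extended_uses": ["beneficiary", "purpose"]},
    "of":   {"spatial_core": "part in relation to a whole",
             "extended_uses": ["possessor", "partitive"]},
}

def senses(prep: str):
    """Return the spatial core and the extended, case-like senses of a preposition."""
    entry = preposition_schemas[prep]
    return entry["spatial_core"], entry["extended_uses"]

print(senses("with"))  # ('proximity of one object to another', ['accompaniment', 'instrument'])
```

Nothing in the sketch does analytical work; it merely makes explicit the shape of the claim that the case senses remain linked to, and are extensions of, a spatial core.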
The prepositional markers associated with roles such as beneficiary, experiencer, possessor, and instrumental have inherited their meanings from the basic level schemas that also control their use in the oblique cases. These cases are closer to the central actors in the clause because, in the case of recipient, beneficiary, experiencer, and possessor, they act as secondary sources of energy with respect to active participation or experiencing, and are prototypically human, animate, and conscious. The instrument is located closer to the central actors because it has, schematically, a direct connection in the force dynamics of the situation in which energy is transferred from an agent through an instrument to a patient. The central cases in the "action chain" (Langacker, 1991), subject and object, and the tertiary cases or semantic roles, experiencer-recipient (indirect object), instrument, and possessor, which also have a role in the energetic transactions of the event, stand out as figures against the background setting provided by the oblique cases. The figure/ground schema, in this instance, is particularized as a container/content schema where events are contained within a scene. The categorizing schemas that have licensed the shift from scene-setting information to the relational information of the "case" markers are operative diachronically and synchronically. The particular schema that a preposition instantiates, and the uses to which it may be put, must be learned, but, once they are learned, both basic level and derived senses are used without conflict. Since synchronic language structure is the result of the accretions of conversational uses of the language over large stretches of time, we should expect to find that all the gradations of uses licensed by the originating concrete schema that exist in present day use will be semantically related. Studies such as Lindkvist's on The Local Sense of the Prepositions Over, Above, and Across and Lindner's (1981) on "up" and "out" have shown that local and abstract uses of prepositions are semantically related. In present day English we find that the prepositions to, for, of, with, and by, for example, retain their original local uses while their extended uses cover the relational profiles of former case markers. Because the linguistic construal of a scene is a conventionalization of experiential force dynamics and a speaker's beliefs about how objects are related to each other in a spatial domain, we find that there is some latitude in prepositional usage depending upon the image-schema used to construe a situation. Sweetser uses the example of "the week ahead" and "the week behind" where, although the same calendar week may be referred to, the referent is embedded within a different conventional image of time. In other instances, we find that a scene can be structured differently depending upon how the observer conceives of the situation, or on how the observer conceives the prepositional schemata to be conventionally sanctioned. For example, "I am concerned for your health" and "I am concerned about your health" differ only in the conception that the speaker has regarding the projection of concern from speaker to hearer. Concern is conceived as an entity that can either surround the hearer, as with about, or be projected to a position adjacent to the hearer, as with for.
But consider "It is something I will have to think long and hard of" (heard on CBC Radio) where the speaker seems to have confused the conventional uses of think of and think about. We can think of something or we can think about something, but their meanings are generally thought to be different, as in "Think of a number from one to ten" where of has a prototypical value such that one number is a part of the whole set. Think about, on the other hand, means to consider from all sides, to mentally move around a problem and think of its many facets. If a speaker is confused about the prototypical meaning of a spatial marker or of its conventional use, he may very easily say "... long and hard of". The instances where prepositional usage is variable and there is no significant change in meaning seem to be a result of the prototypical meaning of the prepositions having peripheral uses that overlap with other prepositions, for example: Battle with the enemy Battle against the enemy With and against have portions of their schematic meanings in common since the energetic force of against requires proximity such as two surfaces being directly in contact. With in its oldest recorded sense (Beowulf circa A.D.225, OED) has the meaning against, so both proximity and force are factors in the schema in which agents contend. The meanings of with and against used in 54 conjunction with battle do not seem to be exactly alike, however, and nuances of meaning can be obtained. The core meaning of with in its present day use has lost the original denotation of bellicose adversativeness and retains only the sense of one thing in proximity to another. We can say, for instance, "I'm happy with the situation", whereas we cannot say, "I'm happy against the situation". We have here an example of a peripheral sense of Old English with becoming a central use in present day English, while against has continued to carry the central meaning of Old English with. With in present day English has a far greater scope of uses because it has a more abstract central meaning than against. The proximity of two objects does not necessarily indicate hostile relations, so we can be happy with, content with, play with, get away with, entrust with, agree with, dispense with, have done with, or begin with, whereas against used in any of these situations would be either nonsensical, or would create an entirely different situational "gestalt". With and at share peripheral instances of their meanings in the following: I am angry at you. I am angry with you Like against and with, at has, as part of its meaning, the image of one actor being in contact with another. The meaning of the preposition (that part of its schema that is put in profile) is controlled in part by the action of the verb. Angry for example requires an agent who can be angry and who directs the force of his anger towards an object. The concrete schema of an object being at a location (as in The hooded man is at the gate) is given an intentional dimension by the verb and is completed by the preposition and its governed noun. It is quite a different thing to say "I'm angry about you" than it is to say "I'm 55 angry with you". The first may mean "I am angry concerning some condition associated with you"; the second means "I'm angry directly at you". Another example where at and with have been confused is the following: "Many people are frustrated at the Federal system" (CBC Newscast February 23, 1995). 
On first reading, the difference in the schematized diagram of the situation does not really seem to change between "Many people are frustrated with ..." and "Many people are frustrated at ..." except that, conventionally, the verb phrase are frustrated and the preposition at apparently are not used together. But why is frustrated not conventionally used with at? Frustration can be with something but it cannot be projected at something. Anger, on the other hand, can be projected at something or can be more an internalized psychological state, as in be angry with something. It seems that frustration, and subsequently the verb phrase be frustrated, describe an internal psychological state only, and so cannot be projected at a target the way anger can be. The meaning of the verb and its complement must therefore be taken in concert with the preposition to construe the situation. The verb selects some feature of the prepositional schema and profiles that aspect of the schema. Verb and preposition together diagram a situation in terms of energy transfer and spatial relationship.

CHAPTER 4 Parallel Distributed Processing

4.1 The Micro-Structural Account

Support for the cognitivist proposal that grammatical structures derive from prototypes and generalizations, and not from the rule-based ordering of arbitrary symbols, has recently come from a section of the "Artificial Intelligence" community in the guise of a theory called connectionism, or PDP. Just as cognitive linguists have been dissatisfied with the "rules and representations" approach because it fails to explain such linguistic phenomena as synchronic polysemy, diachronic grammaticalization, or the structuring of grammatical categories by means of extended metaphors working from concrete to abstract concepts, AI researchers have been dissatisfied with the rules and symbols approach in the computer simulation of human cognitive function. Their main objection to rules and symbols is that the "rules", or the program that is used to say what symbols can go with what symbols, have to be imposed "from above", so to speak, and are not derived from the problems that the computer network encounters during functioning. Second, the "symbols" that a "rules and symbols" model uses have no intrinsic semantic content. Similar to Langacker's criticism of generative grammar, which denies a connection between the syntax and the semantic content of an expression, AI and cognitive researchers complain that symbols, separated from rules, must be imposed on the system, where the imposition of the meaning of symbols has nothing to do with the "program" being used. Symbol meanings thus have no sensitivity to the context in which they are used. The result of a "top-down" direction for rules and symbol meaning is that the classical architecture of cognition has no way to account in a natural manner for human learning, or for the alteration of behaviour in the face of novel information. In the words of the connectionists, the "rules and symbols" implementation of human cognitive function is too "brittle" to account for the way human cognizing actually works.

4.2 Connectionist PDP Account of Categorization

Connectionism argues that, instead of rules and symbols being imposed from above, the cognitive system generalizes from experience to produce highly context-sensitive representations of the desired output.
Symbols, from the connectionist perspective, are not the representatives of primitive concepts that account for our "language of thought" (Fodor & Pylyshyn, 1988) but, instead, represent the multiple contexts that have been encountered in the making of the concept that the symbol represents. In order for the rules and symbols model of cognition to work, as I have said, both parts must be imposed from without. In computer terms this imposition is accounted for by a program, but in human terms it must be accounted for in terms of a "genetic program" that can control both the entire syntax of all languages and the meanings that symbols are allowed to have. The nature of cognition (Ramsey, 1992:250) requires that there be elements that can be used repeatedly and combined into a representation of the situation that a cognizer is thinking about. Elements in a cognitively natural system must be systematically related, since, as one writer points out: There are on the order of 10²⁰ English sentences of twenty words or less. For most of these there is a potential corresponding thought, and potential thoughts outstrip sentences because of our ability to make relevant discriminations for which we lack linguistic resources. How could one explain the capacity to have so many systematically related thoughts except by the capacity to build them by repeatable components? (Horgan & Tienson, 1992:198) The question is, does a system that uses repeatable elements in a syntax-like manner require that the representation of situations be built up by rules and symbols as has been argued by generative linguistics and classical AI? The classical argument assumes that lexical items in the language and the concepts of the "language of thought" are isomorphic. Concepts in the language of thought merely have to be encoded into a syntactic structure in order to be communicable as the propositions of language. The structural makeup of the system requires that there be discrete symbols manipulated by definable rules. The connectionist alternative states that the representations of environmental stimuli that emerge in language as symbols acquire their semantic properties and their associative potential through the massively parallel distribution of sensory inputs through a multi-layered associative neural network. The effects of these artificial neural networks, connectionists say, mimic, in a far more natural way, the actual performance of human cognizers, and without the use of the classical computational paradigm. Paul Churchland states flatly that: neural nets typically have no representation of any rules, and they do not achieve function-computing abilities by following any rules. They simply "embody" the desired function as opposed to calculating it by recursive application of a set of rules listed in an externally imposed program. (1992:39) What follows is a brief description of how the connectionist model of a neural network "embodies" the functions of rules and symbols, without either being stipulated by an external program. A connectionist network is a computer simulation of what, it is hoped, are brainlike processing structures represented by the switching function of the computer.
Using the analogy of neuronal excitation patterns and synaptic connections among neurons, the connectionist model consists of a set of units arranged, generally, in three layers: an input layer, representing the receptors for a stimulus, a middle or hidden layer, representing the associative layer of neurons, and an output layer that produces effects in the world, e.g. speech, movement, or recognition of shapes. A set of units, comprising the three layers, forms a network through the connections that occur between units. Each unit of the first layer is connected to each unit of the second layer, and each unit of the second layer is connected to each unit of the first layer and each unit of the third layer as follows:

[Figure 2: NETWORK — output, associative, and input layers. Adapted from Blank, Meeden, Marshall (1992:145)]

The input layer receives stimuli that cause an activation pattern across the receptors. Each receptor sends a scaled message, either excitatory or inhibitory, to the units in the association layer. The strength of connections depends upon the strength of the excitatory or inhibitory message sent by the input units to the middle layer. The strength of a connection is called its "weight" relative to other connections in the network. While excitatory and inhibitory messages initiate and diminish rapidly, the weighted connections either decay slowly over time, or are reinforced by subsequent stimuli. The weighted connection levels are propagated through the associative middle layer to the output layer, where they result in a pattern of activation on the output layer.

4.3 Learning

A network can learn to adjust its output activation towards a target goal. It accomplishes this task by a training technique called "backpropagation" in which a large set of training examples are presented that have both an input pattern and a desired target output pattern. The network's actual output pattern is compared with the desired target pattern, and an error correction signal is introduced that propagates backwards through the network and adjusts the connection weights until the activation pattern through the middle layer produces the desired output pattern. Weight changes are gradual and respond to training by continuously shaping behaviour towards the target examples. The network is never given any rules to follow, only examples of the correct output and a method of adjusting excitatory and inhibitory impulses among connections to produce the output result. The input to the network is called an activation pattern, or input vector, because each pattern on the input layer produces a unique activation of the network that occurs as a result of the summation of excitatory and inhibitory messages across the network. The competing forces of different excitatory and inhibitory messages coming from different units resemble the competition of forces in vector analysis. Because the output layer also consists of an activation pattern, the output activation pattern can also be related to a vector. Vector analysis is then used to produce diagrams of how the network has dealt with the training regime in terms of classifying related inputs and distinguishing among related, less related, and unrelated inputs as defined by the criteria established by the training examples. Experiments have been conducted to test the performance of the model in a number of different areas.
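The architecture and training procedure just described can be made concrete with a minimal sketch. The following Python fragment is only an illustration: the layer sizes, learning rate, toy input patterns, and target categories are assumptions made for the example, not details of any network discussed in this chapter.

```python
import numpy as np

# Minimal three-layer network (input, associative/hidden, output) trained by
# backpropagation. All sizes, rates, and the toy task are illustrative assumptions.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hid, n_out = 4, 3, 2
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))   # input -> hidden connection weights
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))  # hidden -> output connection weights

# Each input activation pattern is paired with a desired target output pattern.
X = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
T = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)  # two target "categories"

lr = 0.5
for epoch in range(5000):
    hidden = sigmoid(X @ W1)          # activation spreads from input to the hidden layer
    output = sigmoid(hidden @ W2)     # and from the hidden layer to the output layer

    # Error signal at the output layer, then propagated back to the hidden layer.
    err_out = (T - output) * output * (1 - output)
    err_hid = (err_out @ W2.T) * hidden * (1 - hidden)

    # Gradual weight adjustments shape behaviour toward the target patterns.
    W2 += lr * hidden.T @ err_out
    W1 += lr * X.T @ err_hid

print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))  # outputs now approximate the targets
```

Note that no rule relating inputs to categories appears anywhere in the sketch; the repeated, gradual adjustment of connection weights by the backpropagated error signal is all the training the network receives. The experiments described next show what such training can accomplish on more substantial tasks.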
For example, networks have been trained to discriminate between rocks and mines by presenting frequency analysis of the sonar echoes of each class of object, but without defining what the difference between the two might be. The network learned to distinguish between the two classes without any information about what to look for, working from error correction information alone. When the activation vectors representing the configuration that the network produced on its middle layer were plotted in two dimensional fashion, it was found that the network had produced internal representations for prototypical mines and prototypical rocks. When presented with examples from outside of the training set, the network was able to discriminate among echoes it had never heard before (1992:35). Analysis of the vector space derived from training showed that the network had developed activity "hot spots" where the prototypical mine and the prototypical rock could be found. While each training example was different in its activation of the network, there were sufficient similarities along parameters never actually defined for the network to produce generalizations about the desired target output from the input. Vector regions were partitioned into categories useful for the task presented. Because the network develops prototypes, relevant discriminations can be made on degraded or incomplete input. A prototype region need not be precise, because there is more "space" between prototype regions than there is within regions. Thus the "mine region" clusters around a small dimension of total activity in vector space, and the "rock region" clusters around another relatively distant part of vector space. The most important point to consider here is that the network learned by creating prototypes from which it could discriminate inputs, and it "pushed" degraded inputs towards one activity region 62 or another depending upon their similarities to weights in already established pathways. Remember that Rosch has already demonstrated, on the basis of far different experiments, that humans use categorization in their thinking processes, and that categorization of physical experience into derived intentional states is a primary tenet of cognitive grammar. 4.4 Networks and Language Networks have been trained to generate the function of a room from its contents and to identify three-dimensional objects from flat grey scale pictures. But networks have also been trained to deal with language functions such as the identification of English phonemes, the distinctive, meaningful sounds of the language. A network was trained to transform English letters, of which there are twenty-six, into English phonemes, the distinctive sounds of the language, by training on a corpus of English words. The network "learned" to distinguish which phoneme was appropriate within a context by determining the most likely letters to occur on either side of the target letter. When the activation vectors were plotted in two-dimensional space (network activation vector space has as many dimensions as it has possible scalar connections among units) it was found that consonants and vowels were plotted in separate regions in hierarchically structured nested regions. Where the consonant and vowel region were closest, the sounds of the phonemes were the most similar, and, as the distance increased, similarity decreased. Similar results were obtained from an experiment where a network was asked to predict the next word in a sentence. 
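Before the premise and results of that experiment are described, a minimal sketch may help fix the idea of a network that predicts the next word from the current word plus a copy of its own previous hidden state (a simple recurrent, "Elman-style" network). The vocabulary, the three toy sentences, the layer sizes, and the training regime below are illustrative assumptions only, not the materials of the study discussed here.

```python
import numpy as np

# Minimal simple recurrent ("Elman-style") network sketch for next-word prediction.
# Vocabulary, sentences, and sizes are toy assumptions for illustration only.
rng = np.random.default_rng(1)
vocab = ["the", "cat", "dog", "chases", "sleeps", "."]
idx = {w: i for i, w in enumerate(vocab)}
sentences = [["the", "cat", "chases", "the", "dog", "."],
             ["the", "dog", "sleeps", "."],
             ["the", "cat", "sleeps", "."]]

V, H = len(vocab), 8
Wxh = rng.normal(scale=0.3, size=(V, H))   # current word -> hidden units
Whh = rng.normal(scale=0.3, size=(H, H))   # context (previous hidden state) -> hidden
Who = rng.normal(scale=0.3, size=(H, V))   # hidden -> predicted next word

def onehot(w):
    v = np.zeros(V)
    v[idx[w]] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr = 0.1
for epoch in range(2000):
    for sent in sentences:
        context = np.zeros(H)                      # context units start at rest
        for cur, nxt in zip(sent[:-1], sent[1:]):
            x = onehot(cur)
            hidden = np.tanh(x @ Wxh + context @ Whh)
            probs = softmax(hidden @ Who)
            # Error: distance between the prediction and the word that actually came next.
            err_out = probs - onehot(nxt)
            err_hid = (err_out @ Who.T) * (1.0 - hidden ** 2)
            Who -= lr * np.outer(hidden, err_out)
            Wxh -= lr * np.outer(x, err_hid)
            Whh -= lr * np.outer(context, err_hid)
            context = hidden                       # copy hidden state into context units

# After training, the network favours a word of the appropriate class:
h = np.tanh(onehot("the") @ Wxh)                   # "the" with no prior context
print(vocab[int(np.argmax(h @ Who))])              # expect a noun such as "cat" or "dog"
```

The actual experiment, described next, was trained on a far larger set of sentences, but the mechanism is the same: the only information the network receives about word classes is the distribution of words over contexts.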
Similar results were obtained from an experiment in which a network was asked to predict the next word in a sentence. The premise was that words in English do not appear in just any order, but occur according to co-occurrence restrictions that influence what word will come next in a sequence. These co-occurrence restrictions stipulate a noun at one place, a verb at another, and adjectives and adverbs at yet other typical locations. Further, an animate subject noun will occur with a verb appropriate to an animate agent, whereas an inanimate subject noun will occur with a verb that requires that someone other than the subject be the agent. The network was trained on 10,000 two- and three-word sentences, after which it could predict the class of word that would follow in a sequence. The network was able to achieve this result, once again, by discovering the characteristics of categories based upon the occurrence of types of words in specific contexts.

Figure 3: FRACTAL DIAGRAM. Adapted from Elman (1992:147)

The network was able to produce quite detailed categorizations of words, and, once again, these categorizations were arranged in hierarchies of nested regions. The classification of each word was based upon all the instances of that word as it occurred in the sequences of words in the training set. The knowledge that the network has of any particular word amounts, therefore, to the sum of all the contexts in which the word has appeared. The network was also shown to be able to learn to handle long-distance dependencies such as "Wh-movement" without that "rule" being available in any explicit form. The explanation for the network's successful learning of a "rule" seems to reside in the way that the training set, as a whole, is represented across all the units in the network. The network weights represent every training instance in a superimposed fashion across the whole network, just as each instance of a word is superimposed across the network. Each activation of the input layer causes an activation pattern to propagate through the entire network. In such a way, an occurrence such as "relative clause movement" or centre embedding is represented throughout the network as a gestalt structure. Van Gelder summarizes this way:

Roughly speaking, information is stored in a superimposed fashion when one cannot find a more "local" correspondence between various parts of the stored information and parts of the representation itself. In the classic example, a hologram is a genuinely distributed representation, because every part of the scene is represented over the whole surface of the hologram. In the current case, the hidden unit activation pattern which results from presentation of a sentence fragment is what has come to be called a "gestalt" of the relevant features of that whole fragment. (1992:182)

The network simply remembers the "Wh-element" long enough to integrate it into the clause as a whole. Here, then, is representation without explicit rules, in which a generalization is derived from specific instances that can be applied to subsequent novel situations. Novel situations are fitted into vector state space by appropriating a unique point within a nested region organized in a cascading hierarchical structure "deep" enough to accommodate all the grammatical constructions that are likely to be encountered in a finite amount of time. The hierarchical structure of state space represents the "configuration of activation values which locates the representation in the state space. . . . It fixes the location of the representation such that the system can treat it as encoding structure" (1992:190) without "structure" having to be the concatenation of symbolic primitives by means of grammatical rules.
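Van Gelder's point that the hidden-unit pattern is a "gestalt" of the sentence fragment processed so far can be sketched with a simple recurrent network of the general kind Elman describes. The example below is illustrative only: the six-word vocabulary, the one-hot coding, and the random, untrained weights are assumptions of the sketch, so it does not actually predict anything; it simply shows that the hidden state after each word superimposes that word on the pattern left by the preceding words, so that the same word occupies a different point in state space in different contexts.

```python
import numpy as np

rng = np.random.default_rng(2)

# A toy vocabulary with one-hot input coding (an assumption of this example only).
vocab = ["the", "cat", "on", "mat", "found", "mouse"]
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

n_in, n_hidden = len(vocab), 8
W_in = rng.normal(0, 0.5, (n_in, n_hidden))       # current word -> hidden weights
W_rec = rng.normal(0, 0.5, (n_hidden, n_hidden))  # previous hidden state -> hidden weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def state_after(words):
    """Hidden state after a word sequence: a gestalt of the whole fragment so far."""
    state = np.zeros(n_hidden)
    for word in words:
        # Each word is superimposed on the pattern left by everything that preceded it.
        state = sigmoid(one_hot[word] @ W_in + state @ W_rec)
    return state

# The same word ends up in a different region of state space in different contexts.
s1 = state_after(["the", "cat"])
s2 = state_after(["the", "mouse", "found", "the", "cat"])
print(np.round(s1, 2))
print(np.round(s2, 2))
print("distance between the two 'cat' states:", round(float(np.linalg.norm(s1 - s2)), 3))
```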
The metaphor used to describe the relationship between grammar and neural functioning comes from fractal geometry, in which an infinitely deep cascading hierarchical structure can be generated that displays the highly suggestive property of having, potentially, an infinite number of points between any two points in the hierarchical structure. Such structures suggest both the systematicity and the structure sensitivity of processing in the connectionist network, and provide a way to explain the linguistic viability of syntactic patterns that are epiphenomenally expressed as recursion, constituent movement, and co-occurrence restrictions (valence relations), among others. Take, for instance, a sentence such as: The cat that is on the mat that is covered with cat hairs found a mouse in the house. Any lexical item encountered in a sentence occupies a unique point in state space. The nature of fractal geometry allows each contextualization of a lexical item a unique position in vector space relative to all other items. This means that the meaning of a lexical item will vary depending on the other items with which it occurs. The cat is a part of the region for "prototypical cat", but it is further elaborated by on the mat, which sends it into a region of "objects that have direct relations with top surfaces". Likewise, mats with and without cat hair reside on slightly different but closely related parts of vector space. With each additional lexical item in a sequence, the activation pattern sends the vector representation into a more tightly defined space. By these means, the meaning of a clause is a function of the superimposition of each word on the activation pattern of the network in a sequence, the sum of which defines a unique context for those words and so a unique place in activation space. Such context dependency should remind us both of the dependency relations posited by case grammar between verbs and nominals and of the dependencies that occur among lexical items in their capacities as instigators of conceptual schemas in cognitive grammar.

Rumelhart (1992:69-83) argues persuasively that Parallel Distributed Processing networks support human cognitive architecture because they account for the kinds of thinking that humans are best at doing. He states that "we succeed in thinking and in logical problem solving by making the problems we wish to solve conform to problems we are good at solving" (1992:71). The problems that we are good at solving are:

i) Pattern Matching: We "settle" on the meaning of an input quickly by matching it to an already existing activation pattern.

ii) Modeling the World: We can project from a present state of affairs to a future state by internalizing our experiences. This internalized representation supports imaginative thinking and mental simulations.

iii) Manipulating the Environment: We are able to manipulate objects in the world, represent them in our minds, and make them mean something.

Pattern matching, or reasoning by similarity, is a basic tenet of the cognitive grammar perspective because, as previously stated, the experience of our bodies and of the visual and tactile environment is primary to the development of language. We use the patterns we find in our embodied experience to structure our linguistic expressions.
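Pattern matching of this kind, in which an input is drawn towards the stored pattern it most resembles, can be sketched with a toy auto-associative memory. The example below is not a model cited in this chapter: several similar "experiences" are superimposed on a single weight matrix by a simple Hebbian rule, and a degraded probe is then repeatedly distorted towards the central tendency of what has been stored, which is the behaviour Rumelhart describes in the passage quoted below.

```python
import numpy as np

# Four similar "experiences": each is a one-feature variant of an underlying pattern
# (+1 = feature present, -1 = feature absent). The underlying pattern itself is never stored.
patterns = np.array([
    [-1,  1, -1, -1,  1, -1],
    [ 1, -1, -1, -1,  1, -1],
    [ 1,  1,  1, -1,  1, -1],
    [ 1,  1, -1, -1, -1, -1],
])

# Hebbian storage: every experience is superimposed on the same weight matrix.
n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p)
np.fill_diagonal(W, 0)

# A degraded probe (two features wrong relative to the central tendency) is presented
# and repeatedly distorted towards whatever stored structure it most resembles.
probe = np.array([-1, 1, 1, -1, 1, -1])
for _ in range(5):
    probe = np.where(W @ probe >= 0, 1, -1)

# The probe settles on [ 1  1 -1 -1  1 -1]: the central tendency (prototype) of the
# stored experiences, even though that exact pattern was never presented.
print(probe)
```

The design choice to store every experience on one shared weight matrix, rather than in separate slots, is what makes the prototype emerge for free: the common features of the experiences reinforce one another while their idiosyncrasies largely cancel out.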
PDP supports pattern matching by reinforcing positive correlations and suppressing negative correlations; that is simply the nature of the system. Rumelhart states:

An experience is assumed to result in a particular pattern of activation impinging on the memory units. Retrieval is assumed to occur when this previously active pattern is reinstated over the set of memory units. . . . When a pattern similar to a stored pattern is presented, the system responds by distorting the input towards the stored pattern. . . . If a number of similar patterns have been stored, the system will respond strongly to the central tendency of the stored patterns. (1992:72)

Thus a prototype is always produced even where no prototype has been presented. Pattern matching thus supports prototype formation and the pushing of subsequent experiences to conform, or to become meaningful, by way of their apparent similarity to already entrenched patterns of experience. As already noted, physical schemas seem to support many of the grammatical constructions of English and of many other attested languages, particularly as discussed in the literature on grammaticalization. The physical schemas that support inferences of causation and temporal sequence are installed in the network prior to knowledge of language. The network can make generalizations to novel situations based upon its ability to predict the next feature of a sequence from previous experience, where present experience in some way activates the pattern of the previous experience. Since objective experience is different in kind from linguistic experience, we must assume that the similarities between the two exist at some abstract level of analysis. I have argued that objective and linguistic experiences are the same at the level of cognitive processing, where pattern recognition in the objective realm established the pathways necessary for the production of language.

If we think of our experience with objects as producing a relational pattern as well as an object identification pattern, then experience with language, in which symbols are treated as objects, can be treated as a special case of a relational pattern. Rumelhart believes that analogical reasoning is accomplished by the network sorting through a set of features, beginning with the most concrete, until "a good fit" is found between the input and the stored pattern. If no concrete features can be found, the network goes on to look for more abstract relational features. The analogy between object relations and language relations, then, need not be at the level of concrete similarity; it can be a relationship "abstracted" from concrete relations.

Just as the PDP network can predict the next word class in a sentence by comparing an input to a prototype, it can also predict what state of affairs will exist if a particular action is undertaken. The capacity for "mental rotation" and the ability to imagine the outcome of forces acting on objects are examples of such predictive capabilities. The production of a verbal representation of a situation can be reconciled with the idea that we are able to "run a mental simulation" (1992:78) of a set of actions that have not occurred objectively by connecting the use of verbal symbols with the ability to manipulate objects. We are good at dealing with concrete situations because we can manipulate objects both physically and mentally in a represented form. The closer that language elements can be made to resemble concrete objects, the more easily they can be manipulated.
Linguistic symbols resemble patterns of concrete representation in that neither is a primitive representational unit; each is a summation of all experiences of the concrete or linguistic input as it has been distributed across the activation vectors of the network. We solve a complex problem involving concrete relations by breaking the problem down into smaller units where each action and its consequences can be mentally simulated. By treating language and the situations described in language as concrete entities, we reduce the complexity of processing to something we are good at, and something that has served as the model for other forms of processing, namely, the relating of objects together in a spatial domain.

CHAPTER 5

Conclusion

I have concerned myself in this paper first with case grammar, because some of its advocates saw that the meaning of lexical items has more to do with grammatical structure than the early generative theories were willing to admit; second with cognitive grammar, because it fulfills the promise of case grammar to find a semantic-conceptual motivation for grammatical structure; and third with a connectionist model of cognitive architecture, because it motivates the production of functionally useful prototypes, which are fundamental aspects of cognitive grammar. Each level of analysis that I have explored here has sought an explanation for the linguistic capacities of people within those underlying cognitive abilities that do not rely on innate or pre-existing categorial segmentation of experiential continua.

Case grammar, cognitive grammar, and connectionism all accept that experience must be encoded and used in a form that is simpler and more coherent than experience as it is lived second by second or minute by minute. The encoding of experience and the uses to which such coding may be put depend, critically, upon the categorization of experience, which in turn is dependent on there being some observer perspective available to impose order on initial categorizations. While Gruber and Anderson, from the case grammar perspective, were very insightful in their use of the idea that verbal and prepositional meaning are derived from the concepts of movement and spatial orientation, entailing the placement of objects relative to each other, their overall theoretical perspective was limited by the hegemony of transformational grammar at the time of their writing. Cognitive grammar has been able to make use of case grammar ideas such as the pre-linguistic segmentation of experience but, instead of trying to make pre-linguistic schemata fit transformational theory, has been able to show that "surface" syntax is a symbolization of the categorizations and schematizations from which linguistic expressions derive their organizing principles and their ability to organize experience. Cognitive grammar has been able to demonstrate that there is a metaphorical transfer from the concrete to the abstract domains of experience in which the categories and schemas of concrete experience are appropriated for use in grammatical constructions. The concepts of object manipulation and viewer perspective continue to be important, but with each abstraction from concrete experience there is a concomitant increase in the complexity of categories and a tendency to greater inclusiveness. For example, while time includes space, and space includes objects in positions relative to each other, objects do not seem to have much to do with time.
Time is nevertheless dependent upon the concept "object": time presupposes the existence of objects. Objects, space, and time, therefore, are categories of experience where the construction of the first allows the construction of the second, and the construction of the second allows the third. We might say that each abstraction from concrete experience is an application of concrete experience in the construction of abstract domains.

While connectionism for its part has been able to provide a "micro-structural" account of how prototypes develop and why they are necessary to language, perhaps its most important contribution has been to develop a way to account for the context sensitivity of linguistic expressions with reference both to linguistic and to extra-linguistic factors. In an infinitely deep cascading hierarchical structure a linguistic expression is sensitive to multiple context constraints. Meaning does not result only from a literal decoding of the words in a string of words but also entails the world knowledge of the participants in a discourse, knowledge which is itself contextualized within a cascading hierarchy.

I believe that each level of analysis supports in its own way the proposition that the continuum of pre-linguistic experience is segmented and categorized prior to its being applied to the linguistic domain, and that concrete experiences, as the fundamental raw data upon which pre-linguistic categorization takes place, have a pivotal role in subsequent linguistic structures. If the use of spatial information inherent in object identification is a prerequisite to the grammatical structure of language, then we should expect to find the same process at work in the history of a single language and in the synchronic study of a wide variety of languages. Today in English we have grammatical morphemes that are overtly spatial in their meaning, and this situation makes it easier to entertain such a thesis. But I have maintained throughout that, whether or not the grammatical morphemes for case, tense, anaphora, deictics, or other relationally derived concepts are transparently derived from morphemes with primary spatial meanings, we should always be able to discern the spatial antecedent behind the grammatical morpheme.

REFERENCES

Anderson, John M. 1971. The Grammar of Case: Towards a Localistic Theory. Cambridge: Cambridge University Press.

Blank, Douglas S., Lisa A. Meeden, and James B. Marshall. 1992. "Exploring the symbolic/subsymbolic continuum: A case study of RAAM". In John Dinsmore, ed., The Symbolic and Connectionist Paradigms: Closing the Gap, (113-148). Hillsdale, NJ: Lawrence Erlbaum.

Boden, Margaret A. 1982. "Implications of language studies for human nature". In Thomas W. Simon and Robert J. Scholes, eds., Language, Mind, and Brain, (129-144). Hillsdale, NJ: Lawrence Erlbaum.

Brinton, Laurel. 1989. "Metaphor, metonymy, and iconicity: Some principles of semantic change". Semiotic Inquiry 9:137-49.

Chafe, Wallace L. 1970. Meaning and the Structure of Language. Chicago: University of Chicago Press.

Chomsky, Noam. 1965. Aspects of the Theory of Syntax. Cambridge, MA: MIT Press.

Churchland, Paul M. 1992. "A deeper unity: Some Feyerabendian themes in neurocomputational form". In Steven Davis, ed., Connectionism: Theory and Practice, (30-50). New York/Oxford: Oxford University Press.

Comrie, Bernard. 1991. "Form and function in identifying cases". In Frans Plank, ed., Paradigms: The Economy of Inflection, (41-56). Berlin/New York: Mouton de Gruyter.

Dore, J.
1985. "Holophrases revisited: Their 'logical' development from dialog". In M. D. Barrett, ed., Children's Single-Word Speech, (23-58). New York: Wiley.

Elman, Jeffrey L. 1992. "Grammatical structure and distributed representations". In Steven Davis, ed., Connectionism: Theory and Practice, (138-178). New York/Oxford: Oxford University Press.

Fillmore, Charles J. 1968. "The case for case". In Emmon Bach and Robert Harms, eds., Universals in Linguistic Theory, (1-88). New York: Holt, Rinehart, and Winston.

Fillmore, Charles J. 1977. "The case for case reopened". In Peter Cole and Jerrold Sadock, eds., Grammatical Relations. Syntax and Semantics, vol. 8 (59-81). New York: Academic Press.

Fodor, J. A. and Z. W. Pylyshyn. 1988. "Connectionism and cognitive architecture: A critical analysis". Cognition 28:3-71.

Givon, T. 1985. "Iconicity, isomorphism, and non-arbitrary coding in syntax". In John Haiman, ed., Iconicity in Syntax, (187-219). Amsterdam/Philadelphia: Benjamins.

Gruber, Jeffrey S. 1976. Lexical Structures in Syntax and Semantics. Amsterdam: North Holland Publishing Co.

Haiman, John, ed. 1985. Iconicity in Syntax. Amsterdam/Philadelphia: Benjamins.

Heine, Bernd, and Ulrike Claudi. 1986. "On the metaphorical base of grammar". Studies in Language 10:297-335.

Heine, Bernd, Ulrike Claudi, and Friederike Hunnemeyer. 1991. Grammaticalization: A Conceptual Framework. Chicago: University of Chicago Press.

Hopper, Paul J. 1979. "Aspect and foregrounding in discourse". In T. Givon, ed., Discourse and Syntax. Syntax and Semantics, vol. 12 (213-241). New York: Academic Press.

Hopper, Paul J., ed. 1982. Tense-Aspect: Between Semantics and Pragmatics. Amsterdam/Philadelphia: Benjamins.

Hopper, Paul J. and Sandra A. Thompson. 1985. "The iconicity of the universal categories 'noun' and 'verb'". In John Haiman, ed., Iconicity in Syntax, (151-183). Amsterdam/Philadelphia: Benjamins.

Hopper, Paul J. and Elizabeth Closs Traugott. 1993. Grammaticalization. Cambridge: Cambridge University Press.

Horgan, Terence and John Tienson. 1992. "Structured representations in connectionist systems?" In Steven Davis, ed., Connectionism: Theory and Practice, (195-228). New York/Oxford: Oxford University Press.

Jackendoff, Ray. 1987. Consciousness and the Computational Mind. Cambridge, MA: The MIT Press.

Johnson, Mark. 1987. The Body in the Mind: The Bodily Basis of Meaning, Imagination, and Reason. Chicago: University of Chicago Press.

Johnson-Laird, P. N. 1983. Mental Models: Towards a Cognitive Science of Language, Inference, and Consciousness. Cambridge, MA: Harvard University Press.

Lakoff, George. 1988. "Cognitive semantics". In Umberto Eco, Marco Santambrogio, and Patrizia Violi, eds., Meaning and Mental Representation, (119-154). Bloomington: Indiana University Press.

Lakoff, George and Mark Johnson. 1980. Metaphors We Live By. Chicago: University of Chicago Press.

Langacker, Ronald W. 1987. Foundations of Cognitive Grammar, Vol. 1: Theoretical Prerequisites. Stanford: Stanford University Press.

Langacker, Ronald W. 1988. "An overview of cognitive grammar". In Brygida Rudzka-Ostyn, ed., Topics in Cognitive Linguistics, (3-48). Amsterdam/Philadelphia: Benjamins.

Langacker, Ronald W. 1991. Foundations of Cognitive Grammar, Vol. 2: Descriptive Application. Stanford: Stanford University Press.

Lindkvist, Karl-Gunnar. 1971. The Local Sense of the Prepositions Over, Above, and Across Studied in Present-Day English. Stockholm Studies in English, XXV. Stockholm: Almqvist & Wiksell.

Lindner, Susan. 1983.
A Lexico-Semantic Analysis of English Verb-Particle Constructions with Up and Out. Bloomington, IND: Indiana University Linguistics Club.

Luraghi, Silvia. 1991. "Paradigm size, possible syncretism, and the use of adpositions with cases in flective languages". In Frans Plank, ed., Paradigms: The Economy of Inflection, (57-74). Berlin/New York: Mouton de Gruyter.

Lyons, John. 1977. Semantics. Vols. 1-2. Cambridge: Cambridge University Press.

Mandler, Jean Matter. 1984. Stories, Scripts, and Scenes: Aspects of Schema Theory. Hillsdale, NJ: Lawrence Erlbaum.

Mel'cuk, Igor. 1988. Dependency Syntax: Theory and Practice. Albany, NY: SUNY Press.

Miller, G. A. and P. N. Johnson-Laird. 1976. Language and Perception. Cambridge, MA: Belknap.

Peirce, Charles Sanders. 1931. Collected Papers, ed. by Charles Hartshorne and Paul Weiss. Cambridge, MA: Harvard University Press.

Piaget, Jean. 1945. La formation du symbole chez l'enfant. Neuchatel: Delachaux et Niestle.

Plank, Frans, ed. 1991. Paradigms: The Economy of Inflection. Berlin/New York: Mouton de Gruyter.

Ramsey, William. 1992. "Connectionism and the philosophy of mental representation". In Steven Davis, ed., Connectionism: Theory and Practice, (247-276). New York/Oxford: Oxford University Press.

Rosch, Eleanor. 1978. "Principles of categorization". In Eleanor Rosch and Barbara B. Lloyd, eds., Cognition and Categorization, (27-48). Hillsdale, NJ: Lawrence Erlbaum.

Rumelhart, David E. 1992. "Towards a microstructural account of human reasoning". In Steven Davis, ed., Connectionism: Theory and Practice, (69-83). New York/Oxford: Oxford University Press.

Schulze, Rainer. 1989. "The meaning of (a)round: A study of an English preposition". In Richard A. Geiger and Brygida Rudzka-Ostyn, eds., Conceptualizations and Mental Processing in Language, (399-431). Berlin/New York: Mouton de Gruyter.

Shepard, R. and J. Metzler. 1971. "Mental rotation of three-dimensional objects". Science 171:701-703.

Sinclair, Hermine. 1989. "Language acquisition: A constructivist view". In Jacques Montangero and Anastasia Tryphon, eds., Language and Cognition, (7-16). Geneva: Fondation Archives Jean Piaget.

Smith, Michael B. 1989. "Cases as conceptual categories: Evidence from German". In Richard A. Geiger and Brygida Rudzka-Ostyn, eds., Conceptualizations and Mental Processing in Language, (531-565). Berlin/New York: Mouton de Gruyter.

Starosta, Stanley. 1988. The Case for Lexicase: An Outline of Lexicase Grammatical Theory. London: Pinter Publishers.

Svorou, Soteria. 1988. "The experiential basis of the grammar of space: Evidence from the languages of the world". Ph.D. diss., State University of New York at Buffalo.

Sweetser, Eve. 1984. Semantic Structure and Semantic Change: A Cognitive Linguistic Study of Modality, Perception, Speech Acts, and Logical Relations. Ph.D. diss., University of California at Berkeley. Ann Arbor: University Microfilms.

Sweetser, Eve. 1988. "Grammaticalization and semantic bleaching". In Shelley Axmaker, Annie Jaisser, and Helen Singmaster, eds., Proceedings of the Fourteenth Annual Meeting of the Berkeley Linguistics Society, (389-405). Berkeley, CA: Berkeley Linguistics Society.

Sweetser, Eve. 1990. From Etymology to Pragmatics: Metaphorical and Cultural Aspects of Semantic Structure. Cambridge: Cambridge University Press.

Talmy, Leonard. 1985. "Force dynamics in language and thought". In William H. Eilfort, Paul D. Kroeber, and Karen L. Peterson, eds., Papers from the Parasession on Causatives and Agentivity, (293-337).
Chicago: Chicago Linguistic Society.

Talmy, Leonard. 1988. "The relation of grammar to cognition". In Brygida Rudzka-Ostyn, ed., Topics in Cognitive Linguistics, (165-206). Amsterdam/Philadelphia: Benjamins.

Trask, R. L. 1993. A Dictionary of Grammatical Terms in Linguistics. London/New York: Routledge.

Traugott, Elizabeth Closs. 1986. "From polysemy to internal semantic reconstruction". Proceedings of the Twelfth Annual Meeting of the Berkeley Linguistics Society, (539-50). Berkeley, CA: Berkeley Linguistics Society.

Traugott, Elizabeth Closs. 1989. "On the rise of epistemic meanings in English: A case study in the regularity of semantic change". Language 65:31-55.

van Gelder, Tim. 1992. "Making conceptual space". In Steven Davis, ed., Connectionism: Theory and Practice, (179-194). New York/Oxford: Oxford University Press.

Wierzbicka, Anna. 1988. The Semantics of Grammar. Amsterdam/Philadelphia: Benjamins.

APPENDIX 1

Glossary of Terms

Anaphor:
- An anaphor is a deictic category in that it "points back" in the speech situation to a preceding referent. An example is: I asked Jane to go, and she did.

Case:
- A case marker usually involves a distinctive ending on a nominal that marks its relation, either grammatical or semantic, to other constituents within the clause. In this paper, case includes semantic and formal cases marked both inflectionally and prepositionally, because, it is argued, all cases are semantically motivated.
- The core relations of "case theory" are the same relations that have traditionally been considered central to the notion of the nominal paradigms. The difference seems to be that case theory allows only semantic roles to be "true" cases while traditional practice allows the formal organization of nominal paradigms to dictate what is considered a case. In other words, although case theory and traditional thoughts on case share the idea of a central core of cases associated closely with the verb, case theory denies the validity of studying "superficial" case features such as nominative and accusative, dative and oblique, and instead proposes that only "deep semantic cases" form a cross-linguistically significant set of grammatical-semantic primitives. Cognitive grammar and "surface case" studies such as Wierzbicka (1988) claim that superficial case features are always meaningful.

Deictic:
- A deictic marker specifies a referent relative to the person speaking. Examples include the pairs this / that, I / you, and here / there, where the specification of the referent is always taken from an egocentric position in a spatially defined context. In a grammatical construction, a clause that further specifies a nominal referent can be thought of as a deictic since it is a specification being "pointed to" by the speaker. An example is: The balloon that Pooh used to get honey burst. Since temporal distinctions are also based upon the placement of the speaker relative to the events spoken about, tense categories can also be considered deictic.

Diachrony:
- Concerns linguistic change occurring over time.

Gestalt structure:
- A gestalt structure results when a number of parts function together to form a whole, which can itself be a part of another group of parts which functions as a whole.
For instance, an image of the world can be thought of as an assemblage of parts, where there is no interpretive capability endowed by the image, or it can be thought of as a summation of individual experiential gestalts, each of which contributes an interpretive capability to the experiencer in the face of each new experience. Gestalts are the sum of raw experiences which have been interpreted and fed back into the perceptual end of the experiential chain. Such feedback produces a shift in the kinds of interpretations that are possible subsequently.

Grammatical (case) function:
- Considered in traditional grammatical theory to be independent of semantic role. Traditional grammatical functions include subject, direct object, indirect object, and oblique. Often both semantic and formal considerations have gone into grammatical function classification.
- Plank (Paradigms, 1991:179) speculates that the historical precursors to attested homonymies between the grammatical cases of Old English (nominative, accusative, genitive, dative, and instrumental) indicate that the more "peripheral" cases are derivations from the central cases, with nominative being the primary case. We can thus speculate about a derivational linkage between the quintessential object-like status of the nominative and the status of other parts of the paradigm, which are either acted upon by verbal notions (acc-dat) or are complex scene-setting apparatus (abl-inst-loc). Each step down the scale, from nominative to locative, seems to bring with it a complication in the conceptualization process required to make the case meaningful.

Metaphor:
- Where one conceptual field is used to describe another. For example, in the conduit metaphor language is thought of in terms of a physical entity that can be sent from one person to another in a container, as in It's hard to get the idea across to him (Lakoff & Johnson, 1980:11).
- In the example He ate it up, "up" is completion, and comes from the physical experience where things that are up are usually more full or complete than things that are down.

Metonymy:
- Traditionally, metonymies use the part for the whole, where we conceptualize one thing "by means of its relationship to something else" (Lakoff & Johnson, 1980:39); or, as in grammaticalization theory, there is a "structural analogy between conceptual fields" (Brinton, 1989:137). Because an aspectualizer such as up has both a spatial and an aspectual meaning at the same time in the same verb phrase, it can be said that one is an analogue of the other. For example, "the children ate up the candy" (1989:137): i. direction, ii. completion, iii. time past. The argument is that the temporal-aspectual force of up results from its being an analogue of the spatial sense.

Morpheme:
- The smallest unit in a language to which an independent meaning can be attributed. Case inflections are morphemes by this definition although they are not independent of the nominal to which they are attached.

Nominal:
- Any word or group of words that can be construed as a unitary entity. For example:
i. The boy in the red coat (noun phrase)
ii. Rupert (noun)
iii. him (pronoun)
v. going to town is fun (gerund, verbal noun)

Polysemy:
- Treated extensively by Sweetser (1990), polysemy refers to occasions where a single word has multiple meanings. Sweetser maintains that words used as grammatical markers are not only historically related to full lexical words, they are also semantically related.
The meaning of a grammatical morpheme is motivated by its previous meaning. Often the change of meaning is not a change at all; rather, it is a change of use, from concrete to abstract, for example: to London (Locative) vs. to John (Dative).

Preposition:
- Typically introduces a noun phrase and indicates the relation of the nominal element to some other constituent. For example:
i. Pooh is hanging from a balloon (adverb phrase - location)

Semantic Role (case):
- The meaning relations between a nominal and other constituents in a clause, ". . . classified from the point of view of the involvement of the entity denoted by that NP in the situation expressed by the clause, independently of its grammatical form" (Trask 1993:249). For example: agent, beneficiary, instrument, experiencer, goal, source, path.

Subject or verb complement:
- That part of the verb phrase that completes the meaning of the subject, as in Rupert is a bear, or further specifies some aspect of the verb, as in she said that she would come.

Synchrony:
- Concerns the description of a language at any stage in its history. Some historical linguists consider it to be an artificial classification and prefer to think of synchrony as a snapshot of diachrony. It is useful, nevertheless, for gaining insights into grammatical relations at any given time.

Wh-movement:
- In transformational grammar, the apparent movement of a lexical element from its "natural" position in a clause to a sentence-initial position is considered to be evidence of constituent movement. An example is: Tom likes who? becomes Who does Tom like? Government and Binding theory allows any constituent to move to any position, provided there is no rule to block the movement.
