UBC Theses and Dissertations


Transfer of acquired rule to conditional reasoning as a function of content similarity, attribute dimension… Vavrik, John 1995

Full Text

TRANSFER OF ACQUIRED RULE TO CONDITIONAL REASONING AS A FUNCTION OF CONTENT SIMILARITY, ATTRIBUTE DIMENSION SIZE, AND ACQUISITION MODE

by

JOHN VAVRIK

B.Sc., The University of British Columbia, 1979
M.A., The University of British Columbia, 1987

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in THE FACULTY OF GRADUATE STUDIES (Department of Educational Psychology and Special Education)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
July, 1995

© John Vavrik, 1995

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Educational Psychology and Special Education
The University of British Columbia
Vancouver, Canada

ABSTRACT

In this study, deductive reasoning with conditional statements of the form IF x THEN y was investigated in a transfer-of-training paradigm. Subjects induced conditional rules under different training conditions and then solved a series of deductive syllogisms based on conditional rules the content of which was either related (near transfer) or unrelated (far transfer) to that of the previously induced rules. The conditions under which the conditional rules were induced varied in terms of the acquisition mode (prediction/diagnosis) and in terms of the size of the attribute dimensions from which rule instances were drawn (binary/trinary).
The deductive reasoning task, as the criterion task for transfer, included all four types of conditional syllogisms (Modus Ponens, MP; Denying the Antecedent, DA; Affirming the Consequent, AC; and Modus Tollens, MT). The accuracy of deductive performance on near transfer was significantly higher than response accuracy on far transfer on the DA and AC but not on the MP and MT argument forms. Transfer of conditional rule knowledge induced in a predictive mode was significantly better than in a diagnostic mode on the AC form. Overall, near- as well as far-transfer performance was similar for the binary and trinary conditions. In addition, neither the binary nor the trinary condition resulted in improved far-transfer performance compared to controls. The results provide evidence against purely instance-based as well as purely rule-based models of deductive reasoning. Instead, they suggest that conditional knowledge is represented at an intermediate level of abstraction, and that conditional reasoning is adaptive to the form of the deductive argument. Different types of deductive arguments may trigger different reasoning strategies.

TABLE OF CONTENTS

Abstract
Table of Contents iii
List of Tables v
List of Figures vi
Acknowledgement vii
I. ANALYSIS OF RESEARCH PROBLEMS AND ISSUES 1
  A. Introduction 1
  B. Rationale for the Study 3
  C. Research on Inductive Rule Acquisition 6
  D. Research on Deductive Reasoning with a Conditional Rule 11
    1. Analysis of Research Issues Related to Rule Transfer as a Function of Content Similarity 13
    2. Analysis of Research Issues Related to the Effects of Attribute Dimension Size on Far-Transfer of an Acquired Rule 15
    3. Analysis of Research Issues Related to the Interaction Effects Between Attribute Dimension Size and Deductive Argument Types in Near-Transfer of an Acquired Rule 17
    4. Analysis of Research Issues Related to the Effects of Inductive Acquisition Mode in Near-Transfer of an Acquired Rule 21
  E. Summary of Hypotheses and Experimental Predictions 25
II. METHOD 28
  A. Subjects and Design 28
  B. Experimental Tasks and Materials 31
  C. Apparatus 35
  D. Procedure 35
III. RESULTS 38
  A. Analysis of Inductive Rule Training Performance 38
  B. Analysis of Deductive Reasoning Performance 46
  C. Additional Analysis of Transfer Data 57
IV. DISCUSSION 61
  A. Summative Overview 61
  B. Implications for Theories of Conditional Reasoning 62
  C. Proposed Account of Conditional Reasoning 70
  D. Implications for Education 74
  E. Limitations of the Study and Suggestions for Further Research 75
REFERENCES 80
APPENDICES 85
  Appendix A: An Illustration of Rule Transfer Based on the Proposed Hypotheses 86
  Appendix B: Instructions to Subjects 90

LIST OF TABLES

Table 1: Means and standard deviations of number of instances and total processing time for different inductive rule acquisition conditions 39
Table 2: Category frequencies for reported inductive learning strategies 42
Table 3: Response frequencies for each option on a rule assessment item 44
Table 4a: Target rule selection by reported strategy for binary group 45
Table 4b: Target rule selection by reported strategy for trinary group 45
Table 5: Scoring key for eight types of conditional syllogisms 47
Table 6: Mean scores and standard deviations on deductive reasoning performance for the binary group 48
Table 7: Mean scores and standard deviations on deductive reasoning performance for the binary group 49
Table 8: Mean scores and standard deviations on deductive reasoning performance for the binary group 50
Table 9: Mean scores and standard deviations for overall deductive reasoning performance 52
Table 10: Proportion of subjects following either a conditional or biconditional interpretation 58
Table 11: Transfer scores summed over all argument types by rule selection score 59

LIST OF FIGURES

Figure 1: Experimental design layout 30
Figure 2: Causes and effects under different rule truth-table classes 31
Figure 3: Types of content used in the inductive learning task 32
Figure 4: Task sequences used for each group 36
Figure 5: Number of instances for rule acquisition 39
Figure 6: Processing time for rule acquisition 40
Figure 7: Overall transfer performance 53
Figure 8: Far transfer by dimension size 54
Figure 9: Near transfer by dimension size 55
Figure 10: Near transfer by acquisition mode 57
Figure 11: Transfer performance by rule score 60

ACKNOWLEDGMENTS

This document is dedicated to myself, in keeping with the inescapable conclusion that the pursuit of a doctoral dissertation is essentially a selfish process of symbolic self-completion during which I alienated my friends, taxed the patience and goodwill of my supervisor, and neglected to hug my wife each time she endured yet another thesis-induced mood swing. My dedication is also a celebration of the fact that this is the only paragraph I got to write without someone editing it.

I. ANALYSIS OF RESEARCH PROBLEMS AND ISSUES

A. Introduction

The overall aim of this study is to gain a better understanding of how people reason with logical rules or propositions, particularly those that can be expressed in the form "IF X THEN Y". Such statements are called conditional statements, and reasoning with them is referred to as conditional reasoning. Below are some illustrative examples of conditional reasoning in different contexts:

A physics student has learned that if two particles have opposite electric charges then they will attract each other. At some later time she observes two particles that attract each other. What can she conclude about the electric properties of the particles?

A school psychologist remembers from his graduate training that damage to the frontal lobes results in poor performance on problems requiring metacognitive monitoring. He is presented with a student who does poorly on problems requiring metacognitive monitoring (such as the Towers of Hanoi problem).
What can he conclude about the neurological cause of the disability?

An intern on an emergency ward remembers from her medical training that a certain toxic agent Tx causes skin to turn blue. She observes that a patient suspected of poisoning has pink-colored skin. What can she conclude about the cause of the poisoning?

These examples point to the ubiquity of conditional reasoning in decision making, problem solving, and scientific reasoning (see also Braine & O'Brien, 1991; Lee, 1985; Piburn, 1990; Popper, 1959; Rips, 1990; Sternberg, 1986). Yet, despite their importance, most people reason poorly with conditionals, and the cognitive process underlying this fundamental form of reasoning is not well understood (Evans, 1982; Evans, Barston & Pollard, 1983; Marcus & Rips, 1979; Markovits & Nantel, 1989).

As the examples illustrate, conditional reasoning typically involves two distinct situations: an acquisition phase, where the "IF X THEN Y" rule is learned, and an application phase, where the rule is used in conjunction with other information to make deductive inferences leading to some general conclusion. Both situations occur naturally in academic, professional, and everyday reasoning contexts, and both have been studied intensely, but mostly independently, for over three decades (Lee, 1985). One reason for the limited number of studies investigating inductive rule acquisition together with deductive rule application may be the fact that the inductive learning phase typically involves considerable time commitment by subjects, and often requires detailed and time-consuming design and development of appropriate instructional materials (Pollard & Evans, 1983; Markovits, 1988).

Most of the work on rule acquisition has traditionally been carried out within the concept-learning research paradigm. Typically, these studies investigated learning of concepts which were defined by simple logical rules from a set of positive and negative instances of the concept.
In other words, these studies have been concerned with inductive reasoning (e.g. Bourne, 1970; Bruner, Goodnow, & Austin, 1956).

Research on the application, or utilization, of a conditional rule, on the other hand, has been the focus of another research tradition dealing with the other mode of human reasoning, namely deductive reasoning (e.g. Evans, 1982; Johnson-Laird, Byrne, & Schaeken, 1992). Within this research tradition, the interest has been in observing reasoning performance on deductive reasoning tasks such as the Wason Selection Task and various forms of syllogistic arguments.

A key assumption motivating this study is that we cannot fully understand and improve conditional reasoning unless we consider both the acquisition of the rule (inductive learning) and the application of the rule (deductive reasoning) together (Lee, 1985). It is well established now that many types of declarative and procedural knowledge are often not used (accessed) in situations where they clearly ought to be used, because of differences between acquisition and application conditions (e.g. Bereiter & Scardamalia, 1985; Morris, Bransford, & Franks, 1977; Vosniadou & Ortony, 1989). It is reasonable to assume that the knowledge structures underlying logical rules, including the conditional, are no exception. Deductive use of a conditional rule may, therefore, depend on how the rule was inductively acquired in the first place.

The purpose of this study, therefore, is to determine whether the conditions under which a conditional rule is learned (induced) influence transfer of the rule to conditional (deductive) reasoning with the rule.

The following section contains a discussion of current research problems and issues pertaining to the basic purpose of the study outlined above. The discussion has two aims: (a) to help explain the rationale for this study, and (b) to develop a conceptual framework from which research hypotheses can be articulated.
Since each of these aims requires a slightly different treatment of the issues, the rationale will be presented first, followed by a more detailed analysis of the research issues needed to develop the conceptual framework for specific hypotheses.

B. Rationale for the Study

One reason for studying transfer of an acquired rule is that it would advance current understanding of human reasoning on the basis of past learning. To see why this may be of interest, consider that despite a long history of research on deductive reasoning, there is no agreed-upon position on the cognitive mechanism underlying deductive reasoning in general, and conditional reasoning in particular. Currently, there are at least three different views. According to one prominent perspective, people make deductions by systematically applying logical rules of inference to argument premises and conclusions (Inhelder & Piaget, 1958; Newell & Simon, 1972). The opposing view holds that reasoning is carried out by retrieving specific instances from memory (Griggs & Cox, 1982; Pollard & Evans, 1981). According to the third view, some combination of logical rules and specific experience-based instances might be involved in propositional reasoning (Rips, 1990).

Currently, the hybrid view appears most promising, since consistent evidence for purely content-free rules or specific instances has not been forthcoming (Smith, Langston, & Nisbett, 1992). However, the current hybrid models of propositional reasoning have not been articulated in enough detail to allow the formulation of specific predictions that can be empirically tested, or to inform educators interested in designing learning conditions conducive to the acquisition of conditional knowledge (Lee & Lee, 1992).

If differential reasoning performance were found with rules induced from two different sets of instances, then a more detailed specification of the role of rules and instances in hybrid models would be possible.
An example of such a situation would be two sets of instances that both have a full conditional relational structure but differ in some other way, such as in the number of unique positive and negative instances of the rule in each set. A concrete example would be the number of distinct cases of symptoms in the presence or absence of various disease agents studied by medical students to learn diagnostic rules of inference.

Another reason for investigating how rule acquisition influences subsequent reasoning is that it may inform educational practice in terms of instructional strategies for enhancing reasoning performance. As was pointed out above, despite the central role of conditional reasoning in many cognitive tasks, ranging from inferences about causes and effects in scientific research to problem solving and decision making in applied fields such as medicine, law, and engineering, conditional reasoning performance is generally poor and inconsistent (e.g. Evans, 1982; Evans, Barston & Pollard, 1983; Markovits & Nantel, 1989; Wason, 1966). While the reasons for this vary depending on which of the three theoretical perspectives one holds, it is generally agreed that in order to reason effectively with a conditional rule, one must have an appropriate rule structure in memory (Braine & O'Brien, 1991; Lee, 1985).

Just how conditional rule knowledge is acquired is not clear. What is clear is that teaching the logical form of a conditional rule by direct instruction alone is not effective (Cheng, Holyoak, Nisbett, & Oliver, 1986). On the other hand, Lee (1985) was able to significantly improve conditional reasoning performance through training that contained, as a key component, inductive rule learning from a set of instances with a conditional structure.
However, Lee did not manipulate the parameters of the inductive rule learning task, and therefore it is unclear whether different inductive learning conditions are equally effective in building an appropriate conditional rule structure. If it could be shown that different rule learning conditions do result in differential reasoning performance, it would offer educators more options for designing learning conditions to enhance this fundamental reasoning process. The rationale for this study is therefore based on both theoretical as well as practical, instructional considerations.

Acquiring and Reasoning With Conditional Rules

An overview of research issues and problems: As was already alluded to in the introduction, virtually all previous research on reasoning with logical rules in general, and conditional (IF... THEN...) rules in particular, with the exception of Lee (1985), has been carried out independently, either from the perspective of rule acquisition (the inductive reasoning research tradition) or from that of rule application (the deductive reasoning research tradition). Before developing hypotheses about transfer between the two modes of reasoning in the case of the conditional, a review of relevant findings from each of the two research traditions is necessary.

Whenever any findings from the separate traditions imply something about transfer from rule acquisition to rule application, such implications will be pointed out. In addition, relevant assumptions underlying cognitive processing and knowledge representation in the learning and usage of logical rules will be identified. It will then be possible to propose specific hypotheses about the influence of different acquisition conditions on transfer of an acquired rule to conditional reasoning. This chapter will conclude with a summary of the hypotheses along with a set of testable predictions derived from them.

C. Research on Inductive Rule Acquisition

Rule acquisition as concept learning: Concept learning studies have traditionally investigated two learning processes involved in the acquisition of concepts defined by logical rules: attribute learning and rule learning (Bourne, 1970; Haygood & Bourne, 1965; Lee, 1984). Within this research paradigm, the assumption has been that most concrete as well as most abstract concepts can be defined in terms of criterial attributes (or features) and some rule-like relationship between them:

C = R(x, y, ...)

where C stands for a concept, R is a rule, and x, y, ... are the defining attributes (Bourne, 1970). The kinds of rules that have been investigated in this research tradition have been rules derived from the propositional calculus of formal logic. The conditional rule is one of four basic relationships (rules) that represent the full calculus. The others are the conjunction (and), the disjunction (or), and the biconditional (iff). Experimental tasks used in this concept learning paradigm include:

1. Attribute Identification: Under this paradigm, subjects are given a rule (e.g. a conjunction of two attributes), and their task is to learn the relevant attributes (e.g. red, circle) from a set of instances containing both relevant and irrelevant attributes.

2. Rule Learning: In rule learning, subjects are told what the relevant attributes are, and their task is to learn the relationship (rule) between them.

3. Rule Identification: Within this paradigm, subjects are informed what the relevant attributes are, and they have previously learned two or more rules. Their task is to identify which of the previously learned rules applies to the present set of instances.

4. Complete Learning: In this learning paradigm, subjects are simply introduced to various attributes (colors, shapes, types of events), are told that the concept involves a certain number of these, and their task is to learn both the relevant attributes and the appropriate rule.

The process of rule acquisition: How do people acquire concepts defined by logical rules? Bourne (1970) proposed that people utilize "an intuitive version of the logical truth table". A formal truth table for the conditional "IF X THEN Y" (usually written as "X --> Y") is shown below:

X    Y    X --> Y
T    T    +
T    F    -
F    T    +
F    F    +

According to Bourne, rule acquisition involves building a mental representation that reflects this kind of truth-table structure. This is accomplished by two component processes, referred to as attribute coding and rule mapping (Lee, 1984). During attribute coding, the relevant stimulus features (e.g. X and Y) are collapsed into the four truth-table classes XY, -XY, X-Y, and -X-Y. This is followed by mapping each of these classes onto the two response categories of the rule (+ or -), that is, positive or negative instances.

One source of evidence cited in support of this assumed mechanism is the observation that with increased experience on a concept learning task, subjects begin to categorize all instances of a particular truth-table class in the same manner (Bourne, 1970; 1974). Lee (1984) subsequently established the sufficiency of the two component processes by demonstrating that training subjects on each component can lead to improved rule acquisition. With respect to attribute coding, Lee argued that since this process comprises affirmation/negation and combinatory operations, attribute coding difficulty should in turn depend on the difficulty of these two operations.
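Bourne's two-step mechanism can be sketched computationally. The following is a minimal illustration, not the study's materials: attribute coding collapses an instance into one of the four truth-table classes, and rule mapping assigns each class to a positive (+) or negative (-) instance of the conditional.

```python
def attribute_code(has_x: bool, has_y: bool) -> str:
    """Attribute coding: collapse an instance's relevant features
    into one of the four truth-table classes XY, -XY, X-Y, -X-Y."""
    return ("X" if has_x else "-X") + ("Y" if has_y else "-Y")

def conditional_map(truth_class: str) -> str:
    """Rule mapping for 'IF X THEN Y': only the X-Y class
    (antecedent present, consequent absent) is a negative instance."""
    return "-" if truth_class == "X-Y" else "+"

# Reproduce the truth table above.
for x, y in [(True, True), (True, False), (False, True), (False, False)]:
    cls = attribute_code(x, y)
    print(cls, conditional_map(cls))
```

Swapping in a different `conditional_map` (e.g. one that also rejects the -XY class) yields the biconditional instead, which is one way to see how the same coded instances support different rules.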
But since the number of combinatory operations is the same for all truth-table classes, attribute coding difficulty should be a function of the negation operations that are involved or, in Lee's words, "the number of attributes requiring negations". What Lee is referring to here is the number of implicit references to the negation of the target attributes, which is essentially the notion of contrast sets to be discussed later in the section reviewing research on deductive reasoning.

With binary dimensions, the number of negations is 0, 1, 1, 1 for the TT, TF, FT, and FF classes respectively, while with trinary dimensions the numbers are 0, 2, 2, 4. Indeed, Lee (1984) found that error patterns for the four classes during rule acquisition were consistent with these theoretical difficulty indices.

The encoding issue is relevant to this study. It suggests that transfer of a rule may be influenced by the number of instances from which the rule is acquired. Because a small number of dimensions is easier to encode, transfer to deductive reasoning that is based on recall of specific instances should be facilitated by sets of instances with few dimensions. Situations where access to specific instances in deductive reasoning is possible are situations where the rule content is the same during acquisition and application. "Rule content" refers to the specific semantic propositions, the "X's" and the "Y's", referenced by the rule "IF X THEN Y". This situation is referred to as the "near-transfer" condition.

In contrast to instance-based models of reasoning, rule-based accounts would not predict differences in near transfer as a function of dimension size.
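Both difficulty-index sequences (0, 1, 1, 1 for binary and 0, 2, 2, 4 for trinary dimensions) can be reproduced by one plausible counting scheme, offered here as an interpretation rather than Lee's published derivation: each negated attribute on a k-valued dimension implicitly references its k - 1 contrast values, and these references multiply across negated attributes.

```python
def negation_index(truth_class: str, k: int) -> int:
    """Implicit negation references for a truth-table class such as 'TF',
    where each attribute lies on a k-valued dimension.

    An 'F' attribute must be negated, implicitly referencing the k - 1
    contrast values on its dimension; references multiply across
    negated attributes. A class with no negations scores 0.
    """
    negated = truth_class.count("F")
    return 0 if negated == 0 else (k - 1) ** negated

print([negation_index(c, 2) for c in ("TT", "TF", "FT", "FF")])  # [0, 1, 1, 1]
print([negation_index(c, 3) for c in ("TT", "TF", "FT", "FF")])  # [0, 2, 2, 4]
```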
However, according to rule-based accounts, deductive reasoning with content that is different from the acquisition phase (referred to as the "far-transfer" condition) should be better with rules induced from sets of instances with widely varying attribute dimensions, because the increased variability of the training set is likely to induce a more generalizable, content-independent knowledge structure. Studies of far transfer have shown that increasing the number of different training examples from which new knowledge (such as a problem schema, for example) is induced makes that knowledge more applicable (i.e. generalizable) to new situations (e.g. Gick & Holyoak, 1987; Holyoak & Koh, 1989). The implications regarding near and far transfer are developed more systematically later; they were introduced here only because they arise naturally from the discussion of rule acquisition and concept learning.

Pragmatic constraints in rule acquisition: As Lee (1985) noted, we cannot assume that a conditional rule structure, induced in a traditional concept learning task, is automatically associated with a conditional proposition "IF X THEN Y". This is because the set of instances (e.g. cause-effect pairs such as c1-e1, c2-e1, c2-e2, ...) corresponding to a target rule (e.g. "IF cause = c1, THEN effect = e1") warrants many valid propositions, such as "when c1 occurs then e1 follows", or "when c2 occurs then e1 follows", etc. Clearly, the reasoner must rely on some other principle or method to arrive at the target rule expressed in the appropriate causal language.

One way to accomplish this is to employ a prediction task using explicit linguistic expressions of the form "IF cause = c1 THEN effect = ?" and guiding the learner, through prompts and appropriate feedback, toward the correct formulation of the conditional rule (Lee, 1985).
However, this may not be the only, or the best, method available, particularly in naturalistic settings where such explicit and structured guidance is unlikely to occur. What other principle might be invoked?

Recent work on inductive rule-based mechanisms points to pragmatic considerations as likely candidates (Holland, Holyoak, Nisbett & Thagard, 1986). Holland et al. note that in natural contexts such as scientific or medical research, pragmatic considerations, typically the goals of the researchers, can constrain inductive inferences. In this context, this means that some pragmatic considerations may constrain the number of possible propositions that can be made about a set of observed relations.

Specifically, one pragmatic and fundamental goal of research is to derive parsimonious and general statements about a set of observations. In fact, such statements are often elevated to the status of scientific laws. If the pragmatic principle of generality and parsimony is applied to a set of observations with a conditional structure, it constrains the possible valid propositions to the one that optimally summarizes the observed set of relations. That proposition is precisely the material conditional: "Whenever cause c1 occurs, effect e1 follows", or equivalently, "IF cause = c1, THEN effect = e1".

In other words, pragmatic constraints such as goals enable the construction of appropriate lexical entries to the underlying rules. Furthermore, if the traditional concept learning task incorporates a pragmatic context, such as a predictive or diagnostic context, which utilizes the IF... THEN... language naturally, not only the target rule but also its associated causal language should be readily acquired.

To conclude, then, the evidence presented suggests that conditional knowledge may be induced from appropriately selected sets of instances using traditional concept learning tasks. Furthermore, pragmatic constraints may be utilized to associate a conditional (IF... THEN...)
statement with the induced knowledge representation.

D. Research on Deductive Reasoning With a Conditional Rule

The specific objectives of the present study are to assess transfer effects of rule acquisition variables on deductive reasoning performance. Since the transfer criterion task is a deductive reasoning task, it is useful to first describe the two most commonly used tasks in deductive reasoning research. An overview of these tasks may also facilitate the interpretation of the research findings reviewed in this section.

The Wason Selection Task: Most conditional reasoning studies have utilized some variant of the now classic Wason Selection Task (Wason, 1966). In a typical version of this task, a subject is shown a pack of cards and is told that each card has a letter on one side and a number on the other. The subject is then told that the letters and numbers are related by a rule which states: "If there is a vowel on one side, then there is an even number on the other side". The subject is then presented with four cards laid down in front of her (e.g. A, K, 2, 7). The subject's task is to select those cards, and only those cards, that it would be necessary to turn over in order to determine whether the stated rule does indeed hold. The correct selection is card A and card 7. In other words, the task involves the verification of a falsifying claim with a set of data (instances). The problem with this task is that it assesses only one of several forms of deductive inference. A more complete test of conditional reasoning is the syllogism task (e.g. Aristotle; Marcus & Rips, 1979;
This is the task used in the presentstudy.The Conditional Syllogism Task: This task requires subjects to either derive orvalidate conclusions to arguments which have a conditional rule as their first premise.There are actually four distinct forms of deductive arguments with conditionalpropositions:Modus Ponens (MP) First premise: ifC Then ESecondpremise: CConclusion: EModus Tollens (MT) First premise: If C Then ESecondpremise: not EConclusion: not CDenying the Antecedent (DA) First premise: if C Then ESecondpremise: not CConclusion: IndeterminateAffirming the Conseiuent (AC) First premise: if C Then ESecondpremise: EConclusion: IndeterminateVirtually all data on conditional reasoning have been obtained using some form ofthese two tasks. Each of the following four sections presents critical analyses of theresearch findings and identifies research issues that are investigated in this study. Thesections conclude with statements of specific research hypotheses.121. Analysis of Research Issues Related to Rule Transfer as a Function of ContentSimilarity.Effect of “familiar” rule content (semantics) on conditional reasoning:-Both theWason task and the syllogism task have been tried with abstract rules such as “IF ATHEN 2”, as well as with rules that refer to familiar world knowledge such as “IF a metalbar is heated THEN it expands”. Reasoning performance with familiar rule content isalways different from abstract content, and often it is better, particularly on the Wasontask (Pollard & Evans, 1987; Evans, 1982; Griggs & Cox, 1982; Markovits, 1986; Wason& Shapiro, 1971). It should be noted here that rule content means the objects or eventsreferenced by the conditional rule.In the syllogism tasks, familiar content gives rise to what has been called the“BeliefBias” effect (Evans, Barston, & Pollard, 1983). 
It is manifested by subjects tending to confirm conclusions which they believe to be true, and to disconfirm conclusions believed to be false, regardless of the argument structure from which the conclusion was deduced.

Clement & Falmagne (1986) have shown that reasoning with conditional rules improves when the "semantic relatedness" between the antecedent and the consequent is high, or when the "imagery value" of the conditional statement is high. This suggests that reasoning may be facilitated by access to particular instances or events in memory, instances that constitute relevant information for the inferencing process.

The above research findings suggest that access to specific instances of antecedent-consequent relations stored in memory may influence conditional reasoning performance. More specifically, better access to rule instances should lead to improved reasoning performance with the rule.

Content familiarity and rule transfer: Consider the following two transfer situations. In the first situation, a conditional rule used in a deductive task references the same entities or events that the reasoner encountered when initially learning the rule. A typical example is the case of medical training, where students learn the relationship between specific disease agents and disease symptoms. The agents and symptoms are typically the same ones the students will encounter later in their practice. So although the reasoning process changes from induction (learning the relationships) to deduction (inferring a diagnosis based on presented symptoms and applying the learned rule), the actual entities involved in the reasoning (the specific symptoms and agents) are the same. This is what is meant by the term near transfer.

In the second type of transfer, the entities and events encountered during rule acquisition are different from the entities or events contained in the premises of the deductive arguments. This situation represents a far-transfer condition.
Here the availability of and access to specific instances will not be helpful, and deductive reasoning performance may suffer.

On the basis of the effects of content familiarity reviewed above, it is therefore hypothesized that the content similarity inherent in near-transfer conditions facilitates access to the conditional rule structure, and its application to deductive inference, more effectively than the conditions prevailing in far-transfer.

2. Analysis of Research Issues Related to the Effects of Attribute Dimension Size on Far-Transfer of an Acquired Rule.

The basic question posed here is whether rules induced from a wider variety of instances are more generalizable, and thus applicable to a wider range of deductive reasoning situations, than rules induced from smaller instance sets. It is a ubiquitous observation that when limited human memory capacity is overloaded, one has to bring order (pattern) to an array of complex stimuli (contents). The same holds in the reasoning task situation: the reasoner must rely on more general reasoning rule structures to reach correct conclusions. These are discussed next.

The Role of Content-free Inference Schemas in Deductive Reasoning:- In addition to the evidence of content effects on deductive reasoning reviewed in the preceding section, there is also evidence that some forms of deductive reasoning are not affected to any significant degree by rule content (Braine & O'Brien, 1991; Braine, Reiser, & Rumain, 1984). For example, when subjects are confronted with the following argument:

    Major premise: Either P or Q
    Minor premise: Not P
    Conclusion?

most correctly conclude that "Q" must be the case, regardless of what P and Q stand for. In other words, in this case, reasoning performance does not depend on the specific propositional content of the argument premises.

The example illustrates what Braine et al. call an "inference schema".
An inference schema refers to an abstract form of the rule structure concerned, and determines what conclusion can be inferred from the general form of a given argument or rule, rather than from its particular content. According to Braine et al., inference schemas play an important role in human deductive reasoning because they can be used to derive new inferences from a given set of premises.

Examples such as the above have been cited in defense of the view that formal logical rules underlie much of human reasoning. According to this view, people can acquire generalizable rule structures, such as inference schemas, which they apply in their deductive inferencing. Because these rule structures are sensitive to the form rather than the particular content of rules, they can be applied across a wide range of reasoning situations (Rips, 1990).

Rule transfer and schema induction:- The question of where generalized inference schemas originate has received little attention. In fact, as Lee (1985) pointed out, the existence of such logic structures is usually just assumed. It was noted earlier that, using an inductive rule learning paradigm, Lee (1985) was able to induce a generalizable, content-free conditional logic structure in a significant proportion of his subjects. However, in his study, Lee did not vary the number of instances (examples) from which the logic structure was induced, although trinary dimensions were used.

Researchers interested in transfer of cognitive skill (e.g. Singley and Anderson, 1989) have repeatedly found that the generalizability of a rule or a cluster of rules (a schema) increases with exposure to a wider variety of examples (e.g. Catrambone & Holyoak, 1989; Gick & Holyoak, 1987). Their research suggests that the degree to which a conditional rule is independent of particular content (i.e. is more generalizable) may be a function of the number of rule instances provided in the acquisition phase.
More instances should lead to the induction of more content-free rules.

The examination of related literature clearly indicates the merit of investigating the issue surrounding the usefulness of content-free rules. More specifically, it is hypothesized that under conditions where the rule content does not match the acquisition content, and cueing of specific relevant instances is unlikely, generalizable, content-free rules, if available, need to be activated to meet the task demand of deductive inference. In this context, such rules are hypothesized to be more readily acquired from exposure to large rather than small sets of exemplars.

3. Analysis of Research Issues Related to the Interaction Effects Between Attribute Dimension Size and Deductive Argument Types in Near-Transfer of an Acquired Rule.

Awareness of alternatives in conditional reasoning:- Although a conditional rule such as "IF cause THEN effect" explicitly refers to only one cause and one effect, it implicitly references other, alternative causes and effects (e.g. "IF we eat candy THEN we get cavities", but drinking Coke will also do the job). These alternatives are in fact essential components of the conditional logic structure and they play a key role in conditional reasoning. Consider the conditional rule "IF cause = c1, THEN effect = e1". A number of researchers have shown that awareness of or familiarity with alternative causes (c2, c3) of a given effect (e1) improves reasoning with indeterminate forms of the conditional syllogism, namely the DA and AC forms (Cummins, Lubart, Alksnis & Rist, 1991; Markovits, 1984). The central idea here is the availability of alternatives along a given attribute dimension (e.g. the number of alternative causes of a particular disease symptom). This notion is captured in the concept of a contrast set, which is discussed below.

Several studies investigated reasoning with arguments and rules containing negated propositions (e.g.
"IF NOT A THEN 2") and found that performance is more error-prone and time-consuming when negations are present (Evans, 1982; Oaksford & Stenning, 1992). Lee et al. (1992) ranked the four basic syllogisms according to a hypothesized difficulty level where one of the difficulty parameters was the number of negations that needed processing. For example, the Modus Tollens (MT) type was one level above the Modus Ponens (MP) type because it involved two more negation operations. Overall, the results supported the contention that higher levels were more difficult to process.

Why is reasoning with negated propositions more error-prone? According to one influential account, the reason is that our natural language-understanding mechanism contains a "NOT-heuristic" which focuses attention only on the named proposition that is negated, and not on all the propositions implied by the negation. Since the implied information is often critical for correct deductions, more errors occur (Evans, 1982, 1984). Another hypothesis is that negations increase processing load because they involve an extra mental operation, similar to a NOT operator in a computer implementation of Boolean logic (e.g. Lee et al., 1992).

However, neither of these accounts can explain why some negated propositions are more error-prone than others, as was discovered in a recent study that investigated the effect of negated constituents more systematically than had previously been done (Oaksford et al., 1992).
These authors claim that the negated propositions which are easier to process are ones that facilitate mental construction of contrast sets.

Contrast Sets and Conditional Reasoning:- According to Oaksford et al., premises that contain negated arguments cannot be directly represented as specific instances in memory because it is not clear what instances the negated argument actually represents. To see why, consider the following deductive task:

    Major premise: If X is on one side of a card, then Y is on the other side.
    Minor premise: X is not on one side.
    Conclusion:

Clearly, the minor premise can represent an infinite variety of potential instances (any letter, any number, in fact anything at all). As a result, a reasoning program will be difficult to implement.

Now, according to this account, before a premise containing a negated argument can be processed, the reasoner must first determine the contrast set implied by the proposition. A contrast set is the set of all objects or events implicitly referenced by a negated proposition. The difficulty of some negated propositions, like the abstract one above, is that they reference large, ambiguous contrast sets, whereas other propositions like "NOT AN EVEN NUMBER" imply a smaller, better defined contrast set, namely the set of "ODD NUMBERS". According to Oaksford et al., ambiguous contrast sets increase cognitive load and consequently reasoning performance suffers. The problem with the Oaksford et al. study, however, is that the relationship between negated propositions and their implied contrast sets was only assumed. A more direct empirical test of this hypothesis is needed.

Contrast set size and rule transfer:- How might contrast set size influence transfer of an acquired rule to conditional reasoning? The findings suggest that the size of the contrast set might interact with argument form to influence reasoning performance.
Large contrast sets increase cognitive load because more elements need to be processed. However, large contrast sets also contain more alternative antecedents and consequents associated with a rule. As was discussed earlier, awareness of these alternatives improves reasoning on indeterminate argument types, namely the Denying the Antecedent (DA) and the Affirming the Consequent (AC) types. It seems reasonable to propose, therefore, that reasoning with indeterminate argument forms should be more accurate with rules inductively acquired from large contrast sets. On the other hand, reasoning with determinate argument forms that also contain a negation, namely the MT type, should be more difficult with rules acquired from large contrast sets. This is because in this case awareness of alternatives is irrelevant, and so the only impact of the large set size is to increase cognitive load, which will hinder the reasoning process.

To illustrate this line of reasoning more clearly, consider a conditional rule such as "IF cause = c1 THEN effect = e1" learned from exposure to either one of two sets of instances: one having binary attribute dimensions (e.g. two causes, c1 & c2, and two effects, e1 & e2), and another having trinary dimensions (e.g. three causes, c1, c2 & c3, and three effects, e1, e2 & e3).

Consider the DA argument type first. In this type of syllogism, the minor premise "NOT c1" references the contrast set {c2} in the case of binary dimensions, and {c2, c3} in the case of trinary attribute dimensions. Since each cause is presumed to be associated with an effect, e1, e2 and e1, e2, e3 become activated in memory in the case of the binary and trinary dimensions, respectively. Since a greater number of different effects is activated with trinary dimensions, a correspondingly larger number of conclusions is possible. As a result, an indeterminate response, which is the correct response for the DA type, becomes more probable with the larger dimension size.

Next consider the AC type.
Here, affirming the consequent proposition "e1" activates {c1, c2} and {c1, c2, c3} for the binary and trinary dimensions, respectively. Since more alternative causes become activated in the trinary case, the correct indeterminate conclusion is more likely to be reached in this case, as compared to the smaller dimension case.

Finally, consider the MT type. Here the negated consequent "NOT e1" references {e2} and {e2, e3} in the binary and trinary dimensions, respectively. The associated causes that become activated are {c2} for the binary, and {c2, c3} for the trinary case. Both of these activated sets represent contrast sets for the proposition "NOT c1", which happens to be the correct conclusion for this argument type. However, since the contrast set for the binary condition contains only one element, {c2}, it should be processed more easily than the larger set {c2, c3}. Therefore, the smaller dimension size should lead to better performance on the MT argument type.

The foregoing analysis leads to the hypothesis that the increased availability of alternative antecedents and consequents associated with larger sets of induced rule exemplars should be particularly helpful in reasoning on the two indeterminate argument types, DA and AC. On the other hand, a determinate type that requires processing of a negated proposition (MT) should be easier with rules acquired from smaller sets, because cognitive load is reduced and because the availability of alternatives is not a factor with determinate forms.

4. Analysis of Research Issues Related to the Effects of Inductive Acquisition Mode in Near-Transfer of an Acquired Rule.

Another question concerning inductive rule transfer is whether the rule was induced in a predictive or a diagnostic context. A brief review and analysis of relevant research findings, and an articulation of relevant theoretical assumptions, is in order.

The central idea in this regard is the ability of a reasoner to access instances of a rule during deductive reasoning.
The argument put forward here is that if instance-based accounts of reasoning are correct, then conditional reasoning should be facilitated by access to specific instances of an acquired rule. In turn, memory access should be facilitated by transfer-appropriate processing. The two processing modes explored here are the predictive and the diagnostic modes. What follows is an attempt to apply the principle of "Transfer Appropriate Processing" to the transfer of a conditional rule, where "processing" refers to the directionality of reasoning involved in predictive and diagnostic modes of reasoning.

The problem is to identify specific parameters of the inductive rule learning task that might affect the ease with which rule instances can be accessed during the deductive reasoning phase. First, it is necessary to state some assumptions about representational and processing aspects of memory implied by instance-based accounts of reasoning.

Assumed representation of instances in memory:- What might a memory representation of a rule instance look like? Anderson & Bower (1973) and Hinton and Anderson (1981) proposed a simple model of verbal associative memory where objects or events, which may include conditional rule knowledge of antecedents and consequents, are connected by bidirectional associational links along which activation can spread from antecedents to consequents or vice versa. An important feature of this representation is that the strength of the activation is not assumed to be symmetrical. In other words, the strength of association between X and Y is not necessarily the same as the strength of association between Y and X.
Below is one example of such a knowledge representation acquired from exposure to a set of cause-effect relations with a conditional structure:

    c1 >>> e1
    c2 >>> e1
    c2 >>> e2

It should be noted that all factors found to influence conditional reasoning performance, including clause relatedness, imagery, and generation of alternatives, are factors that motivated the development of associative models of memory. Put another way, these content effects are consistent with memory models based on spreading activation and strength of association between objects or events stored in memory (Ross & Bower, 1981).

Instance-based accounts suggest that conditional rule knowledge may be acquired from exposure to specific instances (e.g. causes and their associated effects) and that the instances are associated via spread of activation along asymmetrical links from one to the other (e.g. links from causes to effects, or from effects to causes). Given this type of associational memory structure, the next question concerns the retrieval of specific instances of antecedents and consequents.

Retrieval of instances from memory and transfer-appropriate processing:- According to Tulving's Principle of Encoding Specificity, "remembering of events always depends on the interaction between encoding and retrieval conditions" (Tulving, 1984, p. 242). A similar position is expressed in the principle of Transfer Appropriate Processing put forward by Morris, Bransford, & Franks (1977). They stress that "the value of particular types of acquisition activities must be defined relative to the types of activities to be performed at the time of the test" (p. 531). The principles of Transfer Appropriate Processing and Encoding Specificity do not restrict the nature or the types of processes involved during acquisition and retrieval. The principles only require that the processes be "similar". What is meant by the term "similar", at least within the present study, will be addressed shortly.
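The asymmetric associative representation described above can be made concrete in a small sketch. This is only an illustrative toy, not the software used in the study; the class name, link-strength values, and the assumption that the trained direction receives stronger links than the incidental reverse direction are all inventions for the example.

```python
# Toy asymmetric associative memory for conditional-rule instances.
# Link strengths are directed: strength(c1 -> e1) need not equal
# strength(e1 -> c1), mirroring the asymmetry assumption above.

class AssociativeMemory:
    def __init__(self):
        self.links = {}  # (source, target) -> accumulated strength

    def learn(self, source, target, strength=1.0):
        # Strengthen the directed link exercised during training.
        key = (source, target)
        self.links[key] = self.links.get(key, 0.0) + strength

    def activate(self, cue):
        # Spread activation from the cue along outgoing links only.
        return {t: s for (src, t), s in self.links.items() if src == cue}

# Predictive training (cause -> effect) builds strong forward links and,
# by assumption, only weak incidental backward links.
mem = AssociativeMemory()
for cause, effect in [("c1", "e1"), ("c2", "e1"), ("c2", "e2")]:
    mem.learn(cause, effect, strength=1.0)   # trained direction
    mem.learn(effect, cause, strength=0.2)   # incidental reverse link

# Transfer-appropriate processing: a predictive probe (cueing a cause)
# retrieves effects more strongly than a diagnostic probe retrieves causes.
forward = mem.activate("c2")   # strong: both effects of c2
backward = mem.activate("e1")  # weak: both causes of e1
```

On this sketch, a deductive item whose minor premise cues a cause (MP, DA) profits from predictive training, while one cueing an effect (MT, AC) profits from diagnostic training, which is the directionality argument developed in the text.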
What implications do these principles have for a study of rule transfer? According to an instance-based account, conditional reasoning should be facilitated by access to rule-related instances in memory, where memory access in turn depends on the similarity between encoding conditions and retrieval conditions. If the type of processing during rule acquisition is in some respect similar to the type of processing required during rule application, then access to rule instances should be facilitated and reasoning performance should improve. This is a general statement of the last hypothesis. To develop it further it is necessary to explain what is meant by "similar type of processing".

Predictive and diagnostic inductive reasoning in rule transfer:- There are two principal modes of reasoning associated with conditional rules of the form IF X THEN Y: predictive and diagnostic. Predictions typically require reasoning from causes to effects. Diagnoses, on the other hand, involve reasoning in the opposite direction, from effects to causes. In a naturalistic setting, a conditional (or any other) relationship between causes and effects may be learned in either of these modes, by being exposed to causes linked to effects, or to effects linked to causes (Bjorkman & Nilsson, 1982; Waldman & Holyoak, 1992).

Predictive and diagnostic modes of reasoning are important not only in learning logical rules but also in using the rules in deductive reasoning tasks (Evans & Beck, 1981). To see why, consider the conditional syllogisms presented earlier. Note that the only difference between the four types of syllogisms lies in the minor premise and in the conclusion (the major premise is the same for all of them). In the MP and DA forms, the minor premise refers to an antecedent (a cause), while the conclusion refers to a consequent (an effect). In other words, here a cause is given and an inference has to be made regarding an effect.
This is precisely the direction of reasoning required when making a prediction - from cause to effect. In contrast, the MT and AC argument forms contain a minor premise that references a consequent (an effect), and a conclusion that references an antecedent (a cause). Here the effect is given, and an inference has to be made regarding a cause. The direction of reasoning in these two cases thus corresponds to making a diagnosis - from an effect to a cause.

On the basis of the foregoing analyses, the last hypothesis can now be stated more specifically. It is hypothesized that when the reasoning mode (predictive / diagnostic) under which a conditional rule is acquired matches the reasoning mode required in the deductive task, memory access to specific rule instances - particularly alternative causes and effects - is facilitated, resulting in improved reasoning performance as compared to conditions where the modes of reasoning do not match.

E. Summary of Hypotheses and Experimental Predictions

First, it is hypothesized that familiarity (in terms of lexical attributes) enhances the application of the acquired rule. In other words, familiarity with rule attributes facilitates transfer of rule knowledge. Consequently, it is predicted that near-transfer deductive performance (i.e. where the rule content in the deductive task matches the content encountered during inductive rule acquisition) should be better than deductive performance on far-transfer (i.e. where the rule content in the deductive task is different from the content encountered during rule acquisition). Specifically, it is predicted that there will be fewer reasoning errors on the deductive transfer task with rules whose content matches the content used in the inductive rule acquisition task, as compared to rules with content that does not match the inductive task content.

Second, some situations can arise where the rule content does not match the acquisition content, and cueing of specific relevant instances is unlikely.
When that happens, it is hypothesized that accurate deductive inferencing relies on generalized, content-free rules. Such rules are more readily acquired from exposure to large rather than small sets of exemplars. It is predicted, therefore, that rules acquired from larger sets of training examples should transfer more readily to inferencing situations with different propositional content than rules acquired from small exemplar sets. If this is the case, there will be fewer reasoning errors with content that is different from the acquisition content for rules induced from instances with trinary, as compared to binary, attribute dimensions.

Third, it is hypothesized that the increased availability of alternative antecedents and consequents associated with larger sets of rule exemplars during inductive training can be particularly helpful in reasoning on the two indeterminate argument types, DA and AC. In contrast, a determinate type that requires processing of a negated proposition (MT) should be easier with rules acquired from smaller sets, because cognitive load is reduced and because the availability of alternatives is not a factor with determinate types.

More specifically, as compared to rules acquired from sets of instances with binary attribute dimensions, there will be fewer reasoning errors with rules acquired from sets of instances with trinary attribute dimensions on the DA and AC forms. Also, as compared to rules acquired from sets of instances with trinary attribute dimensions, there will be fewer reasoning errors with rules acquired from sets of instances with binary attribute dimensions on the MT argument form.

Finally, when the mode of reasoning (predictive vs. diagnostic) under which a conditional rule is acquired matches the reasoning mode involved in the deductive task (MP, DA vs.
AC, MT), it is hypothesized that memory access to specific rule instances, particularly alternative causes and effects, is facilitated, resulting in improved reasoning performance as compared to conditions where the modes of reasoning do not match. More specifically, compared to rules acquired in a diagnostic context (AC, MT), there will be fewer deductive reasoning errors with rules acquired in a predictive context on the MP and DA argument types.

II. METHOD

A. Subjects and Design

The subjects were 120 University of British Columbia undergraduates in the Faculty of Education who were registered in the Primary Teacher Training Program. Elementary student teachers were selected because their background knowledge is more homogeneous than that of secondary student teachers, who specialize in different subject matter contents. The aim was to minimize any potential influence of prior knowledge. The sample comprised 42 males and 78 females, reflecting the overrepresentation of female students in the primary program. All subjects were recruited by the experimenter from 15 sections of a 400-level course on Evaluation in Teaching. Participation was on a volunteer basis only. Subjects were paid $20 for their participation.

Experimental conditions:- Subjects were randomly assigned to one of three groups: binary, trinary, and a control group. Within each group, subjects were assigned to one of 24 trial sequences as required by counterbalancing the experimental tasks. (Counterbalancing is described in more detail below.) A complete replication of the counterbalanced task materials and task orders required 24 subjects in each group. In order to improve the reliability of the results, and to increase the sensitivity of statistical analyses, one more full replication of each training treatment group was completed by assigning 48 (2 x 24) subjects to each of the two treatment groups. However, the limited logistics of the research program (i.e.
funds, subject availability, and time) did not permit another replication of 24 for the control group.

The inductive training conditions for rule acquisition were defined in terms of two factors: (a) the size of the attribute dimension (i.e., binary vs. trinary) and (b) the mode of instance presentation (i.e., predictive vs. diagnostic). The former varied as a between-subjects factor and the latter as a within-subjects factor. Since the transfer relation between inductive rule learning and subsequent application of this knowledge to deductive reasoning is the primary focus of the present study, the conditions under which transfer can occur need to be specified.

Transfer type is the first within-subjects design factor. The hypotheses require the specification of two types of transfer: (a) near-transfer, where the content encountered during inductive rule learning is the same as the content in the deductive reasoning task, and (b) far-transfer, where the deductive syllogisms contain content that is unrelated to any of the content encountered during inductive rule learning. The hypotheses also require the specification of two modes of inductive rule learning, predictive and diagnostic. As a result, the near-transfer condition is further subdivided into familiar content encountered during prediction-based rule learning, and familiar content encountered during diagnosis-based rule learning. These factorial variations of familiarity and mode of induction did not cross over to the far-transfer condition. In addition, it was necessary to define another within-subjects factor: type of deductive argument form (MP, DA, AC, and MT).

Transfer type and syllogism type were defined as within-subjects factors because a within-subjects design has the advantage of reducing error variance due to individual differences. Attribute dimension size was treated as a between-subjects factor, however, because of possible asymmetric carry-over effects from the trinary to the binary rule acquisition conditions.
As was noted earlier, a trinary set has more distinct instances and as such can be expected to give rise to a more generalized causal schema, which is hypothesized to influence subsequent learning and transfer (Gick & Holyoak, 1987). The complete experimental design is depicted in Figure 1 below.

                                     Deductive Transfer
                       Near                              Far
                       Prediction-     Diagnosis-        Unfamiliar      Unfamiliar
  Ss       Grp  Rule   related         related           content         content
           induction   content         content           Trial 1         Trial 2
           mode        MP DA AC MT     MP DA AC MT       MP DA AC MT     MP DA AC MT
  1-48     BIN  Pred/Diag ... Diag/Pred
  49-96    TRI  Pred/Diag ... Diag/Pred
  97-120   CTRL

Figure 1. Experimental Design Layout

As can be seen in Figure 1, the design required three groups of subjects, two experimental and one control. Including a control group provided an opportunity to assess the overall effect of the inductive acquisition phase. The performance of the treatment groups may be similar but better than if no rule learning took place. On the other hand, the treatment groups may perform differently, but the level of performance for both groups may actually be lower than in a no-treatment condition (i.e. negative transfer). Using a control group enabled a more precise assessment of these potential outcomes.

The design layout also shows two trials under the far-transfer condition. This decision was justified on the grounds that having an extra replication would increase the reliability of the data and, more importantly, the extra trial would make it possible to assess potentially different practice effects on far-transfer for the two treatment groups.

B. Experimental Tasks and Materials

Rule acquisition task:- This task was a modification of the traditional concept learning paradigm where subjects learn a target rule by observing positive and negative instances of the rule.
The task was to learn a rule that reflected the relationship between dimensional attributes of interest, which in this study were specified as causes and effects. The rationale for this is based on the fact that reasoning about causes and effects is a ubiquitous cognitive activity which typically requires both inductive and deductive thought, and often involves conditional relations (e.g. Sternberg, 1986; Carlson & Schneider, 1989; Cheng & Novick, 1992). In the binary-dimension set, two different causes and two different effects appeared. In the trinary set, three different causes and effects were presented. The relations between causes and effects under each rule truth-table class are summarized in Figure 2. A listing of examples of causes and effects used in this study appears in Figure 3 below.

  Set Size   Rule Truth-table Class
             ++        +-               -+               --
  Binary     c1-e1     c1-e2            c2-e1            c2-e2
  Trinary    c1-e1     c1-e2, c1-e3     c2-e1, c3-e1     c2-e2, c2-e3,
                                                         c3-e2, c3-e3

Figure 2. Causes and effects under different rule truth-table classes

Content A. Drugs and side effects
Target Rule: IF the drug Mixolin is administered THEN purple skin rash develops
Causes and effects used for rule acquisition:
  c1=Mixolin    e1=Purple rash
  c2=Phoresin   e2=Blue rash
  c3=Benuvin    e3=Green rash

Content B. Radiation sources and plant mutations
Target rule: IF a plant is exposed to Gamma radiation THEN it grows square leaves
Causes and effects used for rule acquisition:
  c1=Gamma radiation   e1=Square leaves
  c2=Beta radiation    e2=Triangular leaves
  c3=Alpha radiation   e3=Circular leaves

Content C. Chemical agents and fish
Target rule: IF agent Tx is released in the water THEN fish develop brittle scales
Causes and effects used for rule acquisition:
  c1=Agent Tx   e1=Brittle fish scales
  c2=Agent Ty   e2=Enlarged fish scales
  c3=Agent Tz   e3=Loose fish scales

Content D. Lasers in eye surgery and retinal damage
Target rule: IF an Argon laser is used THEN checkered spots appear on the retina
Causes and effects used for rule acquisition:
  c1=Argon laser     e1=Checkered spots on a retina
  c2=Ruby laser      e2=Striped spots on a retina
  c3=Krypton laser   e3=Dotted spots on a retina

Figure 3. Types of content used in the inductive learning task

Each training set contained 36 instances comprising nine randomly ordered instances of each truth-table class. In the first phase of the rule acquisition task, subjects were presented with a series of predictions or diagnoses, one by one. They were asked to look at each case (i.e. a rule instance) and decide whether the stated prediction or diagnosis was true or false. Each case consisted of a cause followed by an effect (for the prediction condition), or an effect followed by a cause (for the diagnostic condition). Subjects received corrective feedback after each response, and then a new case was presented. Each case was selected by the computer at random. The process continued until a mastery criterion was reached. This criterion was set at two replications of nine instances, yielding a total of 18 consecutive correct responses.
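The composition of a binary-dimension training set can be sketched in a few lines. This is not the original Turbo Pascal program: the function names are invented, and the labelling of cases as true or false is inferred from the standard truth table for the conditional (only a true antecedent paired with a false consequent falsifies "IF c1 THEN e1"), so it should be read as an illustration of the set's structure rather than the study's exact scoring rule.

```python
import random

# Sketch of a binary-dimension training set: nine instances of each of the
# four truth-table classes (++, +-, -+, --), 36 cases in all, presented in
# random order during acquisition.

RULE = ("c1", "e1")  # target rule: IF c1 THEN e1

CLASSES = {
    "++": ("c1", "e1"),
    "+-": ("c1", "e2"),
    "-+": ("c2", "e1"),
    "--": ("c2", "e2"),
}

def case_is_true(cause, effect, rule=RULE):
    # Under material implication, only the +- class (true antecedent,
    # false consequent) falsifies the conditional.
    return not (cause == rule[0] and effect != rule[1])

def build_training_set(per_class=9, seed=0):
    cases = [pair for pair in CLASSES.values() for _ in range(per_class)]
    random.Random(seed).shuffle(cases)
    return cases

training = build_training_set()  # 36 cases; 27 true, 9 false (+- class)
```

Note that on this labelling the -+ cases (c2 followed by e1) count as true: an alternative cause producing the named effect is consistent with the conditional, which is exactly the point the contrast-set analysis above turns on.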
The rationale for the simultaneous availability of previously presented instances was two-fold: (a) it eliminated unnecessary working memory load, which is desirable since the focus of the study was on cognitive processes independent of working memory load; and (b) it maximized the likelihood of task completion by reducing the time needed to learn the conditional rule.

In the prediction-context condition, each trial involved a brief description of a causal agent being investigated by a research team. The subject was told that prior to conducting their experiment, the researchers predicted that the particular causal agent would result in some specific effect. The subject was then asked to indicate whether the stated prediction was correct or incorrect.

In the diagnostic-context condition, each trial started with a presentation of some effect (or symptom). This was followed by a description of a diagnosis made by a team of investigators. The subject was then asked to indicate whether the stated diagnosis proved to be correct or incorrect.

Rule verbalization and selection task:- In the rule verbalization task, presented in a questionnaire, subjects were asked to write down the strategy they used to decide whether a given prediction or diagnosis was correct. The multiple choice task was then introduced. It was intended to assess subjects' ability to identify the rule statement, expressed in IF..THEN.. language, that best summarized the conditional relationships they encountered in the inductive phase. The task simply required subjects to select the one of four options on a multiple choice item which, in their opinion, represented the most appropriate expression of the observed relationship between causes and effects. The options are listed below.

1. The most parsimonious and general (target rule) statement, which had the general form: IF c1 THEN e1
(e.g. If Mixolin is administered, then purple rash develops.)

2. A correct, but less parsimonious rule statement in the form: IF c2 THEN e1 or e2
(e.g.
If Phoresin is administered, then either purple or blue rash develops.)

3. A rule statement based on an incorrect, reversed inference: IF e1 THEN c1
(e.g., If purple rash develops, then Mixolin was administered.)

4. An incorrect rule statement reflecting a biconditional interpretation: (IF c1 THEN e1) AND (IF e1 THEN c1)
(e.g., If Mixolin is administered then purple rash develops, and if purple rash develops then Mixolin was administered.)

The task was also designed to help subjects learn to associate the appropriate IF..THEN.. language with the knowledge structure they induced from the corresponding set of instances. This was accomplished by cueing the pragmatic constraints of generality and parsimony discussed earlier. Accordingly, subjects were asked to choose the simplest, most concise, yet general proposition that held true for the set of cases (instances) as a whole.

Conditional syllogism reasoning as transfer task:-In the deductive task, the four syllogistic types were presented one by one in random order. Each argument form was presented with both an affirmative and a negated conclusion, yielding a total of 8 distinct syllogism items.

In order to provide a naturally occurring reasoning context for the subjects while not altering the traditional format of the task, subjects were told that each syllogism was a summary of an argument made by a particular student in a high school science class. Subjects were told that each student in the science class had been exposed to the same set of cases (instances) as they themselves had just encountered. They were asked to put themselves in the role of a teacher who is marking the validity (correctness) of each conclusion the students arrived at. A validation response to a given syllogism therefore became equivalent to a decision on whether a particular "student's conclusion" held true always, sometimes, or never. No feedback was given, and each response as well as response latency were automatically recorded.
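Anticipating the normative scoring key reported later (Table 5 in the Results), the eight syllogism items and their correct always/sometimes/never validation responses can be tabulated as follows. This is an illustrative Python sketch; the tasks themselves were programmed in Turbo Pascal.

```python
# The 8 items, given the rule P -> Q: (argument form, second premise, conclusion).
# Correct responses follow the normative conditional interpretation (Table 5):
# A = always true, S = sometimes true, N = never true.
KEY = {
    ("MP", "P",  "Q"):  "A",   # Modus Ponens, affirmative conclusion
    ("MP", "P",  "-Q"): "N",   # Modus Ponens, negated conclusion
    ("DA", "-P", "Q"):  "S",   # Denying the Antecedent
    ("DA", "-P", "-Q"): "S",
    ("AC", "Q",  "P"):  "S",   # Affirming the Consequent
    ("AC", "Q",  "-P"): "S",
    ("MT", "-Q", "P"):  "N",   # Modus Tollens
    ("MT", "-Q", "-P"): "A",
}

def accuracy(responses):
    """Score a subject's validation responses: 1 point per item that
    matches the normative key, 0 otherwise."""
    return sum(int(responses.get(item) == answer) for item, answer in KEY.items())
```

A fully normative reasoner scores 8. A consistent biconditional reasoner (who reads P -> Q as P if-and-only-if Q) agrees with the key only on the MP and MT items and so scores 4, which is why the DA and AC forms are the diagnostic ones.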
C. Apparatus

All the tasks were programmed in Turbo Pascal and were presented on an IBM 386 microcomputer. Response time (with millisecond accuracy) was recorded for every response required in each task. Time data were collected using Assembly Language timing subroutines interfaced with the Pascal code.

D. Procedure

Trial sequence counterbalancing:-Four different content areas were necessary for the conditional rules used in the deductive transfer tasks: two for near-transfer (prediction-related and diagnosis-related contents), and two for far-transfer (unfamiliar content trials 1 and 2). Four types of content taken four at a time yielded 4! = 24 distinct transfer trial sequences counterbalanced for content effects.

The inductive task required that two of the four available content types be presented in two different acquisition contexts. This yielded 4!/(4-2)! = 12 induction sequences counterbalanced for content. Counterbalancing to control for sequence effects on the induction tasks then resulted in a total of (2)(12) = 24 unique induction trial sequences. The complete set of 24 induction sequences followed by 24 deductive transfer sequences is shown in Figure 4. This arrangement also ensured that each type of content was equally represented in each type of transfer.

Figure 4. Task sequences used for each group

The experiment was carried out in a laboratory setting. Two subjects were run simultaneously, each seated at one of two computer work stations located in the lab. Subjects were given an individualized 16-page booklet which explained the purpose of the study, provided an overview of the tasks, and gave step-by-step instructions on how to proceed with each task. The booklet also contained space for subjects to record any computer feedback, if they wished (see Appendix B).
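The counterbalancing arithmetic above can be checked mechanically; a quick sketch with Python's itertools (the content labels A-D follow Figure 4, and the mode-order factor is the prediction-first vs. diagnosis-first ordering):

```python
from itertools import permutations

contents = ["A", "B", "C", "D"]

# 4! = 24 distinct transfer trial sequences: four contents taken four at a time
transfer_seqs = list(permutations(contents, 4))

# 4!/(4-2)! = 12 ordered content pairs for the two induction tasks;
# crossing each pair with the two acquisition-mode orders gives (2)(12) = 24
content_pairs = list(permutations(contents, 2))
induction_seqs = [(order, pair)
                  for order in [("prediction", "diagnosis"),
                                ("diagnosis", "prediction")]
                  for pair in content_pairs]
```

Running this confirms 24 transfer sequences, 12 content pairs, and 24 induction sequences, matching the counts reported in the Procedure.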
Exp.   Rule Induction Trials   Deductive Rule Transfer Trials
 1     pA dB                   A B C D
 2     pA dC                   B A C D
 3     pA dD                   C A B D
 4     pB dA                   D A B C
 5     pB dC                   A B B C
 6     pB dD                   B A B C
 7     pC dA                   C A B B
 8     pC dB                   D A C B
 9     pC dD                   A C B B
10     pD dA                   B C A B
11     pD dB                   C B D A
12     pD dC                   B B A C
13     dA pB                   A C B B
14     dA pC                   B C B A
15     dA pD                   C B A B
16     dB pA                   D B C A
17     dB pC                   A D B C
18     dB pD                   B D A C
19     dC pA                   C B A B
20     dC pB                   D C A S
21     dC pD                   A B C B
22     dD pA                   B B C A
23     dD pB                   C D B A
24     dD pC                   B C B A

p = prediction mode
d = diagnostic mode
A = Content A: Drugs and side effects
B = Content B: Radiation and plant mutations
C = Content C: Chemical agents and fish scales
D = Content D: Lasers and retinal damage

In order to ensure that subjects read all instructions carefully, the computer keyboard was disabled between tasks, and subjects were required to enter a code to reactivate it. A different code was embedded at the end of each block of instructions corresponding to a particular task.

Following the completion of each inductive task, subjects were asked (in the booklet) to write down their strategy for deciding whether a given prediction or diagnosis was correct or not. They then completed the multiple choice, rule-selection task. This completed the inductive phase.

Subjects then proceeded to the sequence of deductive tasks. A brief introduction to each transfer task was given. For the near-transfer tasks, it was pointed out to the subjects that the content they were about to deal with was related to the content of one of their inductive tasks. Far-transfer tasks were introduced by stating that although the content they were about to see was unfamiliar to them, their responses were nevertheless of great interest to the experimenters. Each subject completed four sets (2 near-transfer and 2 far-transfer) of eight deductive problems. At the end of the computer session, subjects filled out a brief questionnaire containing basic demographic items, including gender, age, and educational background.

III. RESULTS

The data analysis plan is as follows.
First, the training data are analyzed to determine whether meaningful learning did in fact occur. This is followed by a detailed analysis of the response data on the deductive transfer tasks, and tests of the relevant hypotheses.

A. Analysis of Inductive Rule Training Performance

An analysis of the inductive rule learning phase of the experiment was carried out first to establish the validity of the design by verifying: (a) that rule acquisition had taken place, and (b) that the training outcomes, including the total number of instances and the total processing time required to reach criterion performance, had values consistent with the different training conditions.

Analysis of Number of Instances and Processing Time Required to Reach Criterion Performance:-Outliers in the observed distributions of the total number of instances and processing time were identified using Studentized residuals obtained from fitting a linear model to the four training variables, which included the total time and total instances for the prediction and diagnostic conditions. A residual value of 2.5 was used as the cutoff for identifying outliers (Belsley, Kuh & Welsch, 1980). Four outlier values (two in each group) were identified by this method and were replaced by the corresponding group means based on the remaining cases. The resulting means and standard deviations of the total processing time and total number of instances required to reach criterion performance in each training condition appear in Table 1 (see also Figures 5 and 6 for a graphical representation). It should be noted that in subsequent analyses of these data, the degrees of freedom were reduced by the number of observations for which the mean values were substituted.

                          Acquisition Mode
Training Variable      Prediction   Diagnosis
Binary Group (n=48)
No. of instances
M                        26.23        27.27
SD                        6.07         8.87
Total Time
M                       456.02       593.33
SD                      255.39       352.50

Table 1.
Means and standard deviations of number of instances and total processing time for different inductive rule acquisition conditions

Table 1 (continued): Trinary Group (n=48)
                       Prediction   Diagnosis
No. of instances
M                        30.33        31.78
SD                        7.71        10.77
Total Time
M                       577.42       615.53
SD                      231.83       234.58

Figure 5. Number of instances for rule acquisition
(Bar chart: mean number of instances required for rule acquisition under the prediction and diagnosis modes, for the binary and trinary groups)

Figure 6. Processing time for rule acquisition
(Bar chart: mean processing time required for rule acquisition under the prediction and diagnosis modes, for the binary and trinary groups)

Both the instance and response time variables were transformed prior to analysis to stabilize any departures from a normal distribution. The instance variable was transformed with the square root transformation and the response time variable with the log transform (Box & Cox, 1964). The correlation between the total number of instances and the total processing time was moderate, with Pearson r = .32 (p<.001) in the prediction condition, and .49 (p<.000) in the diagnostic condition.

The response time and instance data were analyzed together using multivariate repeated measures analysis of variance (MANOVA). The analysis of the response measures revealed that the total number of instances and total processing time required to reach the mastery criterion of 18 consecutive correct responses by the trinary group were significantly greater and longer than those required by the binary group, H-L Trace = .25, (s=1, m=-2, n=2), p<.000; F's(1,94) = 14.16, (MSE = 1.05), and 6.32, (MSE = .60), p's<.000 and .014, respectively, as expected. The corresponding effect sizes were .38 and .26 for the instance and time measures, respectively.

The diagnostic mode of rule induction did not require significantly more instances, but it did require significantly more processing time, as compared to the prediction mode, F's(1,94) = .74 and 5.91, p's<.391 and .017, for the instance and time measures, respectively.
The corresponding effect sizes were .18 and .50 for the instance and time measures, respectively.

These results are consistent with the expectation that rule induction from a set of diverse instances should require more processing time than induction from a set of instances that are less variable. On average, each new instance from a trinary set provides more unique information about the underlying relationship than an instance from a binary set. Processing the trinary information should therefore require more time, but not necessarily more instances, which is what the results have shown.

Subjects' Introspection Reports of Their Inductive Learning Strategies:-In order to learn more about subjects' inductive learning strategies, at the end of each inductive task subjects were asked how they were able to tell whether a particular prediction or diagnosis was correct or not. All responses were coded, yielding the following response categories:

Category 1: No Articulated Strategy (Trial & Error)
This category contains responses which show no evidence of any systematic strategy. A strategy may have been used, but the subject was not able to verbalize it. A typical response was: "I don't know, it was basically trial and error."

Category 2: Memory Processing Strategies
This category includes responses characterized as memory-based enumeration strategies, since they contained lists of various cause and effect relationships that subjects encountered. The most typical response was: "I recalled that Mixolin causes only purple rash, but Phoresin causes purple rash, and it can also cause blue rash."

Category 3: Coding Strategies
Responses in this category revealed a focus on particular attributes within the set of instances. The attributes subjects focused on included either relevant or irrelevant ones. The key characteristic of this category of responses is the reported attention to specific attributes.
Typical responses were: "I basically focused on which diagnoses were incorrect"; "I paid attention to the cities where the experiments were done."

Category 4: Rule-Related Strategies
Responses in this category reflected a systematic search for a rule. These subjects typically drew a two-way or a three-way table where they systematically recorded the observed relationships between causes and effects. They also produced a concise, general statement which summarized the underlying relationship. A typical response was: "The basic pattern was that Phoresin causes different rashes but Mixolin causes only purple rash."

Frequencies of reported strategy categories appear in Table 2.

                         Acquisition Mode
Self-Report Strategy   Prediction   Diagnosis
Binary (n=48)
1. Trial & error            6            7
2. Memory                  28           27
3. Coding                  10            9
4. Rule-related             4            5
Trinary (n=48)
1. Trial & error            3            3
2. Memory                  32           35
3. Coding                   4            2
4. Rule-related             9            8

Table 2. Category frequencies for reported inductive learning strategies

An inspection of the data in Table 2 clearly reveals that the most frequently reported strategy, under prediction as well as under diagnosis, was a memory-based strategy. Explicit rule-based strategies were infrequently reported, particularly by the binary group. Trial and error strategies were also infrequent, particularly in the trinary group. One thing that is clearly evident in the verbalized reports is that the subjects did not engage in demanding cognitive operations, as shown in the frequency data (i.e., categories 1 and 2 vs. categories 3 and 4), although the trinary group showed somewhat more effort in searching for a rule or pattern.

Assessment of the Propositional Form of an Acquired Rule:-Although all subjects reached the criterion performance level on the inductive task, their ability to identify the appropriate rule when expressed in the standard conditional language of the form IF x THEN y was also assessed, using a four-option, multiple choice question.
The distribution of responses for each option is given in Table 3.

                 Acquisition Mode
Option*      Prediction   Diagnosis
Binary (n=48)
1                 1            1
2                 2            3
3                30           29
4                15           15
Trinary (n=48)
1                 0            3
2                 4            8
3                 6            8
4                38           29

Item: Based on what you have observed in the set of examples just presented, which of the following statements represents a CONCISE, GENERAL rule that BEST SUMS UP the relationship between the type of (cause) and the type of (effect)?

*Options:
1. If (effect1) then (cause1)
2. If (cause1) then (effect1) and if (effect1) then (cause1)
3. If (cause2) then either (effect1) or (effect2)
4. If (cause1) then (effect1)

Table 3. Response frequencies for each option on a rule assessment item

As can be seen in Table 3, the vast majority of subjects in both groups selected one of the two propositional statements which correctly described the conditional relationships used for training (options 3 and 4). A test of the proportion of subjects in each group who selected one of these two options revealed that subjects in the trinary group selected the correct statement for the target rule (option 4) significantly more often than those in the binary group, under the prediction mode as well as under the diagnosis mode, Chi-Square(1, N = 96) = 22.28, p<.000, and Chi-Square(1, N = 96) = 8.22, p<.001, for the prediction and diagnosis modes, respectively.

The rule selection results suggest that rule induction from a trinary set, because of its greater variability of instances, increases the need for a more integrated, parsimonious representation of the observed set of relations. The target rule statement exemplifies such a representation, and the fact that the trinary group selected it more frequently provides additional evidence for the validity of the inductive training task.

To see whether self-reported strategies were congruent with selection of the target rule statement, the number of subjects who selected the target rule statement was calculated for each self-report strategy category.
The results for each group (binary and trinary) are shown in Tables 4a and 4b.

                            Rule Option Selected
Strategy category        Target          Other
                       (option 4)   (options 1, 2, or 3)
Prediction (N=48)
1. Trial & error            1              5
2. Memory                   8             20
3. Coding                   3              7
4. Rule-related             3              1
Diagnosis (N=48)
1. Trial & error            1              6
2. Memory                   9             18
3. Coding                   4              5
4. Rule-related             1              4

Table 4a. Target rule selection by reported strategy for the binary group

                            Rule Option Selected
Strategy category        Target          Other
                       (option 4)   (options 1, 2, or 3)
Prediction (N=48)
1. Trial & error            2              1
2. Memory                  24              8
3. Coding                   4              0
4. Rule-related             8              1
Diagnosis (N=48)
1. Trial & error            2              1
2. Memory                  20             15
3. Coding                   1              1
4. Rule-related             6              2

Table 4b. Target rule selection by reported strategy for the trinary group

When pooled together across the two groups, the data show only a weak relationship between target rule selection and reported strategy in the prediction mode, Chi-Square(3, N = 96) = 6.53, p<.091, and an absence of any significant relationship between these two variables in the diagnostic mode, Chi-Square(3, N = 96) = 1.37, p>.713. In the prediction mode, of the thirteen subjects who reported a rule-related strategy, eleven in fact selected the target rule statement, but in the diagnostic mode only seven out of thirteen with a rule strategy selected the target rule.

These findings suggest, although not conclusively, that subjects who relied on a rule-based strategy in the prediction mode may have induced a representation of a rule that more consistently matched the target rule statement, as compared to the representation induced under the diagnostic mode. This is a reasonable explanation given that the target rule is expressed in the form of a predictive rather than a diagnostic proposition.

B. Analysis of Deductive Reasoning Performance

This section describes the analysis of transfer data collected from the deductive reasoning performance. The main measure used in the analysis of deductive reasoning performance was a response accuracy score on the deductive syllogisms.
Each syllogism was scored according to the normative conditional interpretation of each argument type. The scoring key is shown in Table 5.

Syllogism type         First premise   Second premise   Conclusion   Correct response
Modus Ponens              P -> Q            P               Q             A
Modus Ponens              P -> Q            P              -Q             N
Denying Antecedent        P -> Q           -P               Q             S
Denying Antecedent        P -> Q           -P              -Q             S
Affirming Consequent      P -> Q            Q               P             S
Affirming Consequent      P -> Q            Q              -P             S
Modus Tollens             P -> Q           -Q               P             N
Modus Tollens             P -> Q           -Q              -P             A

A = Always true
S = Sometimes true
N = Never true

Table 5. Scoring key for 8 types of conditional syllogisms

Applying the scoring protocol to each response yielded an accuracy score of 0 or 1 for each subject on each syllogism. Since each syllogism type was presented with both an affirmative and a negated conclusion, two scores for each syllogism type were obtained. These scores were added together, resulting in one score for each type of syllogism that ranged from 0 to 2 for each subject. The means and standard deviations of these scores are listed in Tables 6, 7, and 8 for the binary, trinary, and control groups, respectively.

                      Far Transfer          Near-transfer
Argument type        Trial 1   Trial 2   Prediction   Diagnosis
Modus Ponens
M                     1.77      1.89       1.81         1.90
SD                    0.55      0.31       0.49         0.43
Denying Antecedent
M                     1.04      1.19       1.50         1.48
SD                    0.87      0.89       0.77         0.82
Affirming Consequent
M                     0.94      1.02       1.38         1.08
SD                    0.83      0.93       0.82         0.85
Modus Tollens
M                     1.68      1.52       1.69         1.50
SD                    0.62      0.74       0.59         0.68

Table 6. Mean scores and standard deviations on deductive reasoning performance for the binary group (N=48)

                      Far Transfer          Near-transfer
Argument type        Trial 1   Trial 2   Prediction   Diagnosis
Modus Ponens
M                     1.70      1.83       1.83         1.71
SD                    0.62      0.43       0.48         0.61
Denying Antecedent
M                     1.08      1.23       1.29         1.27
SD                    0.87      0.86       0.85         0.81
Affirming Consequent
M                     0.81      0.92       1.27         1.10
SD                    0.91      0.92       0.89         0.97
Modus Tollens
M                     1.66      1.56       1.54         1.52
SD                    0.61      0.58       0.62         0.65

Table 7.
Mean scores and standard deviations on deductive reasoning performance for the trinary group (N=48)

                      Far Transfer          Near-transfer
Argument type        Trial 1   Trial 2   Prediction   Diagnosis
Modus Ponens
M                     1.63      1.58       1.58         1.50
SD                    0.71      0.65       0.66         0.72
Denying Antecedent
M                     1.37      1.38       1.21         1.25
SD                    0.88      0.77       0.88         0.79
Affirming Consequent
M                     1.16      1.00       1.04         1.17
SD                    0.92      0.93       0.86         0.87
Modus Tollens
M                     1.33      1.54       1.46         1.33
SD                    0.76      0.83       0.83         0.75

Table 8. Mean scores and standard deviations on deductive reasoning performance for the control group (N=24)

It will be recalled that the experimental design specified one between-subjects factor, namely Treatment Group (Binary, Trinary, and Control), and two within-subjects repeated measures factors. The repeated measures factors were (a) Transfer Type, which included two trials of far-transfer and two trials of near-transfer (prediction-related / diagnosis-related), and (b) Syllogism Type (MP, DA, AC, MT). Performance scores obtained under this experimental design were analyzed with a multivariate analysis of variance (MANOVA) by 1-df linear contrasts performed on the mean vectors.

It should be noted that although the basic task response unit is binary (correct/incorrect), reasoning scores are based on multiple item responses and multiple subjects. Consequently, the deductive reasoning data can be appropriately analyzed by standard multivariate methods (Collett, 1991).

The analysis was carried out in two steps. First, following the General Linear Model (GLM) approach, a model was fitted to the data, model parameters were estimated, and tests of main effects and interactions were performed. An overall Type I error rate of .10 was used for the omnibus tests. Second, individual 1-df contrasts corresponding to the predictions derived from the hypotheses were used to test the specific hypotheses.
To control for the pyramiding of contrast probabilities, a conventional alpha level of .05 was used on tests of the individual contrasts (Huberty & Smith, 1982).

As a first step, the overall transfer performance was analyzed. Performance on the four argument types was combined, yielding a total deductive performance score ranging from 0 to 8 for each subject in each transfer condition. The data are shown in Table 9.

                Far Transfer          Near-transfer
               Trial 1   Trial 2   Prediction   Diagnosis
Binary (N=48)
M               5.44      5.63       6.38         5.96
SD              1.89      2.02       1.66         1.65
Trinary (N=48)
M               5.23      5.54       5.94         5.60
SD              1.71      1.66       1.56         1.84
Control (N=24)
M               5.50      5.50       5.29         5.25
SD              2.32      2.02       2.21         2.29

Table 9. Mean scores and standard deviations for overall deductive reasoning performance

The analysis revealed a significant interaction between training conditions and transfer types, F(6,351) = 1.84, (MSE = .32), p<.091. In addition, there was a significant main effect for transfer type, F(3,351) = 3.37, p<.019; however, the main effect of inductive training conditions was not significant, F(2,117) = 0.77, p>.47. The results of the omnibus tests warranted more detailed testing of the individual hypotheses using 1-df multivariate contrasts. Inspection of the data suggests that the effect of inductive training is present on the near-transfer reasoning problems but not on the far-transfer problems. The specific nature of this transfer can be further examined by detailed analyses.

Tests of Specific Hypotheses:-The first hypothesis addressed rule transfer performance as a function of content similarity. The verifiable prediction proposed on the basis of this hypothesis was that the overall near-transfer performance (i.e., performance across all argument types) would be better than the overall far-transfer performance for both the binary and trinary groups. No such difference would be predicted for the control group, since it did not have the benefit of the inductive rule learning.
Near and far-transfer performance data are summarized in Figure 7.

Figure 7. Overall transfer performance
(Bar chart: overall deductive transfer scores, near vs. far transfer, for the Control, Binary, and Trinary groups)

The analysis was further carried out using a contrast of overall performance on near- and far-transfer deductive tasks. The analysis therefore also served as an additional verification of the overall validity of the transfer of training design. If any meaningful inductive learning occurred, a difference between near and far-transfer performance should be observed for the treatment groups (but not for the control group), since the near-transfer tasks used rule content that was related to the content in the inductive learning phase.

Indeed, overall performance on near-transfer was significantly better than on far-transfer, F(1,117) = 4.57, (MSE = 6.58), p<.035, and further, the interaction between transfer and the three treatment groups was also significant, F(2,117) = 3.65, p<.029. Furthermore, this performance difference was due to the two treatment groups outperforming the control group, F(1,117) = 6.38, p<.013. However, the two treatment groups did not differ from each other significantly in their overall near versus far-transfer performance, F(1,117) = 0.91, p>.342.

Since the transfer types were varied as a within-subjects factor with nested definitions of reasoning modes and trials, the following hypotheses were tested with each nested factor (cf. Figure 1). Thus, the second hypothesis addressed far-transfer performance as a function of the training content of varying attribute dimension size (binary vs. trinary).
From this hypothesis, which was elaborated earlier, it was predicted that there would be fewer reasoning errors with rule content different from the acquisition content (the far-transfer condition) for rules induced from instances with trinary, as compared to binary, attribute dimensions.

Results pertaining to the prediction based on the second hypothesis appear in Figure 8. The prediction was tested with a contrast of performance scores of the three groups on the four syllogisms with unfamiliar content. No difference in performance was found between the binary and trinary groups, F(2,117) = .17, (MSE = 12.15), p>.683. In addition, the performance of these two treatment groups was not significantly different from control group performance, F(1,117) = .01, p>.917.

Figure 8. Far transfer by dimension size
(Bar chart: far-transfer scores on the MP, MT, DA, and AC syllogisms for the Binary, Trinary, and Control groups)

These results therefore falsify the second hypothesis. Inductive training with a trinary set does not facilitate far-transfer to a greater degree than training with a binary set of dimensioned materials. Furthermore, the results show that the rule knowledge the two treatment groups acquired during the inductive phase did not transfer to completely new, unfamiliar rule content.

The third hypothesis addressed near-transfer of an acquired rule as a function of the size of the contrast set from which the rule is induced. Two predictions arising from this hypothesis were tested: (a) as compared to rules acquired from sets of instances with binary attribute dimensions, there will be fewer reasoning errors with rules acquired from sets of instances with trinary attribute dimensions on the DA and AC forms; and (b) as compared to rules acquired from sets of instances with trinary attribute dimensions, there will be fewer reasoning errors with rules acquired from sets of instances with binary attribute dimensions on the MT argument form.
The data pertaining to predictions based on this hypothesis are summarized in Figure 9.

Figure 9. Near transfer by dimension size
(Bar chart: near-transfer scores on the MP, MT, DA, and AC syllogisms for the Binary and Trinary groups)

The predictions were tested with a contrast of near-transfer performance scores of the binary vs. the trinary groups on the four syllogisms. No significant difference in deductive performance was found between the binary and trinary groups on the four syllogisms, F(1,117) = 1.36, (MSE = 11.08), p>.246.

The result clearly falsified the third hypothesis. Deductive reasoning performance with a conditional rule was not influenced by the attribute dimension size of the set of instances from which the rule was initially induced. This result, therefore, also casts doubt on the influence of the size of a contrast set encountered during rule induction. According to the hypothesis, larger contrast sets with more alternative dimension values should facilitate reasoning performance with indeterminate argument forms, but according to the present results, this effect of the trinary inductive training set was not realized. It might be noted that this particular outcome may have been due to the binary nature of the traditional deductive syllogism task. If so, then the trinary group may have been less able to utilize their conditional knowledge as compared to the binary group. This point is discussed in more detail later.

The fourth, and final, hypothesis addressed near-transfer of an acquired rule as a function of acquisition mode.
Two verifiable predictions were proposed on the basis of this hypothesis: (a) as compared to rules acquired in a diagnostic context, there would be fewer deductive reasoning errors with rules acquired in a predictive context on the MP and DA argument forms; and (b) as compared to rules acquired in a prediction context, there would be fewer deductive reasoning errors with rules acquired in a diagnostic context on the AC and MT argument forms. The data pertaining to the predictions based on the fourth hypothesis are summarized in Figure 10.

Figure 10. Near transfer by acquisition mode

The predictions were tested with a contrast of performance on the four syllogisms with prediction-related content versus the four syllogisms with diagnosis-related content. Overall deductive performance by the two treatment groups was better on prediction-related content as compared to diagnosis-related content, but this difference only approached statistical significance, F(1,117) = 3.78, (MSE = 1.99), p<.055. The locus of this difference was in performance on the AC argument type, where prediction-related content significantly facilitated reasoning performance, F(1,117) = 4.07, p<.046. Performance on the remaining argument types was not significantly different.

C. Additional Analysis of Transfer Data

After testing the hypotheses, a further analysis of transfer data was carried out to investigate additional transfer effects that might inform the interpretation of the above results. A response pattern analysis was carried out for each subject to determine whether the subject's overall response pattern to the eight syllogism types followed a predominantly conditional or biconditional rule interpretation. The percentages of subjects following each interpretation in near and far-transfer conditions are presented in Table 10.

                   Binary (n=48)     Trinary (n=48)    Control (n=24)
Transfer type      Near     Far      Near     Far
Response pattern
Conditional        .68      .45      .55      .46          .51
Biconditional      .20      .35      .30      .44          .28

Table 10.
Proportion of subjects following either a conditional or a biconditional rule interpretation

The conditional response structure (pattern) was the most common response structure produced by control subjects (51%), as well as by subjects in the treatment groups (68% and 55% for the binary and trinary groups, respectively, on the near-transfer problems, and 45% and 46% for the binary and trinary groups, respectively, on the far-transfer problems). The biconditional response structure represented the majority of non-normative (i.e., non-conditional) responses (28% for the control and binary groups, and 37% for the trinary group, averaged across all transfer problems).

Analysis of transfer performance as a function of rule selection scores:-To gain additional insight into the relationship between inductive and deductive reasoning, overall deductive performance on the four transfer tasks was analyzed as a function of rule selection scores. The original rule selection score was collapsed into a new variable, labeled RC, which was defined as follows: if the correct propositional representation of the target rule was selected in both the prediction and the diagnostic condition, a score of 3 was assigned; if it was not selected in either condition, a score of 1 was assigned; otherwise, a score of 2 was assigned. A high RC score therefore reflects consistent identification of the appropriate propositional form of the target rule. Because deductive performance did not differ between the two training treatment groups (binary and trinary), the groups were combined for the purpose of this particular analysis.
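The RC collapsing rule just defined amounts to counting the modes in which the target rule was chosen; a one-line Python restatement, for illustration:

```python
def rc_score(target_in_prediction, target_in_diagnosis):
    """RC = 3 if the target rule was selected under both acquisition modes,
    1 if under neither, and 2 otherwise."""
    return 1 + int(target_in_prediction) + int(target_in_diagnosis)
```

The three cases of the verbal definition fall out directly: both selections give 1 + 1 + 1 = 3, neither gives 1, and exactly one gives 2.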
Transfer scores summed over the four argument types for each rule selection score are presented in Table 11 (see also Figure 11).

                              Rule Selection Score* (N=96)
Transfer Type                1 (N=34)   2 (N=27)   3 (N=35)
Far-Transfer (Trial 1)
M                             5.09       5.26       5.63
SD                            1.78       1.93       1.72
Far-Transfer (Trial 2)
M                             5.35       5.74       5.69
SD                            2.03       1.63       1.83
Near-Transfer (Prediction)
M                             6.27       6.07       6.11
SD                            1.71       1.44       1.67
Near-Transfer (Diagnosis)
M                             5.62       5.59       6.09
SD                            1.67       1.80       1.79

* If the target rule was selected in both training modes, then score = 3; if the target rule was selected in one of the two modes, then score = 2; else score = 1.

Table 11. Transfer scores summed over all argument types by rule selection score

Analysis of the data in Table 11 revealed that overall transfer performance did not vary as a function of the rule selection score, F(2,93) = 0.38, (MSE = 8.45), p>.684. Apparently, subjects who selected the target rule did not perform better than those who did not. An inspection of the data reveals a noteworthy trend in the transfer scores, namely that as the rule selection score increases, the difference between the prediction-related and diagnosis-related transfer scores appears to decrease (see Figure 11). It appears that with more reliable target rule selection, performance became more independent of the cause-effect directionality encountered during inductive training.

The difference scores were regressed on the rule selection scores and the relationship was tested for each group. The relationship was clearly not significant for the trinary group, F(1,94) = .32, (MSE = 2.37), p>.572; however, it closely approached significance for the binary group, F(1,94) = 3.58, (MSE = 1.85), p<.062. The results suggest that, for the binary group at least, near-transfer performance became more stable as the target rule was selected more reliably following the inductive tasks. Thus, better recognition of rule structure as expressed in multiple choice statements appears to be related to performance.

Figure 11. Transfer performance by rule score
DISCUSSION

A. Summative Overview

The broad aim of this study was to gain a better understanding of deductive reasoning with conditional statements, or rules, of the form IF x THEN y. The motivation for the study was based on the broad premise that deductive reasoning with conditional rules depends on how the rules were acquired, or induced, by the reasoner in the first place. The purpose of the study, therefore, was to investigate how, if at all, the conditions under which conditional rules are induced influence rule transfer to deductive reasoning with conditional propositions. In other words, this study was an investigation of transfer between inductive and deductive conditional reasoning.

Four specific hypotheses pertaining to the transfer of conditional knowledge were proposed. The first hypothesis addressed overall differences in near- and far-transfer performance. Near-transfer referred to a situation where the content from which a conditional rule was induced matched the content on the subsequent deductive transfer task. Far-transfer referred to a situation where the inductive and deductive content did not match. The second hypothesis addressed far-transfer performance as a function of attribute dimension size (binary vs. trinary). The remaining two hypotheses concerned near-transfer performance: one addressed near-transfer as a function of attribute dimension size, and the final one focused on near-transfer as a function of acquisition mode (predictive vs. diagnostic).

The hypotheses and the testable predictions derived from them were developed by drawing on a combination of empirical findings and theoretical claims from two competing psychological theories of human reasoning, namely a class of theories referred to as rule-based (Braine et al., 1991; Inhelder & Piaget, 1958; Nisbett, 1993), and another class of theories labeled instance-based, or memory-based, theories (Griggs et al., 1982; Pollard et al., 1981).
Rule-based theories propose that deductive reasoning is carried out by the use of rules that are sensitive to the logical form of deductive arguments. Memory-based theories, on the other hand, propose that reasoning depends primarily on the cueing of relevant instances by the content of argument propositions or their context (Rips, 1990). The near-transfer hypotheses were developed on the basis of the memory view, while the far-transfer hypothesis was derived from the rule perspective. Therefore, the summary of the findings and their implications will be discussed in the context of the theoretical perspective from which each hypothesis evolved.

B. Implications for Theories of Conditional Reasoning

Implications for Rule-based Theories of Deductive Reasoning: Any discussion of rule-based theories of human reasoning runs into a difficulty as soon as it becomes necessary to specify what rules one is talking about (Rips, 1994). The current debate over the utility of rules in reasoning includes a wide variety of inferential rules such as logical rules, elementary inference schemas, pragmatic reasoning schemas, and statistical rules including the law of large numbers (Nisbett, 1993). This study focused on logical (conditional) rules, but the results, specifically those pertaining to the second hypothesis, have implications for other types of rules as well.

Rule transfer studies have repeatedly found that rules are generalized to novel situations more consistently if they are induced from sets of instances with high compared to low variability (Catrambone et al., 1989; Royer, 1979; Singley & Anderson, 1989). Since the trinary training set exhibits greater variability than the binary set, rule-based theories would predict the far-transfer performance of the trinary group to be better than that of the binary group. This was not the case in this study, however.
The groups did not differ on the far-transfer tasks, and in fact no far-transfer was observed for either training group.

The lack of far-transfer in this study was not due to poor training. All subjects reached the inductive training criterion of 18 consecutively correct predictions and diagnoses, and the vast majority of subjects in both treatment groups selected one of the two correct conditional rule statements. Furthermore, the analysis of the inductive training data confirmed that the trinary group took significantly more time, and required more instances, to reach criterion compared to the binary group. Since induction of rules from variable sets of instances would be expected to require more elaborate processing, the present results are consistent with the claim that meaningful training did in fact occur. Inadequate training, therefore, does not appear to be a factor responsible for the lack of far-transfer. What may have occurred instead is that the assumed content-free rule structure might have developed from the training materials, particularly from the trinary dimensional sets, but such rule structure might not have been properly operative. As was already noted, one reason may have been the inherent binary nature of the criterion propositional reasoning task, which might have mitigated the adaptive function of the content-free rule structure.

What implications does the absence of far-transfer have for rule-based theories of conditional reasoning? To help frame the rule debate, Smith et al. (1992) proposed eight criteria for deciding whether reasoning involves the use of abstract rules such as the set of logical rules which includes the conditional. The criteria are as follows:

1. Reasoning performance is the same on familiar and unfamiliar content.
2. Performance is the same with abstract and concrete material.
3. Early in acquisition, a rule may be overextended.
4. Performance deteriorates as a function of the number of rules that are required for solving a task.
5. Performance is facilitated by prior application of the same rule.
6. A rule is mentioned in a verbal protocol.
7. Performance on tasks with specific content is improved by training on abstract versions of the rule.
8. Performance on problems in a particular domain is improved as much by training on problems outside the domain as on problems within it.

Many of the criteria are not directly applicable to the present study. For example, this study was not concerned with performance differences on abstract and concrete tasks; the rule content on all the tasks was at a similar level of abstraction. However, the first and last criteria are relevant here. If abstract rules are being used, then performance on familiar (near-transfer) and novel (far-transfer) deductive tasks should be similar, and experience with one task content should facilitate performance with other types of content. This clearly did not happen in the present study. Overall scores on near-transfer tasks were significantly higher than on far-transfer tasks, and experience with one type of content did not facilitate performance with unfamiliar content, since the treatment groups did not do any better than the controls on arguments with novel rule content.

The present findings indicate that subjects who underwent inductive rule acquisition clearly learned something, but not generalizable inference rules that they could apply in new contexts. Subjects did not seem to rely on inferential rules or schemas that were independent of content.

Implications for Memory-Based Theories of Deductive Reasoning: Memory-based transfer has been shown to be facilitated by transfer-appropriate processing (Morris et al., 1977), where encoding and retrieval conditions match (Tulving, 1984). Therefore, if deductive conditional reasoning is memory-based, it should also be facilitated by transfer-appropriate processing.
Thus, according to the fourth hypothesis, which was derived from the memory view, rules induced from examples of causes followed by effects (prediction mode) should facilitate performance on deductive argument types that involve primarily cause-to-effect reasoning, namely the MP and DA types. On the other hand, rules acquired by exposure to examples of effects being traced backward to causes (diagnostic mode) should aid performance on argument forms that also involve inferences from effects to causes, namely the AC and MT types.

The results of this study do not support the above hypothesis. The facilitative effect of transfer-appropriate processing did not materialize. The diagnostic acquisition mode did not enhance near-transfer performance on argument forms involving reversed, diagnostic inferences. In fact, the opposite result occurred on the AC form, where the predictive mode resulted in significantly better performance compared to the diagnostic mode.

The findings cast doubt on instance-based accounts of reasoning. Reasoning on the AC form has been shown to depend on the reasoner's awareness of alternative antecedents (Markovits, 1984), which in the context of this study means alternative causes of a given effect. According to the memory-for-instances view, the recall of alternative causes of a particular effect on the AC form should have been facilitated by the matched effect-to-cause directionality encountered on the diagnostic rule induction task. The fact that this did not occur suggests that recall of specific instances may not be involved in deductive reasoning. This conclusion does not contradict the fact that awareness of alternatives plays a role in conditional reasoning. It may be that reasoners benefit from simply being aware of the fact that some alternatives exist, but just what these are may not be critical for making correct inferences.

Support for the instance-based account of reasoning is also undermined by the results of the test of the third hypothesis.
In contrast to the last hypothesis, which addressed access to relevant instances through transfer-appropriate processing, the third hypothesis focused on the availability of such instances. This hypothesis involved a comparison of the near-transfer performance of two treatment groups. The binary group induced conditional rules from sets of instances that had binary attribute dimensions, the trinary group from trinary sets. In the context of this study, the rule instances in the binary sets were constructed from two distinct causes and two distinct effects. The trinary sets were constructed from three causes and three effects.

Since the pool of unique alternatives (of antecedents and consequents) that was available to reasoners in the trinary group was larger than the pool of alternatives available to the binary group, instance-based theories would predict better deductive performance for the trinary group on the DA and AC types. This is because these two argument types are indeterminate, and correct conditional inferences depend once again on the ability of the reasoner to consider alternative antecedents that can lead to a given consequent (Evans, 1982; Oaksford et al., 1992). Since the near-transfer performance did not differ between the binary and trinary groups, there appears to be no advantage of having a larger pool of alternatives available for making deductive inferences.
This finding, together with the refutation of the first hypothesis, questions the validity of the claim that people recall specific instances when they reason with conditional rules.

Implications for Theories of Reasoning by Mental Models: Currently, the most intensely debated theory of propositional reasoning in general, and conditional reasoning in particular, is the mental model theory (Johnson-Laird, Byrne, & Schaeken, 1992; 1994). Because the mental model view is so prominent in the literature, and because the results of the present study have some implications for this theory, the key elements of the theory are briefly outlined below.

The mental models approach rejects the idea that people use abstract logical rules that are sensitive only to the logical form of sentences. Instead, Johnson-Laird et al. propose that people reason in more concrete ways, by constructing and manipulating mental models of argument premises. Reasoning by mental models involves the following steps:

1. The reasoner begins by studying the premises and then forms a mental model which represents a possible state of the world in which the premises hold.
2. The reasoner then proceeds to search for a proposition which is both true in the model, and also informative in the sense that it is not a repetition of an existing premise or a trivial inference. This derived proposition represents the reasoner's putative conclusion.
3. In the last step, the reasoner looks for counter-examples to check the validity of the putative conclusion. Counter-examples are models which are consistent with the argument premises but inconsistent with the conclusion. If no counter-examples are found, the reasoner accepts the conclusion as valid.

According to the mental model theory, errors in conditional reasoning occur when reasoners fail to represent some relevant information in their model. The more their implicit model is "fleshed out" with explicit information, the better the reasoning performance (Johnson-Laird et al., 1992; 1994).
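The three-step procedure just outlined can be sketched as a counterexample search over candidate states of the world. The following toy formalization is an illustrative aside, not a claim about how Johnson-Laird et al. implement the theory; it assumes binary propositions and reads the first premise as the material conditional. It shows why MP and MT survive the search for counter-examples while DA and AC do not:

```python
from itertools import product

def accepted(second_premise, conclusion):
    """Toy version of steps 1-3: enumerate possible states (p, q), keep
    those in which both premises hold (the 'models'), and accept the
    conclusion only if no counter-example exists, i.e. no premise-
    consistent model in which the conclusion fails."""
    states = list(product([True, False], repeat=2))
    # First premise is always IF p THEN q, read as the material conditional.
    models = [(p, q) for p, q in states
              if ((not p) or q) and second_premise(p, q)]
    counterexamples = [m for m in models if not conclusion(*m)]
    return not counterexamples

forms = {
    "MP: p, therefore q":         (lambda p, q: p,     lambda p, q: q),
    "DA: not p, therefore not q": (lambda p, q: not p, lambda p, q: not q),
    "AC: q, therefore p":         (lambda p, q: q,     lambda p, q: p),
    "MT: not q, therefore not p": (lambda p, q: not q, lambda p, q: not p),
}
for name, (premise, conclusion) in forms.items():
    print(name, "->", "valid" if accepted(premise, conclusion) else "invalid")
```

Under this enumeration MP and MT come out valid while DA and AC are invalid (indeterminate), matching the normative pattern assumed throughout this study; a reasoner who fails to flesh out all four states may simply never encounter the counter-examples for DA and AC.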
Unfortunately, the theory does not clearly specify what determines the extent to which a model is fleshed out. As Evans (1993) points out, the theory needs to be more specific with respect to (a) the nature of the mental representations, and (b) what causes the initial representations to be expanded. The results of this study can in fact inform these gaps in the mental model theory.

The theory assumes that when people encounter an IF.. THEN.. statement they automatically construct a mental representation that has either an implicit or an explicit conditional structure. Just where people acquire the knowledge to do this is never addressed. Given the generally poor reasoning results reported in the literature, it would appear that most people do not in fact have this presumed conditional knowledge.

The issue of conditional knowledge acquisition, of course, motivated the present study. Since subjects in this study were exposed to explicit examples of conditional relationships during the rule induction task, their experience should improve their ability to "flesh out" the mental models they presumably utilize on the deductive task. The present findings provide mixed support for the fleshing out component of the model theory. The fact that both treatment groups performed better on the near-transfer task suggests that their exposure to the exemplar sets used in the inductive task facilitated their ability to construct more explicit mental representations of the premises compared to the control group.

The results also suggest that if subjects did flesh out their models, their expanded representations were probably not constructed from specific instances. If they were, then the trinary group should have outperformed the binary group because the trinary set contained a greater variety of potential model-building "material".
It is more likely that people relied on representations that lie somewhere between abstract rules and concrete models built from specific instances.

The fact that near-transfer performance on Modus Tollens did not rise above the level of controls for either treatment group also has implications for the mental model theory. Performance on MT was not facilitated by inductive training and remained generally unchanged across groups and conditions. Yet, unlike Modus Ponens, which also remained unchanged, the MT scores do not indicate a performance ceiling which would limit any potential improvement. The MT form is generally considered to be relatively difficult since it contains a negation as well as a reversed inference from consequent to antecedent (Evans, 1982). The presence of both of these factors may in fact tax cognitive resources to such an extent that subjects do not attempt to construct even implicit mental models. The mental model theory stipulates that models are represented and manipulated in working memory (e.g. Baddeley, 1990); therefore, a significant cognitive load imposed by particular argument structures may constrain reasoning with internally represented models.

Summary of Major Implications From the Findings: The results clearly rule out a reasoning process that operates on specific instances stored in memory. The trinary group always performed at or below the level of the binary group despite the fact that the trinary group had equal access to, and greater availability of, specific instances that could be used to generate more counterexamples or more explicit models to aid reasoning. Instance-based processing is also ruled out by the fact that transfer-appropriate processing failed to improve performance. Previous research has shown that access to encoded instances is facilitated by matching the encoding and retrieval conditions.
When this was done in the present study by matching the direction of inferences between the inductive and deductive tasks, no facilitative effect of transfer-appropriate processing was observed. Since any reasoning program that relies on access to specific instances would be expected to benefit from improved memory access, the present findings indicate that an instance-based process was probably not involved.

At the same time, the results also do not provide evidence for a process driven solely by abstract, general rules. If such rules were used, then far-transfer would be expected to take place, particularly in the trinary condition. The inductive training data for the trinary group were consistent with the expectation of more abstract rule learning (e.g. increased processing time, and more reliable selection of the more parsimonious and general rules summarizing the observed conditional relationships). Yet the trinary group's performance on far-transfer was no better than that of either the binary or the control group, indicating that the reasoning process was probably not driven by abstract rules. It should once again be noted, however, that an abstract rule structure may have developed during training, but it may not have been operated on during the deductive task.

So far, results of this research have been used as evidence for rejecting current accounts of conditional reasoning. However, considered together, the results suggest alternative accounts of deductive conditional reasoning. One such account is proposed in the sequel.

C. Proposed Account of Conditional Reasoning

Taken together, the present findings point to a reasoning process driven by rules that are specific to a domain, but generalizable over particular instances of the domain. This explains the superiority of the near- over the far-transfer performance in conjunction with the lack of observed performance facilitation due to improved access and availability of particular instances in memory.
But what form do these domain-specific rules take?

The verbal reports on inductive strategies suggest that these rules are not represented in the syntactic form of IF x THEN y. The majority of subjects produced statements such as "Drug x causes only symptom y, but the other drugs cause all the observed symptoms". All such statements formed the response category labeled "memory-based", which was also the largest category for both treatment groups. Such statements can be expressed more generally as: "A member x, from the class of X's, is associated only with one member, y, of the class of Y's, but any of the other X's could be associated with any member of the set of Y's". A rule at this intermediate level of abstraction might be optimal in terms of reasoning efficiency. It captures the general conditional structure of observations, yet it provides enough detail for constructing and manipulating mental representations in the form of mental models. The assumed associative memory structure referenced by this linguistic expression is shown below.

Several aspects of the proposed rule representation should be noted. First, the relationship between the C's (e.g. antecedents) and E's (e.g. consequents) conforms to the material conditional. This is corroborated by the fact that the conditional response structure (pattern) was the most common response structure in the transfer conditions.

Second, in the proposed representation the associative link connecting C1 and E1 is bi-directional, and stronger than all the others. As a result, the rule representation is biased toward biconditional reasoning errors, particularly when it is operated on under increased processing load, involving trinary dimensions, for example. The present findings again support this conjecture. A biconditional response structure represented the majority of non-normative (i.e.
non-conditional) subject responses across all transfer problems; however, it was more prevalent in the trinary group as compared to either the binary or the control groups.

The proposed rule structure also explains the counter-intuitive finding that performance on the AC argument form was better in the predictive induction condition, despite the fact that the AC form is distinguished by a diagnostic inference mode. The reason for this is that the proposed rule structure optimally represents the one-to-one and one-to-many mappings which define a conditional structure, but which only hold in a predictive context. To see why, consider the proposed rule structure from a diagnostic perspective (i.e. view it from the set of consequents, the E's, to the set of antecedents, the C's). From this perspective, all the consequents have one-to-many mappings to the antecedents, and consequently subjects may infer a structure based on a symmetric mapping, such as the biconditional. When the structure is viewed in a predictive context, on the other hand, the one-to-one mapping between C1 and E1 is readily apparent. Since this mapping represents the key defining feature of a conditional interpretation, subjects' reasoning is more likely to reflect this normative conditional interpretation in the predictive context. This is indeed what the results show. Performance on prediction-related content was better for all argument types, although it reached significance only on the AC syllogism. This finding has instructional implications which are discussed in the next section.

Finally, it is proposed that conditional reasoning may involve more than one level of information processing: the underlying reasoning process may depend on the argument type encountered. There is no a priori reason to assume that the same cognitive process must be deployed to deal with all types of conditional reasoning arguments. In fact, the present findings suggest otherwise.
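As a brief aside, the predictive/diagnostic asymmetry of the proposed rule representation can be made concrete with a minimal sketch. The labels C1-C3 and E1-E3 are hypothetical stand-ins for the antecedent and consequent sets; the mapping shown is an assumption reconstructed from the verbal description above, not the actual experimental materials:

```python
# Predictive view implied by the proposed rule structure: C1 is paired
# only with E1, while every other cause may produce any of the effects.
predictive = {
    "C1": {"E1"},
    "C2": {"E1", "E2", "E3"},
    "C3": {"E1", "E2", "E3"},
}

# Diagnostic view: invert the mapping (effect -> possible causes).
diagnostic = {}
for cause, effects in predictive.items():
    for effect in effects:
        diagnostic.setdefault(effect, set()).add(cause)

# Predictively, the defining one-to-one pair stands out:
unique_pairs = [c for c, es in predictive.items() if len(es) == 1]
print(unique_pairs)  # -> ['C1']

# Diagnostically, every effect maps to more than one cause, so the
# structure looks symmetric and invites a biconditional reading.
print(all(len(causes) > 1 for causes in diagnostic.values()))  # -> True
```

Viewed from the effects, the one-to-one C1-E1 link is no longer distinctive, which is consistent with the bias toward biconditional interpretations observed in the diagnostic mode.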
The results show that near-transfer performance was significantly better (compared to control levels) only on the DA and AC forms. The MP and MT levels, however, remained essentially unchanged across conditions. Why?

Past research has shown that reasoning with Modus Ponens is essentially at a ceiling level, with typical performance in the 85-100 percent range (e.g. Evans, 1982). Results of this study are no different. In fact, Modus Ponens is the flagship of proponents of inference schemas (Braine et al., 1991) since performance on it is unaffected by content and is highly accurate. When considered together, the results suggest that the reasoning process on the MP form is likely to be based on compiled procedural knowledge (Anderson, 1983) and, as such, any further performance improvement is not likely to be observed.

In contrast, the MT form may involve little, if any, systematic reasoning. The rationale for this supposition is based on a task analysis of Modus Tollens which suggests that it is, in fact, the most difficult syllogistic form since it involves both a negation and a reversed inference (Evans, 1982). Yet performance on the MT form is generally quite good, reaching as high as 80 percent proportion correct (Evans, 1993). Again, the results of this study are consistent with previous research, with MT performance being second only to MP. It is proposed that the reason for such a seemingly impressive performance on the toughest syllogism form is simply due to the well documented matching bias discussed in section D1 (Evans, 1982). The joint presence of the two resource-taxing operations may interfere with the retrieval of the memory-based rule representation. Because of the relative strength of the biconditional link in the assumed representation, the biconditional bias inherent in this rule representation would increase, thus increasing the probability of a biconditional interpretation.
Under this interpretation, the negation of the consequent compels subjects to add a matching negation to the antecedent term, which in this particular case yields the correct response.

D. Implications for Education

The major implication of this research for educational practice is that conditional reasoning, which is ubiquitous across academic and professional disciplines, can be enhanced by appropriately designed training. Even the brief inductive training utilized in this study resulted in significant improvement in reasoning with training-related content. Longer training periods, which were not realistic in this experimental set-up, may have yielded even better performance outcomes. It should also be noted that deductive performance may have been improved by training subjects on the deductive syllogisms directly, without the benefit of the inductive phase. However, knowledge acquired by simply substituting different content into standard logical or other rule-based forms is likely to be "inert", and bound to specific application contexts. The bottom-up inductive approach may require more time and effort, but the knowledge so constructed may be more flexible, and adaptable to different reasoning contexts. Clearly, more research is needed to resolve this issue.

The present results may also inform the ongoing debate (e.g. Cheng et al., 1986) regarding the optimal method of teaching deductive reasoning skills. Given that subjects in this study did not make inferences on the basis of content-free, generalizable rules, induction from rule instances alone may not be sufficient to develop such rules. Inductive training may have to be supplemented by explicitly teaching students the syntactic structure of conditional rules.

The results also indicate that, at least for simple conditional arguments with binary propositions such as those used in this study, inductive training sets with binary dimensions appear to be equal or superior to sets with more varied instances.
Another advantage is that rule learning from binary sets also requires less time compared to larger, trinary sets. This is a useful result from the practical perspective of teachers who may not have the time to construct large rule-learning exemplar sets, and who want to minimize instructional time.

This research also clearly established the superiority of the predictive mode of rule acquisition over the diagnostic mode. Transfer to prediction-related content was consistently equal to or significantly better compared to the diagnostic mode, including transfer to deductive reasoning which emphasized the diagnostic mode. These results may be particularly relevant for medical education, where diagnostic reasoning is, of course, critical.

E. Limitations of the Study and Suggestions for Further Research

Since the limitations of the current research often motivate additional research, the two issues are discussed together. This study has several limitations. Subjects were volunteers who were paid for their participation; they were not randomly selected from the population and they lacked any intrinsic motivation on the experimental tasks. All subjects were undergraduate students in a Teacher Training Program, which limits any generalizations to other populations. Additional research with samples from other populations is clearly needed.

Methodological limitations span both the inductive and deductive tasks. The inductive task was by necessity relatively short, averaging about one hour. This length of time may not be adequate for inducing a full-blown logical rule in the form of automatized, compiled, procedural knowledge that can be effortlessly applied in different settings (Anderson, 1983). Additional research is needed to determine whether more extensive rule induction training would result in transfer to deductive tasks with unfamiliar content.
In addition, different inductive training sets need to be tested since there is no reason to assume that the exemplar sets used in this study were in any way exhaustive or optimal.

In order to make the duration of the inductive task reasonable, subjects were allowed to record their observations, responses, and corresponding feedback from the computer during rule learning. This procedure made the task easier, but it may not have encouraged deeper processing because subjects did not have to restructure the information they encountered in order to make it more manageable for their working memory. In turn, their ability to reconstruct their conditional knowledge from memory may have been reduced. Further research comparing this experimental procedure with one that forces subjects to rely completely on their memory would be informative.

The deductive task utilized standard syllogism forms which may not represent practical reasoning situations that generally involve more information in a less structured format. Perhaps more importantly, in the standard deductive syllogism, the premises present only binary propositions. In other words, each individual proposition contains only one term or its negation (p or not p, or q or not q). This may have biased the task difficulty in favour of the binary group since, with binary content, a negation implies a small contrast set which is easier to represent and thus reduces cognitive load. Therefore, future research needs to consider non-binary deductive tasks such as, for example: "Given If x1 then y1, together with x3 is the case, what follows?".

The task content may also limit generalizations to different domains. The challenge was to develop task content that was neutral in that it did not cue many pre-existing semantic associations based on experience, while at the same time providing a sense of realism and practical relevance.
Every effort has been made to satisfy these conflicting criteria, but the inevitable tradeoff may have diminished the external validity of the tasks. Future research should investigate transfer within and between "real-world" content domains.

The method of scoring the deductive performance may also have masked some effects. It will be recalled that validation scores for affirmative and negated conclusions were added together to yield an overall performance score. Validation responses are generally not identical for these two situations (e.g. Lee, 1985; Marcus & Rips, 1979) and, more importantly, the response patterns may interact with different inductive training conditions. Further research is clearly needed, which may also necessitate a different analytical approach suitable for the analysis of categorical data.

As in other experimental studies in cognitive psychology, the possibility existed that subjects may not have been fully motivated to engage all their available cognitive resources while working on the tasks. Consequently, some performance differences may have been attenuated if subjects did not bother to deploy some strategy or knowledge which they may have acquired during the inductive phase. However, since the context under which the tasks were presented (i.e. evaluating student learning) was highly relevant to the subjects' coursework, the vast majority of subjects appeared highly engaged and genuinely interested in the experimental tasks. Still, future research should investigate inductive and deductive reasoning that is motivated by realistic goals and has real consequences.

One important issue raised by the present research concerns the boundaries of near- and far-transfer. In this study, the near vs. far distinction was based on the similarity, in terms of content semantics, between the acquisition and the application of rule knowledge.
Research in other areas of cognitive psychology on the transfer of cognitive skill, most notably research on analogical reasoning, is relevant here. Studies of analogical reasoning indicate that transfer is a function of the similarity between the transfer task and the reasoner's internal representation of a related situation encountered previously (Vosniadou & Ortony, 1989). Studies of analogical reasoning have shown that internal representations based on structural features enable far-transfer when these representations are successfully mapped onto new situations that share the same structural features (Gentner & Toupin, 1986). In this context, the structural features of conditional representations are the one-to-one and one-to-many mappings between antecedent and consequent pairs, as illustrated in the model proposed above.

Unfortunately, structural mappings, unlike surface mappings based on semantic relatedness, are more difficult to access since they are often less salient. One way to facilitate access to these features is to provide contextual cues. Another way is to explicitly focus attention on relations between objects, rather than the objects themselves. In terms of the present research, it would be valuable to investigate transfer of conditional knowledge using different contextual cues between the inductive and deductive phases. In this way, the applicability of even a limited, domain-specific representation may be stretched by cueing the reasoner to focus on the structural features that are presumably already present in their internal knowledge representation. Using this approach, it would be possible to test deductive transfer to other domains that may not share any surface features with task domains encountered during rule acquisition.

F. Final Summary and Conclusion

Reasoning with conditional rules represents one of the most essential and ubiquitous forms of human reasoning.
Despite the proliferation of research on conditional reasoning, many questions regarding the knowledge representation and the processing involved in this form of reasoning have not been fully answered. One reason is that in virtually all studies of conditional reasoning, the existence of conditional knowledge was simply assumed, and how the assumed knowledge was acquired in the first place has not been addressed. In contrast, this study utilized a transfer-of-training paradigm which made it possible to actually manipulate the learning and the application of conditional knowledge, and consequently, to test different hypotheses about its underlying form and function. Taken together, the present findings suggest that conditional reasoning is based on a knowledge representation of an intermediate level of abstraction, not unlike that of basic-level categories. Instead of completely generalized schemas or specific instances, subjects appear to rely on conditional knowledge that is domain-specific and that contains the minimum number of antecedents and consequents that captures the observed conditional relations. This minimum appears to include the target antecedent-consequent pair (i.e., c1 and e1), as well as an abstract representation of the respective logical complements {not c1, not e1}. Further research is needed, however, with a non-binary version of the deductive task to assess the full extent to which induced conditional knowledge can in fact be generalized.

REFERENCES

Anderson, J.R. (1983). The architecture of cognition. Cambridge, MA: Harvard University Press.

Anderson, J.R., & Bower, G.H. (1973). Human associative memory. Hillsdale, NJ: Erlbaum.

Baddeley, A.D. (1990). Human memory: Theory and practice. Boston, MA: Allyn & Bacon.

Baddeley, A.D. (1986). Working memory. Oxford: Clarendon.

Belsley, D.A., Kuh, E., & Welsch, R.E. (1980). Regression diagnostics: Identifying influential data and sources of collinearity. New York: Wiley.

Bereiter, C., & Scardamalia, M. (1985).
Cognitive coping strategies and the problem of "inert" knowledge. In S.F. Chipman, J.W. Segal, & R. Glaser (Eds.), Thinking and learning skills: Current research and open questions (Vol. 2). Hillsdale, NJ: Erlbaum.

Bjorkman, M., & Nilsson, R. (1983). Prediction and diagnosis: An experimental comparison. Scandinavian Journal of Psychology, 23, 17-22.

Bourne, L.E., Jr. (1974). An inference model for conceptual rule learning. In R.L. Solso (Ed.), Theories in cognitive psychology: The Loyola Symposium (pp. 231-256). New York: Lawrence Erlbaum.

Bourne, L.E. (1970). Knowing and using concepts. Psychological Review, 77, 546-556.

Box, G.E.P., & Cox, D.R. (1964). An analysis of transformations. Journal of the Royal Statistical Society, 26, 211-243.

Braine, M.D.S., & O'Brien, D.P. (1991). A theory of If: A lexical entry, reasoning program, and pragmatic principles. Psychological Review, 98, 182-203.

Braine, M.D.S., Reiser, B.J., & Rumain, B. (1984). Some empirical justification for a theory of natural propositional logic. In G.H. Bower (Ed.), The psychology of learning and motivation: Advances in research and thinking (Vol. 18, pp. 317-371). New York: Academic Press.

Bruner, J.S., Goodnow, J.J., & Austin, G.A. (1956). A study of thinking. New York: Wiley.

Carlson, R.A., & Schneider, W. (1989). Acquisition context and the use of causal rules. Memory & Cognition, 17, 240-248.

Catrambone, R., & Holyoak, K.J. (1989). Overcoming contextual limitations on problem-solving transfer. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15, 1147-1156.

Cheng, P.W., & Novick, L.R. (1992). Covariation in natural induction. Psychological Review, 99, 365-382.

Cheng, P.W., Holyoak, K.J., Nisbett, R.E., & Oliver, L.M. (1986). Pragmatic versus syntactic approaches to training deductive reasoning. Cognitive Psychology, 18, 293-328.

Cheng, P.W., & Holyoak, K.J. (1985). Pragmatic reasoning schemas. Cognitive Psychology, 17, 391-416.

Collett, D. (1991). Modeling binary data.
London: Chapman & Hall.

Cummins, D.D., Lubart, T., Alksnis, O., & Rist, R. (1991). Conditional reasoning and causation. Memory & Cognition, 19, 274-282.

Evans, J.St.B.T. (1993). The mental model theory of conditional reasoning: Critical appraisal and revision. Cognition, 48, 1-20.

Evans, J.St.B.T. (1982). The psychology of deductive reasoning. London: Routledge & Kegan Paul.

Evans, J.St.B.T. (1984). Heuristic and analytic processes in reasoning. British Journal of Psychology, 75, 451-468.

Evans, J.St.B.T., Barston, J.L., & Pollard, P. (1983). On the conflict between logic and belief in syllogistic reasoning. Memory & Cognition, 11, 295-306.

Evans, J.St.B.T., & Beck, M.A. (1981). Directionality and temporal factors in conditional reasoning. Current Psychological Research, 1, 111-120.

Gentner, D., & Toupin, C. (1986). Systematicity and surface similarity in the development of analogy. Cognitive Science, 10, 277-300.

Gick, M.L., & Holyoak, K.J. (1987). The cognitive basis of knowledge transfer. In S. Cormier & J. Hagman (Eds.), Transfer of learning. San Diego, CA: Academic Press.

Griggs, R.A., & Cox, J.R. (1982). The elusive thematic materials effect in Wason's selection task. British Journal of Psychology, 73, 403-420.

Haygood, R.C., & Bourne, L.E. (1965). Attribute and rule learning aspects of conceptual behavior. Psychological Review, 72, 175-195.

Hinton, G., & Anderson, J.A. (1981). Parallel models of associative memory. Hillsdale, NJ: Erlbaum.

Holland, J.H., Holyoak, K.J., Nisbett, R.E., & Thagard, P.R. (1987). Induction: Processes of inference, learning, and discovery. Cambridge, MA: MIT Press.

Holyoak, K.J., & Koh, K. (1987). Surface and structural similarity in analogical transfer. Memory & Cognition, 15, 332-340.

Huberty, C.J., & Smith, J.D. (1982). The study of effects in MANOVA. Multivariate Behavioral Research, 17, 417-432.

Inhelder, B., & Piaget, J. (1958). The growth of logical thinking. New York: Basic Books.

Jackson, S.L., & Griggs, R.A. (1990).
The elusive pragmatic reasoning schemas effect. The Quarterly Journal of Experimental Psychology, 42A, 353-373.

Johnson-Laird, P.N., Byrne, R.M.J., & Schaeken, W. (1994). Why models rather than rules give a better account of propositional reasoning: A reply to Bonatti and O'Brien, Braine, and Yang. Psychological Review, 101, 734-739.

Johnson-Laird, P.N., Byrne, R.M.J., & Schaeken, W. (1992). Propositional reasoning by model. Psychological Review, 99, 418-439.

Lee, S.S., & Lee, Y-H.K. (1992). STM processing capacity and rule schema in preadolescents' and adults' conditional reasoning. Unpublished manuscript.

Lee, S.S. (1985). Children's acquisition of conditional logic structure: Teachable? Contemporary Educational Psychology, 10, 14-27.

Lee, S.S. (1984). Separability of coded attributes in logical rule acquisition: A two-process analysis. Canadian Journal of Psychology, 38, 386-401.

Marcus, S.L., & Rips, L.J. (1979). Conditional reasoning. Journal of Verbal Learning and Verbal Behavior, 18, 199-223.

Markovits, H. (1984). Awareness of the possible as a mediator of formal thinking in conditional reasoning problems. British Journal of Psychology, 75, 377-391.

Markovits, H. (1985). Incorrect conditional reasoning among adults: Competence or performance? British Journal of Psychology, 76, 241-247.

Markovits, H. (1986). Familiarity effects in conditional reasoning. Journal of Educational Psychology, 78, 492-494.

Markovits, H. (1988). Conditional reasoning, representation, and empirical evidence on a concrete task. The Quarterly Journal of Experimental Psychology, 40A, 483-495.

Markovits, H., & Nantel, G. (1989). The belief-bias effect in the production and evaluation of logical conclusions. Memory & Cognition, 17, 11-17.

Morris, C.D., Bransford, J.D., & Franks, J.J. (1977). Levels of processing versus transfer appropriate processing.
Journal of Verbal Learning and Verbal Behavior, 16, 519-533.

Newell, A., & Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.

Nisbett, R.E. (1993). Rules for reasoning. Hillsdale, NJ: Lawrence Erlbaum.

Oaksford, M., & Stenning, K. (1992). Reasoning with conditionals containing negated constituents. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18, 835-854.

Piburn, M.D. (1990). Reasoning about logical propositions and success in science. Journal of Research in Science Teaching, 27, 887-900.

Pollard, P., & Evans, J.St.B.T. (1981). The effects of prior beliefs in reasoning: An associational interpretation. British Journal of Psychology, 72, 73-81.

Pollard, P., & Evans, J.St.B.T. (1983). The effect of experimentally contrived experience on reasoning performance. Psychological Research, 45, 287-301.

Pollard, P., & Evans, J.St.B.T. (1987). Content and context effects in reasoning. American Journal of Psychology, 100, 41-60.

Popper, K.R. (1959). The logic of scientific discovery. New York: Harper & Row.

Rips, L.J. (1994). The psychology of proof. Cambridge, MA: MIT Press.

Rips, L.J. (1990). Reasoning. Annual Review of Psychology, 41, 321-353.

Ross, B.H., & Bower, G.H. (1981). Comparison of models of associative recall. Memory & Cognition, 9, 1-16.

Royer, J.M. (1979). Theories of the transfer of learning. Educational Psychologist, 14, 53-69.

Singley, M.K., & Anderson, J.R. (1989). The transfer of cognitive skill. Cambridge, MA: Harvard University Press.

Smith, E.E., Langston, C., & Nisbett, R.E. (1992). The case for rules in reasoning. Cognitive Science, 16, 1-40.

Sternberg, R.J. (1986). Toward a unified theory of human reasoning. Intelligence, 10, 281-314.

Tulving, E. (1984). Precis of elements of episodic memory. The Behavioral and Brain Sciences, 7, 223-268.

Vosniadou, S., & Ortony, A. (1989). Similarity and analogical reasoning. Cambridge: Cambridge University Press.

Waldmann, M.R., & Holyoak, K.J. (1992).
Predictive and diagnostic learning within causal models: Asymmetries in cue competition. Journal of Experimental Psychology: General, 121, 222-236.

Ward, S.L., Byrnes, J.P., & Overton, W.F. (1990). Organization of knowledge and conditional reasoning. Journal of Educational Psychology, 82, 832-837.

Wason, P.C. (1966). Reasoning. In B.M. Foss (Ed.), New horizons in psychology. Harmondsworth: Penguin.

Wason, P.C., & Shapiro, D. (1971). Natural and contrived experience in a reasoning problem. Quarterly Journal of Experimental Psychology, 23, 63-71.

Woodworth, R.S., & Sells, S.B. (1935). An atmosphere effect in syllogistic reasoning. Journal of Experimental Psychology, 18, 451-460.

APPENDICES

APPENDIX A: An illustration of rule transfer based on the proposed hypotheses

To provide a more concrete picture of the hypothetical steps involved in the transfer of an acquired rule according to the instance-based account of reasoning, this section briefly, and informally, illustrates how a conditional rule might be learned, and how it might subsequently be used in making deductive inferences.

Phase I. Acquiring a Conditional Rule (Inductive Reasoning)

Let's assume that a researcher wants to discover what effect, if any, certain environmental agents have on the health of people who have been exposed to these agents. Let's take a simple example where she wants to investigate only two such agents, known as causal agent c1 and causal agent c2. The effects she is considering are a health effect, labeled e1, and no health effect, labeled e2. Let's assume that this researcher administers each agent and observes the effect over several trials. Here is what she might observe if the agents and health effects have a
Here is what she might observe if the agents and heath effects have aconditional (asymmetrical) relational structure:ci -->elc2-->elc2 --> e2After running a number of these trials, her memory for these observations might berepresented as simple associations between the causal agents and their health effects:ci >>> elc2 >>> e2ci >>> cic2 >>> elNote that in this notation the symbol “>>>“ indicates the strength and direction ofthe association between the cause and the effect as they are represented in memory.Her next task is to conclude something from the data. Clearly, a number ofdifferent conclusions, or propositional statements emerge from the results:86a. “Agent ci causes a health effect el”b. “Agent c2 causes a health effect el on some trials but no effect on others”c. “Agent ci never results in the absence of a health effect”, etc.Given that her goal is to come up with some general, concise statement that summarizesthe results (i.e. a general conclusion), she notes that no general conclusion can be statedabout agent c2. In fact the most appropriate general and concise statement summarizingthe results is:”Whenever agent ci is administered, effect el occurs” or, “IF ci THEN el”.Because a statement in this general form is in effect induced from the set of observationsrepresented in memory,, it can serve as a pointer to her memory representation for that setof observations.In addition, the researcher’s search for a general rule (conclusion), and her attemptto formulate her conclusion in a propositional form, means that the ci >>> ci pair hasbeen processed more elaborately (more deeply) then the other pairs. This means that theassociative link from ci to e I has actually been strengthened by the application of thepragmatic constraints.87Phase II. Transfer of the Acquired Rule to Conditional Reasoning:-For the sake of brevity,the assumed reasoning process is illustrated only in the case of the DA inference form.Step 1. 
Representation of the First Premise. According to the instance view, the first premise (the acquired rule) provides lexical access to the stored representation of the instances from which the rule was induced:

"IF c1 THEN e1"  -->  c1 >>> e1
                      c2 >>> e1
                      c2 >>> e2

Step 2. Representation of the Second Premise. The negation of the antecedent in the second premise activates the set of antecedents in the first premise (since both refer to the same category of causes). To accommodate a consistent internal representation of both premises, the inference schema

P or Q
not P
------
Q

is instantiated, yielding the following unified representation of both premises:

c2 >>> e1
c2 >>> e2

Step 3. Generating a Conclusion. Following the association links from the antecedents to the consequents, our reasoner arrives at the conclusion that either e1 or e2 occurs. In other words, the reasoner concludes that e1 sometimes occurs, meaning that the conclusion "e1" is sometimes true, which is the correct conclusion.

A complete account of the hypothesized reasoning process with rules acquired under different conditions is given below.
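The three steps of the instance-based account described in this appendix can be written out as a short simulation. The sketch below is illustrative only and is not part of the thesis materials: the list of stored instances mirrors the example above, and the function names (`induce_rule`, `denial_of_antecedent`) are invented for the illustration.

```python
# Stored "memory": the cause-effect instances from which the rule
# IF c1 THEN e1 was induced (Phase I above).
memory = [("c1", "e1"), ("c2", "e1"), ("c2", "e2")]

def induce_rule(instances):
    """Return the antecedent-consequent pair that holds on every trial,
    i.e. the only cause associated with a single effect."""
    for cause in {c for c, _ in instances}:
        effects = {e for c, e in instances if c == cause}
        if len(effects) == 1:  # deterministic: admits a general IF-THEN rule
            return cause, effects.pop()
    return None

def denial_of_antecedent(instances, rule):
    """Given the second premise 'not <antecedent>', collect the effects
    associated with the remaining (complementary) antecedents."""
    antecedent, _ = rule
    return sorted({e for c, e in instances if c != antecedent})

rule = induce_rule(memory)                    # ("c1", "e1")
effects = denial_of_antecedent(memory, rule)  # ["e1", "e2"]

# Both e1 and e2 remain possible, so the conclusion "e1" is only
# SOMETIMES true -- the normatively correct DA response.
status = "sometimes true" if len(effects) > 1 else "always true"
```

Note that the simulation reproduces the correct DA response simply by traversing stored instances, without any abstract inference rule for the conditional itself, which is the point of the instance-based account.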
0- F-I 0 0 CD Cl) Cl) H 0 CD 0 H 0 CD 01 0 Ct H CD0000N)N)N)F1 AAAVAAAVAAAVCDCDCD(DCD100I1IMN)IAACDIAAN)IAAICDCD00MN)AAAACDCD0 V V V CD I-IAPPENDIX B: Instructions to subjectsTeacher Education StudyInstruction BookletStep 1. Please read the following section:In this study we are interested in learning more about how teachers evaluate their students’learning. Specifically, we are exploring how teachers assess the way their students applynewly acquired knowledge about causes and effects in a variety of content areas.The study involves two separate parts:Part A. Discovery Learning No. 1Discovery Learning No. 2Part B. Evaluating Students: Situation No.1Evaluating Students: Situation No.2Evaluating Students: Situation No.3Evaluating Students: Situation No.4The purpose of the discovery learning part is to let new teachers like yourself go throughthe same learning experience that some students have actually gone through.Generally, people take less than an hour to complete the entire study. You should expectPart A to take a little longer than part B. The evaluation situations in Part B are all veryshort. So sit back, relax, and have fun!Step 2. Please boot-up the computer, inset the disk into drive A (or B) and do thefollowing:i. Make drive A (or B) the active directoryby entering the command a: (or B:)ii. When you see the A> prompt, start the program by enteringthe command goiii. Follow the directions on the screenand please, type everything in lower case only.90Part A. Discovery Learning: Predicting Effects of Radiation on PlantsStep 1. Please read the following section carefully:The topic we will deal with in this part involves predicting the effects of different types ofradiation on plants. A variety of plants were exposed to several radiation sources bybiologists in different countries. Each group ofbiologists made a specific predictionconcerning the effect of their particular type of radiation on their plants. 
You will see each prediction on the screen shortly.

When you begin the next computer session, you will look at each prediction being displayed on the screen, and decide if the prediction is correct or not. After each response, you will receive feedback which you can use to make a more informed decision next time. The goal is to discover a simple rule which will help you decide whether each prediction is correct or not. The program will stop after you make 18 correct decisions in a row.

Step 2. Please type the lower case letter m to begin the next computer session...

You are encouraged to record relevant information about each prediction in the space below:

Type of Radiation | Predicted Effect | Prediction Correct? (y/n)

Step 1. Please answer the following question:

How were you able to tell whether a particular prediction was correct or not? Please write your response on the line below:

Step 2. Please turn to the next page....

Step 1. Please answer the following question:

Based on what you have observed in the set of examples just presented, which of the following statements represents a CONCISE, GENERAL rule that BEST SUMS UP the relationship between the type of radiation and the type of mutation?

Please circle your best choice, and put an X beside your second best choice, in case you have one.

1. If a plant is exposed to Beta radiation, then it grows either square, triangular, or circular leaves.
2. If a plant is exposed to Gamma radiation, then it grows square leaves.
3. If a plant is exposed to Gamma radiation then it grows square leaves, and if a plant grows square leaves then it was exposed to Gamma radiation.
4. If a plant grows square leaves then it was exposed to Gamma radiation.

Step 2. Please turn to the next page....

Part A. Discovery Learning: Diagnosing Causes of Strange Fish Scales

Step 1. Please read the following section carefully:

The topic we will deal with in this part involves diagnosing the causes of unusual scales of fish that swim in polluted waters.
The fish were diagnosed by fisheries officials in many different locations. Each group of fisheries officials made a specific diagnosis concerning the cause of the particular appearance of fish scales they observed at their location. You will see each diagnosis on the screen shortly.

When you begin the next computer session, you will look at each diagnosis being displayed on the screen, and decide if the diagnosis is correct or not. After each response, you will receive feedback which you can use to make a more informed decision next time. The goal is to discover a simple rule which will help you decide whether each diagnosis is correct or not. The program will stop after you make 18 correct decisions in a row.

Step 2. Please type the lower case letter y to begin the next computer session...

You are encouraged to record relevant information about each diagnosis in the space below:

Appearance of Scales | Diagnosed Cause | Diagnosis Correct? (y/n)

Step 1. Please answer the following question:

How were you able to tell whether a particular diagnosis was correct or not? Please write your response on the line below:

Step 2. Please turn to the next page....

Step 1. Please answer the following question:

Based on what you have observed in the set of examples just presented, which of the following statements represents a CONCISE, GENERAL rule that BEST SUMS UP the relationship between the type of chemical agent and the type of fish scales?

Please circle your best choice, and put an X beside your second best choice, in case you have one.

1. If fish have brittle scales then agent Tx was released in the water.
2. If agent Tx is released then fish develop brittle scales, and if fish have brittle scales then agent Tx was released into the water.
3. If agent Tx is released in the water, then fish develop brittle scales.
4. If agent Ty is released in the water, then fish develop either enlarged, brittle, or loose scales.

Step 2. Please turn to the next page....

Evaluating Students: Situation No. 1

Step 1.
Please read the following section:

The students you are about to evaluate went through a discovery learning exercise that was identical to the one you went through. It was the one involving radiation sources and plant mutations.

These students observed the same set of examples you have seen, and from these examples they discovered a simple rule about a type of radiation and a type of plant mutation. The rule they discovered happens to be the correct rule. (You will see this rule when you begin to work on the computer again.)

The students used this correct rule along with some additional information to arrive at various conclusions. Your task is to evaluate each student's conclusion. That is, you are to decide whether the conclusion a particular student arrived at is:

1. Always True, or
2. Sometimes True or Sometimes False, or
3. Always False.

There are eight students that you will evaluate.

Step 2. Please type the lower case letter c to begin your evaluation...

Evaluating Students: Situation No. 2

Step 1. Please read the following section:

The students you are about to evaluate went through a discovery learning exercise that was identical to the one you went through. It was the one involving chemical agents and fish scales.

These students observed the same set of examples you have seen, and from these examples they discovered a simple rule about a type of chemical agent and a type of fish scales. The rule they discovered happens to be the correct rule. (You will see this rule when you begin to work on the computer again.)

The students used this correct rule along with some additional information to arrive at various conclusions. Your task is to evaluate each student's conclusion. That is, you are to decide whether the conclusion a particular student arrived at is:

1. Always True, or
2. Sometimes True or Sometimes False, or
3. Always False.

There are eight students that you will evaluate.

Step 2. Please type the lower case letter o to begin your evaluation...

Evaluating Students: Situation No. 3

Step 1. Please read the following section:

The students you are about to evaluate were learning about new drugs and strange skin rashes. This material is completely different from any of the learning experiences you have gone through here, but we are still interested in your assessment of these students.

These students discovered a relationship between a type of drug and a type of skin rash. They stated this relationship in the form of a rule which you will see shortly when you start on the computer again.

Even though you are not familiar with this content area, please suppose that the rule they discovered is correct.

The students used this supposedly correct rule along with some additional information to arrive at various conclusions. Your task is to evaluate each student's conclusion. That is, you are to decide whether the conclusion a particular student arrived at is:

1. Always True, or
2. Sometimes True or Sometimes False, or
3. Always False.

There are eight students that you will evaluate.

Step 2. Please type the lower case letter d to begin your evaluation...

Evaluating Students: Situation No. 4

Step 1. Please read the following section:

The students you are about to evaluate were learning about various types of lasers used in eye surgery. This material is completely different from any of the learning experiences you have gone through here, but we are still interested in your assessment of these students.

These students discovered a relationship between a type of surgical laser and some strange spots on the retina of the eye. They stated this relationship in the form of a rule which you will see shortly when you start on the computer again.

Even though you are not familiar with this content area, please suppose that the rule they discovered is correct.

The students used this supposedly correct rule along with some additional information to arrive at various conclusions. Your task is to evaluate each student's conclusion.
That is, you are to decide whether the conclusion a particular student arrived at is:

1. Always True, or
2. Sometimes True or Sometimes False, or
3. Always False.

There are eight students that you will evaluate.

Step 2. Please type the lower case letter e to begin your evaluation...

Finally, just a few (easy) questions:

1. Age ____
2. Gender: M / F
3. What is your educational background?
   B.A. in ____________
   B.Sc. in ____________
   M.A. in ____________
   Ph.D. in ____________
   Other ____________
4. What program of studies are you currently registered in?
   Major ____________  Minor ____________  Year ____
5. Throughout your university career, approximately how many courses did you take in each of the following general areas?
   Math ____  Biology ____  Chemistry ____  Physics ____  Psychology ____
   Evaluation in teaching ____  Philosophy ____  Art ____  Special Ed ____

That's it! Thanks for your time and effort. It is highly appreciated.

Please return the disk and the booklet to collect your honorarium.
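The three response categories used throughout the evaluation situations (Always True / Sometimes True or Sometimes False / Always False) have definite normative answers when the discovered rule is read as a standard conditional IF p THEN q. The sketch below is ours, not part of the experimental materials; the Boolean encoding of premises and conclusions is an assumption made for illustration.

```python
from itertools import product

def classify(rule_holds, premise, conclusion):
    """Classify a conclusion as 'always', 'sometimes', or 'never' true
    across all truth assignments to (p, q) consistent with both the
    rule and the second premise."""
    outcomes = set()
    for p, q in product([True, False], repeat=2):
        if rule_holds(p, q) and premise(p, q):
            outcomes.add(conclusion(p, q))
    if outcomes == {True}:
        return "always"
    if outcomes == {False}:
        return "never"
    return "sometimes"

# IF p THEN q under the standard (material) reading.
conditional = lambda p, q: (not p) or q

# Normative answers for the four classical inference forms:
mp = classify(conditional, lambda p, q: p,     lambda p, q: q)      # modus ponens
da = classify(conditional, lambda p, q: not p, lambda p, q: q)      # denying the antecedent
ac = classify(conditional, lambda p, q: q,     lambda p, q: p)      # affirming the consequent
mt = classify(conditional, lambda p, q: not q, lambda p, q: not p)  # modus tollens
# mp and mt -> "always"; da and ac -> "sometimes"
```

Only MP and MT license a determinate ("always") conclusion; DA and AC leave the conclusion indeterminate, which corresponds to the "Sometimes True or Sometimes False" option in the booklet.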

