UBC Theses and Dissertations

Expert systems in case-based law : the rule against hearsay Blackman, Susan Jane 1988-08-25

EXPERT SYSTEMS IN CASE-BASED LAW: THE RULE AGAINST HEARSAY

By SUSAN JANE BLACKMAN
B.Sc., The University of Waterloo, 1978
LL.B., The University of Calgary, 1986

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF LAWS in THE FACULTY OF GRADUATE STUDIES (Faculty of Law)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
October 1988
© S. J. Blackman, 1988

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of LAW FACULTY
The University of British Columbia
Vancouver, Canada
Date October 14, 1988

ABSTRACT

The rule against hearsay of evidence law, and its exceptions, can be explained with a simple heuristic device. Where the circumstances surrounding the making of the hearsay statement indicate that the declarant perceived the matters reported accurately, believed and remembered what she saw when she reported it, and intended to accurately report it, the evidence appears reliable and is admissible in court. This theory is used as the basis for building an expert system to advise lawyers about admissibility of hearsay evidence. The expert whose knowledge forms the basis of this expert system is Professor M. T. MacCrimmon of the Faculty of Law at the University of British Columbia.

TABLE OF CONTENTS

Abstract
List of Tables
List of Figures
Acknowledgements

Chapter 1 INTRODUCTION TO EXPERT SYSTEMS IN LAW
1. Expert Systems
2. Algorithms and Heuristics
3. Basic Questions in the Hearsay Rule Expert System Research
4. Jurisprudential Theories and their Use for Legal Expert Systems

Chapter 2 BUILDING THE HEARSAY RULE ADVISOR
1. Technical Aspects
2. Personnel
3. The FLEX Methodology
4. Interviewing the Expert
5. Case Research
a) Scope of research
b) Master case lists
c) Key questions
d) Selection of test cases
e) Matrix
f) Revision
g) The database
h) Problems

Chapter 3 THE HEARSAY RULE ADVISOR
1. Theory of the Hearsay Rule Advisor
2. Sample Consultations
a) Example - definition and dying declaration
b) Example - business documents

Chapter 4 SELECTED ASPECTS OF EXPERT SYSTEMS IN LAW
1. Nervous Shock Advisor and Hearsay Rule Advisor Compared
2. Legal Expert Systems and Legal Theory
3. Case-Based Law and Statutes
4. Legal Expertise and the Expert
5. The Knowledge Engineer
6. The Inference Engine
7. The User Interface
8. Databases
9. Evaluation of the Expert System

Chapter 5 CONCLUSION
1. Review
2. Overview of Expert Systems in Case-Based Law
3. Issues for Future Consideration

NOTES
BIBLIOGRAPHY
GLOSSARY
APPENDIX A Structure of the Hearsay Rule Advisor
APPENDIX B Knowledge Base of the Hearsay Rule Advisor

LIST OF TABLES
Table 1 - Questions and Cases to Determine Settled Hopeless Expectation of Death for the Dying Declaration Exception

LIST OF FIGURES
Chapter 3 THE HEARSAY RULE ADVISOR
Fig. 1. Overview of the Hearsay Rule Advisor
Fig. 2. Preliminary Questions and Definition of Hearsay
Fig. 3. Dying Declarations Exception
Fig. 4. Declarations Against Interest Exception
Fig. 5. Business Documents Exception
Fig. 6. Declarations in Course of Duty Exception
APPENDIX A Structure of the Hearsay Rule Advisor
Fig. A-1. Overview of the Hearsay Rule Advisor
Fig. A-2. Definition of Hearsay
Fig. A-3. Dying Declarations - event, perceive, believe
Fig. A-4.
Dying Declarations - intend
Fig. A-5. Declarations Against Interest - event, perceive
Fig. A-6. Declarations Against Interest - believe, intend
Fig. A-7. Business Documents
Fig. A-8. Declarations in Course of Duty

ACKNOWLEDGEMENTS

Many thanks are due to Professor Marilyn MacCrimmon for the great effort she has put into the expert system, for helpful discussion of topics covered in this thesis, and for her unfailing enthusiasm. Thanks are also due to the following people: Professor J. C. Smith and Cal Deedman for answering so many questions and for their encouragement, and Susan Burak for all her work on the database and her solid support. Last but not least, many thanks to my husband, Grahame Loader, for his careful work on the diagrams and especially for his patience and encouragement during a difficult year.

Chapter 1 INTRODUCTION TO EXPERT SYSTEMS IN LAW

1. Expert Systems

An expert system is a computer program that provides answers to its users' problems in certain fields of knowledge. The system consists of rules embodying experts' knowledge and the software that processes these rules. It is thought that human experts generally do not solve problems by delving into the mass of knowledge pertaining to their area of expertise. On the contrary, they access a relatively small number of heuristics and specific domain theories which they themselves have developed and learned over years of experience, and use first principles only to solve particularly difficult problems.

Most experts ... begin by studying their specialty in school. They acquire a knowledge of the first principles and general theories that are regarded as basic to their profession. Then they begin to practice their profession. If they are lucky, they have a mentor who helps orient them to the specific practices of their profession. In any case, they gain experience, and, in the process, they recompile what they know.
They move from a descriptive view of their profession to a procedural view. If they succeed in becoming an expert, they have rearranged the knowledge in long-term memory so that they can respond to problem situations by using heuristics and specific domain theories. Practicing experts hardly ever explain their recommendations in terms of first principles or general theories. If they encounter unusual or complex problems, however, they will return to first principles to develop an appropriate strategy.1

Where programming a computer to have full understanding of all the knowledge of an expert's field is a monstrous task which would take years, programming it using only the expert's heuristic rules is a much more manageable and realistic project. Characteristic of expert systems is that they are narrow and deep. That is, rather than covering a large area of knowledge, they concentrate on small domains and explore them in great detail. Without such depth, these systems could not perform at any level of expertise. Also, because they must be deep, it becomes impractical to cover a broad area in a single system.

The term "expert system" has been defined by a pioneer and leader in expert systems research, E.A. Feigenbaum, as:

... an intelligent computer program that uses knowledge and inference procedures to solve problems that are difficult enough to require significant human expertise for their solution. Knowledge necessary to perform at such a level, plus the inference procedures used, can be thought of as a model of the expertise of the best practitioners of the field.2

Negoita defines expert systems more simply as "... software systems that mimic the deductive or inductive reasoning of a human expert ..."3 Richard Susskind defines them as:

...
computer programs that have been constructed (with the assistance of human experts) in such a way that they are capable of functioning at the standard of (and sometimes even at a higher standard than) human experts in given fields. They are used as high-level intellectual aids to their users ...4

Central to expert systems is the notion of heuristic rules. A heuristic is "[a] rule of thumb or other device or simplification that reduces or limits search in large problem spaces."5 Heuristic rules capture the heuristics the expert uses to solve a problem. "The power of a knowledge system reflects the heuristic rules in the knowledge base."6 Susskind refers to heuristics as "... unpublished, experiential, informal, judgmental, and often procedural, rules of thumb ..."7 Negoita refers to heuristics simply as "rules of good judgment."8 As an example of a legal heuristic device, Professor Lawrence Tribe describes a testimonial triangle which he developed for analysis of hearsay evidence.9 With this device, the exceptions to the rule against hearsay can be shown to possess certain attributes of reliability that other hearsay evidence lacks.

The knowledge base of an expert system consists of the facts and rules - the knowledge - which embody a specific area of expertise called the domain.10 The knowledge base also makes use of meta-knowledge, that is, knowledge about knowledge, as well as meta-rules, rules which control the use of rules. The inference engine is the computer software which can interact with a user both to ask for information and to give conclusions, and which can search through the knowledge base to generate those conclusions.11 The domain expert is the person whose knowledge becomes encoded in the rules that make up the knowledge base. The knowledge engineer is the person who actually builds the system by debriefing the expert, organizing the knowledge, and then implementing it in a computer program. Legal expert systems "...
process and provide knowledge of the law which ... is, in part, the consequence of previous intellectual operations by human beings upon legal data ..."12

According to Susskind, existing efforts to build expert systems in law "... have yielded far fewer positive results than comparable efforts in other disciplines."13 It is not clear what is meant by "positive results," but perhaps it means that these systems have not been completed or, alternatively, not become accepted and used by the legal profession.14 It is one aim of all legal expert system builders to see their systems well received by the legal profession.15 Susskind feels that it is intuitively obvious that the lack of success in the legal profession, as opposed to other fields of knowledge,

... stems from the differences between the nature of legal reasoning and the nature of other enterprises such as diagnosing illnesses, mineral prospecting, and inferring chemical structures. The latter, we generally agree, are rooted ultimately, in the empirically based, causal, descriptive laws of the natural sciences, whereas legal reasoning involves the manipulation of the prescriptive laws of the legal order, discoverable, in the main, not from uniformities or patterns in the external world but through scrutiny of the formal sources of the law.16

Also, the coverage of legal domains tends to be shallow, the systems emphasize statute law, and the "... entities thought to be necessary and sufficient for legal problem-solving have not been identified."17 Another writer suggests that the lack of success of legal expert systems is simply that they do not provide lawyers with the information they need. "It is as though they are designed to provide for a computer perception of what the Law ought to require, that is for some legal system that does not actually exist."18 This is probably because insufficient attention has been paid to legal theory.
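The division of labour between knowledge base and inference engine described above can be sketched briefly. The following is only a minimal illustration, not the M.1 shell or the HRA's actual knowledge base; the rule predicates (`is_hearsay` and so on) are invented for the example.

```python
# Knowledge base: if-then rules, each mapping a set of premises to a conclusion.
# These rules are invented placeholders, not the HRA's real rules.
RULES = [
    ({"statement_made_out_of_court", "offered_for_truth"}, "is_hearsay"),
    ({"is_hearsay", "declarant_expected_death"}, "admissible_as_dying_declaration"),
]

def backward_chain(goal, facts, ask):
    """Inference engine: try to establish `goal` by chaining backward
    through RULES, asking the user (via `ask`) for any fact that no
    rule can conclude."""
    if goal in facts:
        return True
    concludable = [premises for premises, conclusion in RULES if conclusion == goal]
    if not concludable:              # no rule concludes this goal: ask the user
        if ask(goal):
            facts.add(goal)
            return True
        return False
    for premises in concludable:     # a rule fires when all its premises hold
        if all(backward_chain(p, facts, ask) for p in premises):
            facts.add(goal)
            return True
    return False

# Example consultation in which the "user" answers yes to every question.
facts = set()
print(backward_chain("admissible_as_dying_declaration", facts, lambda q: True))  # True
```

The same knowledge base can serve many consultations; only the facts supplied by the user change, which is the separation the text describes.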
A further problem with these systems is that all their rules have to be provided in advance; thus, soon after a prototype system is implemented, its knowledge base must be updated as the law changes.19

One goal of the expert systems being built in the Faculty of Law at the University of British Columbia is to solve problems of substantive law in response to facts of cases provided by the user. An expert system of this type performs as a legal research tool of practical use to lawyers. The research also has another goal, which is more important: to test theories about the law.20 This includes not only theories of the nature of law and legal reasoning but also more specific theories about particular areas of the law. The latter of necessity requires the former. The theory can be implemented in an expert system and tested against decided cases.

... [C]omputationally-minded lawyers are interested in formalizing law, both as a way of clarifying its intrinsic nature, revealing flaws, and as a way of making law more accessible to future attempts at automating. (citation omitted)21

A related purpose of legal expert systems research is to increase the appearance of rationality of legal argumentation and, ultimately, legal decision-making. "...[A] decision is rational if it is so satisfying, that it seems not or hardly to be vulnerable from a legal point of view."22

Until recently, there was little written about expert systems in law from the legal point of view, as opposed to the computational point of view. Anne v. d. L.
Gardner's 1987 book, An Artificial Intelligence Approach to Legal Reasoning, contains some discussion of legal theory but focusses more on the computational aspects of applying principles of artificial intelligence to law.23 Susskind's book, Expert Systems in Law, also published in 1987, is the first to explore thoroughly legal theory and legal reasoning with the aim of setting out a method for building expert systems in law which is supportable from a legal point of view. The discussion of legal theory, as applied to expert systems in law found in Susskind's book, is particularly needed in a field where computer science has dominated legal expert systems without sufficient input from law.

2. Algorithms and Heuristics

In expert systems work, there are two possible approaches to solving problems, namely the algorithmic approach and the heuristic approach. An algorithm is "[a] systematic procedure that, if followed, guarantees a correct outcome."24 The only way to guarantee a correct outcome is to consider and account for all the knowledge included in the domain to which the problem belongs. The computer then must be programmed so that it possesses all this knowledge together with the meta-knowledge needed to process it. As an example, consider a system which could make guaranteed correct diagnoses of bacterial infections of the blood. This system would have to include all the theory behind medical diagnosis plus all the existing knowledge of bacterial infections of the blood. The algorithmic approach seems fine in theory, but in practice it would take years to build such a system.

An algorithm which solved legal problems in a particular domain would contain all the legal rules that might affect the solution, both statutory and case-based. It would have to include legal meta-knowledge about how to interpret the statute to derive the rule or how to obtain the ratio decidendi from a case. It would have to know how to apply the rules to the facts of the user's case.
Also, it would have to know how to predict whether a judge has discretion in any given case and how that discretion is likely to be exercised, or how the judge might otherwise decide a case which does not clearly fall under the rules. This would be a monumental task. Indeed, since legal theorists cannot agree on the proper means of describing a legal rule nor on whether a judge has any discretion and how it should be exercised, a proper algorithmic approach is impossible to produce.

There is an additional problem in that legal data is generally described and interpreted through imprecise language. Susskind suggests that legal knowledge is something promulgated, perhaps even created, by human beings.25 Scientific knowledge, on the other hand, has an objective appearance and tends to be thought of as a discovery of reality, rather than as a creation of reality. This may be the result of the precision of universally understood scientific formulae and terms. This appearance of objectivity hides the fact that science also involves interpretation and description. Nevertheless, law does have a degree of inherent uncertainty since it cannot be measured objectively in the way that scientific phenomena can. Moreover, words are a highly imprecise means of describing data compared to the numbers and measurements commonly used for scientific data. "A word is not a crystal, transparent and unchanged; it is the skin of a living thought ..."26 This means that an algorithmic approach to law would never generate certainty as could an algorithmic approach to science (at least in theory). Thus, as long as the computer cannot process natural language, it will not be able to produce a guaranteed correct solution to a legal problem even if an algorithmic approach is used.

The alternative approach to the algorithmic method is the heuristic approach. As mentioned above, a heuristic is a rule-of-thumb, a simplification or a rule of good judgment.
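The trade-off between the two approaches can be made concrete with a toy example. The case names below are real evidence-law authorities, but the relevance scores and the shortlist are invented for illustration; the point is only that the exhaustive method guarantees the best answer while the heuristic trades that guarantee for speed.

```python
# Invented relevance scores for some (real) evidence-law cases.
cases = {"Wright v Tatham": 9, "Myers v DPP": 7, "Ares v Venner": 8, "R v Bedingfield": 3}

def algorithmic_best(cases):
    """Algorithmic approach: examine every case, guaranteeing that the
    most relevant one is found."""
    return max(cases, key=cases.get)

def heuristic_best(cases, shortlist):
    """Heuristic approach: a rule of thumb that looks only at a shortlist
    of 'usually relevant' cases. Faster, but correct only when the
    shortlist happens to be well chosen."""
    return max((c for c in shortlist if c in cases), key=cases.get)

print(algorithmic_best(cases))                                  # Wright v Tatham
print(heuristic_best(cases, ["Myers v DPP", "Ares v Venner"]))  # Ares v Venner
```

Here the heuristic misses the highest-scoring case because it never examines it, which is exactly the sense in which a heuristic "does not guarantee a correct outcome."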
Webster's Third New International Dictionary defines a heuristic as:

providing aid or direction in the solution of a problem but otherwise unjustified or incapable of justification ... [specifically] of or relating to exploratory problem-solving techniques that utilize self-educating techniques (as the evaluation of feedback) to improve performance ...27

One writer characterizes legal reasoning as being essentially heuristic as opposed to algorithmic.

Law makes use of a naive way of thinking often called heuristic reasoning. The future development of legal research and retrieval and of Law itself depends upon the possibility of rationalising heuristics.28

In comparison, legal theorists usually consider legal reasoning to be reasoning by analogy on a case-by-case basis, or reasoning using deductive logic. In particular, it is a matter of continuing debate as to what role deductive logic plays in legal reasoning.29 Possibly, case-by-case reasoning can be likened to heuristic (or "naive") reasoning, while deduction can be likened to algorithmic (or "rational") reasoning.

Regardless of varying definitions of the word heuristic, the term "heuristic rule" is used in a narrow sense both in the pioneering work on MYCIN, a medical diagnosis system,30 and by Susskind in the Oxford Project on Scottish Divorce Law.31 MYCIN contains two kinds of rules. The first are ordinary rules containing knowledge about the medical domain. The second are rules about these rules - that is, meta-rules. They "contain information about rules and embody strategies for selecting potentially useful paths of reasoning."32 An example is the following: "If the patient has had a bowel tumor, then in concluding about organism identity, rules that mention the gastrointestinal tract are more likely to be useful."33 This one heuristic rule substitutes for a great deal of knowledge which, in an algorithmic approach, would have explained why this kind of rule is likely to be more useful.
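The MYCIN meta-rule just quoted can be sketched as code. The object-level rules below are invented placeholders, not MYCIN's real rules; the point is only that the meta-rule reorders the object-level rules rather than drawing any medical conclusion itself.

```python
# Object-level rules, each tagged with the body system it mentions.
# Contents are invented placeholders for illustration.
rules = [
    {"id": 1, "topic": "skin", "conclusion": "staphylococcus"},
    {"id": 2, "topic": "gastrointestinal", "conclusion": "bacteroides"},
    {"id": 3, "topic": "gastrointestinal", "conclusion": "e-coli"},
]

def meta_rule(rules, patient):
    """Meta-rule: if the patient has had a bowel tumor, rules mentioning
    the gastrointestinal tract are more likely to be useful, so try them
    first. Note that no object-level rule is added or removed."""
    if "bowel_tumor" in patient["history"]:
        # sort is stable: gastrointestinal rules move to the front,
        # relative order is otherwise preserved
        return sorted(rules, key=lambda r: r["topic"] != "gastrointestinal")
    return rules

ordered = meta_rule(rules, {"history": ["bowel_tumor"]})
print([r["id"] for r in ordered])  # [2, 3, 1]
```

This is the sense in which a meta-rule "controls the use of rules": it guides the search without itself embodying domain knowledge about organisms.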
As in the MYCIN system, the Oxford Project defines legal heuristics as rules to control search strategies.34 The heuristic approach does not guarantee a correct solution to a problem, but a heuristic system will take much less time to build. Therefore, it is by far the more practical of the two.

3. Basic Questions in the Hearsay Rule Expert System Research

This thesis is designed to answer two general questions:
a) Is there a single unifying theory which can explain the exceptions to the hearsay rule, including hard cases? If so, what is it?
b) Can this theory be used as the basis for an expert system that will predict whether hearsay evidence will be admissible in court? If so, how?

Previous work at the University of British Columbia by Cal Deedman and Professor J.C. Smith of the Law Faculty produced the Nervous Shock Advisor (NSA) - an expert system dealing with case law in the area of nervous shock.35 This work showed that rule-based expert systems in case-based law can be built to solve both clear and hard cases, but not on the basis of traditional legal theory. One reason for repeating the exercise using a different domain expert, a different knowledge engineer, and a different area of law was to confirm that the earlier success was not unique. The differences between the NSA and the Hearsay Rule Advisor (HRA) will be discussed below. Other existing legal expert systems have been reviewed elsewhere.36 Most of these expert systems deal mainly or exclusively with statutory law. In contrast, the NSA and the HRA deal almost exclusively with case law.

4. Jurisprudential Theories and their Use for Legal Expert Systems

Conflicting theories about the nature of law and legal reasoning underlie all areas of substantive law. Therefore, it is impossible to build a legal expert system without at some point confronting legal theory.

... [A]ll expert systems in law necessarily make assumptions about the nature of law and legal reasoning.
To be more specific, all expert systems must embody theories of legal knowledge, legal science, the structure of rules, the individuation of laws, legal systems and sub systems, legal reasoning, and of logic and the law (as well perhaps as elements of a semantic theory, a sociology, and a psychology of law), theories that must all themselves rest on more basic philosophical foundations. If this is so, it would seem prudent that the general theory of law implicit in expert systems should be explicitly articulated using (where appropriate) the relevant works of seasoned theoreticians of law. Perhaps the reason that there is, as yet, no overwhelmingly successful system is that the vast corpus of apposite jurisprudential material has not yet been tapped in the construction process.37 (original emphasis)

Elsewhere, Susskind adds "[i]t is unduly restrictive to think that building expert systems in law is simply about computerizing legal reasoning: legal knowledge engineering reaches into the very core of jurisprudence and philosophy."38

It is immediately apparent to the student of jurisprudence that there is little agreement among legal theorists as to the nature of law and legal reasoning. Positions vary between the two extremes of legal mechanism - the view that all law is made up of rules - and of rule skepticism - the view that law contains no rules whatsoever.39 Most legal theorists fall somewhere in between these two extremes. The current leading theories can be generally described. Perhaps the most well known is the position of the positivists. According to Dworkin, positivism has three key tenets, the first two of which are relevant to this work.

(a) The law of a community is a set of special rules used by the community directly or indirectly for the purpose of determining which behavior will be punished or coerced by the public power.
These special rules can be identified and distinguished by specific criteria, by tests having to do not with their content but with their pedigree or the manner in which they were adopted or developed. These tests of pedigree can be used to distinguish valid legal rules from spurious legal rules (rules which lawyers and litigants wrongly argue are rules of law) and also from other sorts of social rules (generally lumped together as 'moral rules') that the community follows but does not enforce through public power.

(b) The set of these valid legal rules is exhaustive of 'the law', so that if someone's case is not clearly covered by such a rule (because there is none that seems appropriate, or those that seem appropriate are vague, or for some other reason) then that case cannot be decided by 'applying the law.' It must be decided by some official, like a judge, 'exercising his discretion,' which means reaching beyond the law for some other sort of standard to guide him in manufacturing a fresh legal rule or supplementing an old one.40 (original emphasis)

Thus, the positivist tradition holds that law consists of a body of rules supplemented where necessary by judicial discretion. Clear cases are decided by applying these legal rules. Many cases are not clear, however, and in such a situation, the facts of a case do not bring it under any one rule. It is said to be a hard case. According to the positivists, hard cases must be decided by an application of judicial discretion. Strictly speaking, unless we can predict how a judge likely will exercise this discretion, the outcome in a hard case is unpredictable.

American legal realism is a second major jurisprudential theory. Realists may also accept that clear cases come under some legal rule and may be decided in accordance with that rule.41 However, the following description shows that the true emphasis of realist thought is not on rules but on goals:

1. Law is not a set of general axioms or conceptions but is "...
a body of practical tools for serving specific substantive goals."
2. Law is not autonomous and self-sufficient but is "... merely a means to achieve external goals that are derived from sources outside the law ..."
3. "... [A] particular use of law cannot be a self-justifying 'end in itself.' Uses of law can be justified only by reference to whatever values they fulfill."
4. The law serves "... generally instrumental values rather than intrinsic ones. ... That is, law's function is to satisfy democratically expressed wants and interests, whatever they may be (within constitutional limits)."42

Therefore, in hard cases extra-legal considerations dictate the outcome. Thus, for realists, decisions in hard cases (and often in clear cases) are unpredictable, unless there is some means of predicting the involvement of these extra-legal considerations.

There is a third position which is deserving of mention. Although Dworkin follows the positivist tradition and agrees that clear cases are governed by a legal rule, he maintains that a new element, a "principle," is used to decide hard cases. In hard cases the judge must appeal to basic principles found in the law and weigh them to reach a decision.43 Thus, for Dworkin, the outcome of a hard case is potentially predictable. Nevertheless, his theory provides no criteria for guidance in balancing principles; therefore, it cannot generate rules for predicting outcomes in hard cases.44

One basic assumption must underlie all expert systems in law: legal decisions are rule-governed. A computer cannot reach any solution to a problem unless it is provided with a rule or a method which it can follow. If an expert system is to give advice in response to any given set of facts there must exist a most reliable prediction of what the court will decide, and this prediction must be reached according to some rule which can be implemented in the system.
A legal research tool that purports to give an answer is of little use if no answer can really be known before the court pronounces its decision. Legal experts do predict outcomes, even for hard cases, so the basic assumption is reasonable. As discussed above, the traditional theories of the nature of law do not provide a rule-governed means of predicting outcomes in hard cases.45

In the present study, the rules dealt with are quite different from the legal rules discussed by positivists or learned by a law student in substantive law courses. The NSA and HRA are based on deep structure heuristic rules, which may be gleaned from an intensive factual analysis of cases in light of the goals of the law. The idea that law has a deep structure is suggested in comments by various writers.46 The following account briefly summarizes the analysis of Coval and Smith in their book, Law and its Presuppositions. The authors begin with a basic assumption that law is about actions performed by agents. To perform an action in a standard way an agent must have certain abilities. These are:

(1) the ability to evaluate the truth of empirical propositions;
(2) the ability to reckon, which includes prediction, numerateness and logic and is not separable from the first ability;
(3) the capacity of having goals, where this will mean that our needs and desires function in some way as background to our goals;
(4) the (at least partial) ability to choose among these desiderata by ranking them according to their consequences and relative desirability, thus using (1) and (2), and thereby forming plans and having resolves;
(5) the ability to set in motion, with one's body, events which tend to accomplish these objectives.
An agent then is a sentient, reckoning, goal-oriented, physically effective system.48 (original emphasis)

Rights, social practices, morality, "rule-governedness," and, ultimately, law have all been developed to protect and enhance these characteristics of agency and to allow the agent to function in the world. Rules and laws exist to serve two kinds of goals:

(a) (i) certainty, which requires decisiveness, clarity, publicity and authority (ii) predictability, which requires consistency, impartiality and truth, (i) and (ii) being the epistemological contributions of rules, and
(b) the basic rights of agency ... and other common interests of agents such as peace, physical and economic well-being, respect, love, security and privacy.

These goals protect and promote the abilities of agents to perform actions (quoted above). The a goals of certainty and predictability can be described as form or rules of practice, while the b goals concerning the basic rights of agency are substance or the content of the practice. The a goals cause the b goals and form necessary and sufficient conditions for b goals.

Problems arise when goals appear to conflict with each other. For example, a conflict occurs when an a goal is satisfied with the intention of causing one b goal, but it also unintentionally causes another, unwanted b goal. The solution in this case is to say that the causal connection between a and b is not functioning.50 For instance, when one says "I promise to do x" one has followed the a goals to invoke the social practice of promising. If something untoward then intervenes and one does not do x because it will cause an unintended b goal it is not the case that there was no promise. On the contrary, there was a promise but it has been broken. The causal connection between a and b has broken down.
The a goal (the practice or form) stands unchanged but its effects are erased.51 The substantive law of torts involves a whole collection of b goals, such as the prevention of physical harm, the protection of private property and the protection of reputation. The above discussion has described why a and b goals do not conflict but one must examine what happens when b goals come into conflict with each other. In such a case the legal precedents can show how the goals are ordered, that is, which one takes priority over others in what circumstances. If there are other cases in which the same goals were involved and that have relevantly similar facts, then the present case should be decided in the same way. It must be emphasized that the goals considered here are the goals or policy of the law discoverable through an analysis of case law. They are not societal goals which might be discovered through an analysis of the intentions of political actors or through a sociological investigation.52 Analysis of existing cases produces a set of relevant facts which can be used to generate a heuristic rule, such as the following, that reconciles conflicting legal authorities on the principle of remoteness in tort law: Physical damage to persons or property resulting from a negligent action are not too remote if they are reasonably foreseeable in the particular, or are one of a reasonably foreseeable class of injuries.53 What is reasonably foreseeable in the particular, or is of a reasonably foreseeable class, is discoverable in the case law. 
One of the reasonably foreseeable classes is: "Nervous shock resulting in some form of incapacity, suffered as a result of an injury or risk of injury to oneself or someone else is not necessarily too remote."54 Of course the words "not necessarily" indicate a limitation and further examination of the case law yields another heuristic: Nervous shock resulting in some form of incapacity is too remote unless it is inflicted on a near family relative as a result of witnessing or coming upon the immediate or near aftermath of a serious accident involving a family member.55 This heuristic rule underlies the Nervous Shock Advisor. This kind of rule reconciles all the decided cases. The goals of the law become useful in predicting the outcomes of new cases for which there are no closely matching precedents. In new cases, the result will probably maintain the ordering of goals found in the old cases. Thus, by using the goals of the law, outcomes are predictable, even in hard cases.

The above discussion suggests that there are two possible approaches to the legal aspects of an expert system. The first is one which focusses on the legal rules found in a particular domain. This kind of system would contain recognizable legal rules and concepts (doctrine) and would ask the question: "What is the law of this domain and how does it apply to the facts of this case?" On the other hand, an expert system may focus instead on the facts of the case. This kind of system would contain rules which may (or may not) look like legal rules and which tie the facts together in accordance with a model of the legal domain. This system would ask the question: "What are the facts of this case and what are the legal consequences which flow from these facts?" The former approach shall be called the doctrine-based approach and the latter, the fact-oriented approach.

Chapter 2
BUILDING THE HEARSAY RULE ADVISOR

1.
Technical Aspects The HRA is being built using a commercial expert system shell - M.1 (formerly available from Teknowledge Inc. of Palo Alto, California).56 The knowledge in the knowledge base is stored as a set of if-then rules. Questions are asked of the user to learn facts and a goal is provided to end the consultation with the user. Rules are searched by backward chaining until that top-level goal has been satisfied. The database of cases that will support the system's conclusions is being created as several separate files using dBase III Plus - a product of Ashton-Tate.57 Details of cases are stored in a case file by citation and a reference number. Profile sheets also containing citations and reference numbers are stored in separate files. (Profile sheets are described more fully below.) The consultation with the user generates a set of facts which form a profile of the user's case. This profile is compared to the profiles of the cases in the database to find cases which match (cases on point), or cases which vary in some particular detail from the user's case (e.g. contra cases or cases relevant by analogy). In future, the system will also be able to search the case file for cases by date or jurisdiction, and will alert the user to leading cases of which she or he should be aware. The database for the NSA was originally not built with a database management program but simply with a word processor. Additional features of that version of NSA which will probably not be implemented for the HRA are: the ability to provide the defenses that might be available on the user's facts and the full text of leading cases. The former is not useful for the HRA and the latter would be a time-consuming luxury. 
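The control scheme just described (if-then rules searched by backward chaining until a top-level goal is satisfied) can be sketched in miniature as follows. This is an illustrative reconstruction in Python, not the M.1 knowledge base itself; the rule and fact names are invented:

```python
# Each rule: (conclusion, list of conditions). Backward chaining tries to
# prove a goal by recursively proving the conditions of a rule that
# concludes it, bottoming out in known facts.
RULES = [
    ("exception-applies", ["event-ok", "perception-ok"]),
    ("event-ok", ["declarant-dead"]),
    ("perception-ok", ["declarant-observed-event"]),
]
FACTS = {"declarant-dead", "declarant-observed-event"}

def prove(goal, rules=RULES, facts=FACTS):
    """Return True if the goal is a known fact, or if the conditions of
    some rule concluding the goal can all be proved in turn."""
    if goal in facts:
        return True
    return any(all(prove(condition, rules, facts) for condition in conditions)
               for conclusion, conditions in rules if conclusion == goal)
```

In the real system, a condition that no rule concludes would trigger a question to the user rather than a lookup in a fixed fact set; that interactive step is omitted here.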
The knowledge base and the database will be linked by a program being written in the C programming language.58 This will require the addition of rules to the knowledge base to provide the necessary information that must be sent across the link to be used in searching the database files. The link will provide at least two menus of choices for the user. The first menu will allow selection of the category of cases to view (e.g. cases on point, cases contra). The second will provide case names. The link will also handle all the search tasks for the database. All these files are being developed on an IBM PC XT with a hard drive and two double-sided floppy disk drives. The completed system will run on an IBM PC.

2. Personnel Involved

This expert system embodies the knowledge of one expert in the law of evidence - Professor M. T. MacCrimmon of the Law Faculty at the University of British Columbia. The knowledge engineer building the system is myself. Professor J. C. Smith has provided expertise in legal theory. In addition, we have had the help of a number of student research assistants and technical and other advice from various people. Building this kind of expert system gives dominant roles to an expert and a knowledge engineer but without this small army of other workers it would never come into being.

3. The FLEX Methodology

In building the NSA, the team of Deedman and Smith worked out the following methodology for building legal expert systems which they call FLEX (Fast Legal EXpert):
1. Define a narrow and deep universe of discourse for the advisor.
2. Build a data base by gathering all the case law which falls within the universe of discourse as far back in time as desired, and within the selected jurisdictions. Make a short brief of each case.
3. Select the goal of the advisor.
4. Analyze the cases in terms of the traditional doctrine as expounded in the cases and in the legal texts.
5.
Organize the case data base according to the initial analysis in terms of traditional doctrine.
6. Ascertain to what degree the traditional analysis can account for the decided cases.
7. If the traditional analysis can, in general, account for most of the decided cases, isolate the 'hard' cases and seek underlying patterns which might account for or explain them.
8. If the traditional analysis cannot account for the decided cases, develop a second analysis which is independent of the conceptual terms of the traditional analysis, in terms of which the data base of cases can be explained.
9. Formulate the second analysis into deep structure rules or principles which would appear to explain that pattern of decisions.
10. Divide and analyze the data base of cases in terms of the second analysis.
11. Restructure the analysis and the deep structure rules until you are left with only a few random lower level decisions which may not fit the analyses. These can be assumed at this point to be wrongly decided.
12. Re-examine the cases in the data base and modify the briefs so that they include the relevant factors of the final analyses.
13. When this is done the expert then conceives of possible hypothetical examples which might arise, decides which to link into the system, and connects them to the cases which could be argued to be similar by analogy.59

This methodology forms the starting point for our work.

4. Interviewing the Expert

Following is an account of how the debriefing sessions actually proceeded. This is not a recommendation of how it should be done in other circumstances. The FLEX methodology was followed, though not strictly. In fact there were no cases to work with until later on in the development of the system. FLEX methodology recommends working on cases early in the process. Nevertheless, in hindsight, this initial lack of cases does not appear to have hampered work on the hearsay rule.
Prior to the first session, the domain expert provided the knowledge engineer with copies of two papers on hearsay evidence to read60 and made available a computer-assisted instruction program on hearsay on which she has been working. The first session started by defining the purposes of the system. These were agreed to be: 1) To advise lawyers 2) To teach students 3) To examine the theory underlying the exclusion of hearsay evidence. It was also agreed that the scope of the project would be hearsay evidence. That covers steps 1 and 3 of FLEX. In addition, the major issues which must be addressed in determining whether evidence is hearsay were outlined. One possible approach to the exceptions was briefly discussed, and the potential of an analysis in terms of reliability of evidence when produced in the circumstances set out by the exceptions was noted.61 All this was done in an introductory fashion only and took little more than one hour. After that the focus was on the questions the system might ask a user. The second session involved discussion of a test case which raised difficult issues of defining hearsay. The facts were varied slightly to create a credible test case for the dying declaration exception to the hearsay rule. After this, flow-charts representing a possible computer approach to the definition of hearsay and the dying declaration exception were drawn up, and the first possible rules to put into the expert system were drafted. By agreement, the third session was spent on the declarations against interest exception to the hearsay rule. 
Some test cases were briefly discussed, but more time was spent in finding out the traditional legal conditions that must be met for a statement to be a declaration against interest, and then translating those into factual questions that could be asked of the user.62 After this, three separate expert systems were begun: one for the definition of hearsay and one each for dying declarations and declarations against interest. Beyond this point (6 more sessions), flow-charts and the emerging systems were discussed. Questions for the user were refined, deleted and added and problem areas were addressed or noted as requiring future work. The test case approach was abandoned in favour of discussing either topics and questions in a more abstract fashion or specific problems in the developing systems. Towards the end of this period the three systems were combined. One session was also spent discussing evidential reasoning in a more general way using Marvin Minsky's book The Society of Mind.63 This session probably had no immediate significance for the evolving system, but it provided a highly valuable perspective on how the expert thinks. To this point all sessions took sixty to ninety minutes and focussed on the exceptions as individual, mutually exclusive categories. After this, a break provided both parties with an opportunity for reflection. Up to this point, steps 4 through 8 of FLEX had been covered. After the break, on taking stock of the past work, it became clear that the approach to that point would produce a system which could not solve hard cases. In other words, this approach would not fill the gaps between the mutually exclusive exceptions. However, judges do not necessarily treat the exceptions as mutually exclusive and, moreover, may be willing to broaden old exceptions or create new ones.64 Thus some gaps can be filled.
An additional problem with the emerging system was that it seemed inefficient and cumbersome (although differences in speed on the computer are really negligible). It would be relatively inflexible and, therefore, difficult to change its knowledge base with changes in the law. Also, it had the rather annoying characteristic of proceeding through a linear chain of five or six questions. Where the user's facts were distinguished on the last question, the preceding questions in the chain immediately became irrelevant and, thus, were a total waste of time.65 Clearly the solution to these problems was to find common ground among the exceptions and this required looking at more than one at a time. The first attempt along these lines began with extracting all the legal conditions for the exceptions to apply from Cross on Evidence.66 Some translation into more factual terms was done, particularly where an exception had been previously covered. On studying this list, it became clear that some conditions were subjective, that is, they directly dealt with the state of mind or intention of the declarant of the hearsay statement. Others were more objective, that is, they dealt with circumstances external to the declarant's mind. A few objective ones had a subjective aspect to them, since courts sometimes are prepared to make inferences as to the declarant's state of mind from objective circumstances.67 This approach was fairly quickly replaced with a reliability analysis following Tribe and Friedman (the details of the theory are discussed in Chapter 3 below). The facts which characterize each exception were taken from Wigmore on Evidence68 and fitted into the reliability theory. This initial analysis was followed up by detailed analysis of cases and refinement of the system and its questions to fit the facts of the cases. This covered steps 9 through 12 of FLEX and finally brought in step 2.
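The reliability analysis just mentioned can be pictured as a fixed set of component conclusions that every exception must establish: the qualifying event, accurate perception, belief and memory, and intention to report accurately (the components used throughout the figures in Chapter 3). The sketch below is a simplification for illustration only; the condition names are placeholders, not the system's actual rules:

```python
# Reliability components each exception must establish before the
# exception as a whole is concluded.
COMPONENTS = ("event", "perceive", "believe", "intend")

def exception_established(findings, exception):
    """An exception applies only when every reliability component has
    been separately concluded for it during the consultation."""
    return all(findings.get((component, exception), False)
               for component in COMPONENTS)
```

A consultation then amounts to accumulating (component, exception) findings from the user's answers until some exception is fully established or all fail.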
All during this time, the system was kept up to date so that parts of sessions could be spent reviewing the system. The switch to a reliability analysis had little impact on the questions asked of the user, but it made large changes in the control rules of the system. Unlike the early sessions, these later sessions included a great deal of discussion of theory as opposed to facts of cases. For instance, methods of reasoning from evidence to facts in issue were discussed. In this respect a comparison with the work of Binder and Bergman was useful.69 In their analysis of facts for litigation purposes, they provide a means of generalizing from a specific piece of evidence to a fact in issue. Thus, for example, the piece of evidence - the declarant's serious injuries - can be generalized to the fact in issue - the declarant knew he was dying - as follows: People who have serious injuries know that they are dying, especially when there are multiple wounds and lots of external bleeding, and except when there are one or a few wounds and little external bleeding. The 'especially when' and 'except when' clauses can be filled in from case law or from common-sense experience. These then suggest factors to watch for in analysing cases. Thus the Binder and Bergman method provided some guidance. Nevertheless, although this method would have provided a more detailed and rigorous analysis, there was a strong tendency to skip this and simply apportion pieces of evidence to their appropriate facts on the basis of the expert's knowledge or common-sense. Some discussions were held with the builders of the NSA to take advantage of their experience. Also, some time was spent reviewing the performance of the NSA and discussing the goals of the law of nervous shock. As more cases became available the sessions became increasingly concentrated on the facts of these cases as the system was revised so that it explained as many of the decided cases as possible.
This occupied the last sessions and will continue to be the major focus in the future. An additional issue to address in future is the place of certainty factors in the system. Step 13 of FLEX remains to be performed. Generally a schedule of one or two sessions a week, usually two, was followed. Several breaks were necessary primarily due to other demands on the time of both parties. Later sessions became longer as they became more interesting but did not go over two hours. Records kept of this procedure include the following:
1/ Notes of the sessions with the expert,
2/ Copies of flow-charts and diagrams,
3/ Printouts of the system at each stage of development whenever a significant change was made.
This kind of documentation allows a review of past work or a return to an earlier version of the system if the path being pursued turns out to be unproductive. All records should be dated or it becomes difficult to look back and review progress. It is now clear that the goals of the system must be revised. Since the definition part of the system involves additional issues which are outside the scope of this project, the exceptions have become the main focus. That means that the third goal articulated in the first session has changed and now reads: To examine the theory underlying the admission of otherwise unacceptable hearsay evidence.

5. Case Research

For the most part, law students were hired to research cases for the system. These students were allowed to approach the problem any way they pleased. Nevertheless, the procedure set out below was usually followed.

a) Scope of research - Initially the students were told to focus only on the appeal court cases in the belief that there would be plenty of those and that they would be the most useful. This proved to be a mistake. Appeal cases usually contain the authoritative statements of the law, but for this kind of system, it is facts which are most important and these tend to be found in the trial judgments.
Moreover, contrary to expectations, there are far too few appeal cases to provide a good analysis. Thus the search had to be expanded to include reported cases from all court levels. Unreported cases were generally avoided; however, some of these were researched for the business documents exception to the rule against hearsay. In addition, the search was initially confined to Canadian cases only. This also proved to yield too few cases in some instances, so English cases were added. Cases from elsewhere in the Commonwealth have not been included. Research was divided among the students along the traditional exceptions to the hearsay rule. Thus one student researched all of one exception at one time before going on to another.

b) Master case lists - The expert for the project is a professor of evidence at the University of British Columbia and was able to provide the students with materials she had prepared for her evidence class which contained the leading cases in the area. Additional cases were gleaned from textbooks, digests, and abridgements. These sources were used to make up master lists of cases to be read. All sources used were noted so that future researchers do not cover the same territory.

c) Key questions - Work on the system preceded most of the case research. Therefore, a list of the questions which the system was asking could be provided to the students. This guided them in reading and summarizing the cases, and was found to be necessary in spite of the danger of biasing their interpretation of the cases. A printed page containing these questions was attached to each written case summary and the questions were answered on this page. This provides the first profile sheet of a case. Students were asked not to confine themselves to this profile sheet, but to watch for additional facts not included on the sheet which seemed important in the cases.
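A profile sheet of this kind amounts to a fixed set of question fields with coded answers, and comparing a user's case against stored profiles is a field-by-field match. The following sketch uses in-memory dictionaries and invented field names purely for illustration; the actual system keeps its profile sheets and briefs in dBase III Plus files linked by reference number:

```python
# Hypothetical profile sheets: reference number -> coded answers.
PROFILES = {
    1: {"declarant_dead": "Y", "settled_expectation": "Y"},
    2: {"declarant_dead": "Y", "settled_expectation": "N"},
}
# Case briefs, cross-referenced by the same reference numbers.
BRIEFS = {1: "Brief of case 1 ...", 2: "Brief of case 2 ..."}

def match_cases(user_profile, profiles=PROFILES):
    """Return (cases on point, near misses): stored profiles matching
    the user's answers on every field, or on all but one field
    (the latter including, e.g., contra cases)."""
    on_point, near = [], []
    for ref, profile in profiles.items():
        differences = sum(profile[field] != answer
                          for field, answer in user_profile.items())
        if differences == 0:
            on_point.append(ref)
        elif differences == 1:
            near.append(ref)
    return on_point, near
```

A matched case would then be displayed to the user by looking up its brief through the shared reference number, e.g. BRIEFS[ref].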
d) Selection of test cases - Each case on the master list was assigned a number and 20% of the total cases for any one exception were randomly selected to be saved for possible use later to test the system. These cases were not included in the analysis that follows. The random number tables used were taken from a psychology textbook.70

e) Matrix - A two-dimensional matrix proved to be most useful for analyzing the cases (see the example in Table 1). The coded questions identify the columns of the matrix and the numbers of the cases identify the rows. Each cell of the matrix contains the answer to the coded question that was evident in the case. These answers are usually Y (yes) or N (no), sometimes also U (unknown) and, more rarely, a number signifying a choice. A final column indicates the result in the case. The cells are then examined to see if all or most cases with the same facts have the same result. If not, then the questions must be reworded. If so, then the profile sheet can be entered into the database. An example of this matrix is included in Table 1. In this form, it explains most of the dying declaration cases, though some stand out as not fitting the pattern (for example, case no. 44).

Table 1. Questions and Cases to Determine Settled, Hopeless Expectation of Death for Dying Declaration Exception

NO.a INJb DEC-BELc DEC-TOLDd A-De BY-BELf SHEg VERDICTh
6 1 1 N U Y GUILTY
27 1 1 N O Y GUILTY
22 1 N N O Y GUILTY
31 1 N N O Y GUILTY
18 2 3 Y A DR Y U
19 1 2 Y A DR Y GUILTY
9 2 2 N DR&O Y U
13 1 1 N U Y GUILTY
15 1 1 Y A DR&O Y GUILTY
25 1 N Y DR Y GUILTY
3 2 1 Y A DR Y GUILTY
1 2? Y A U Y U
24 2 1 N U Y NOT G
8 3 Y A DR Y GUILTY
20 3 1 Y A DR&O Y U
26 3 1 Y A DR Y GUILTY
40 1 1 N O Y GUILTY
30 2 1 N DR&O Y NOT G
41 3 1 N Y Y NOT G
43 2 1 N U Y GUILTY
45 1 1 N O Y GUILTY
47 2 1 Y DR&O Y GUILTY
10 3 1 Y A DR&O Y U
48 2 1 Y DR&O Y GUILTY
54 3 1 Y A DR Y GUILTY
51 3 1 N U Y GUILTY
57 2 1 N U Y GUILTY
46 2 1 N DR&O Y GUILTY
56 1 1 Y A DR Y GUILTY
55 3 1 Y? U Y NOT G
4 3 2 N U N U
21 1 N N DR&O N? GUILTY
34 2 N N N N U
11 2 2 N O N U
29 2 2 N U N NOT G
5 2 2 N U N GUILTY
14 1 N N U N U
39 3? 3 U U N NOT G
7 2 N N N N GUILTY
35 2 N N U N U
16 3 2 N U N NOT G?
37 2 3 Y D DR N NOT G
36 3 2 N N N NOT G
38 3 N Y A DR N NOT G
42 3 2 N DR&O N NOT G
23 2 N N U N U
44 1 1 Y A DR N GUILTY
52 1 N N U N? NOT G
50 3 N N DR N NOT G
23 2? N N O N NOT G?
49 2 1 N O N GUILTY
53 2 1? N U N NOT G

a - Case number.
b - Type of injuries: 1 - multiple injuries, heavy bleeding, rapid death; 2 - one injury, little bleeding, slower death; 3 - other, e.g. complications from an illegal operation, poison, drowning.
c - Declarant's belief that she was dying: 1 - she stated positively she was dying; 2 - she said she was dying but qualified it; 3 - she otherwise indicated she believed she was dying; N - no indication she believed she was dying.
d - A doctor told the declarant she was dying: Y - yes; N - no.
e - The declarant acknowledged the doctor telling her she was dying: A - agreed; D - denied.
f - Bystanders believed the declarant was dying: DR - doctor believed; O - others believed.
g - Court concluded settled, hopeless expectation of death: Y - yes; N - no.
h - The verdict in the case.
Throughout, U = Unknown.

f) Revision - If the profile sheet does not explain all the cases, then questions must be reworded or questions must be added. This requires a review of all the cases to find the answers for a reworded question or to pick out additional facts that may be important. For instance, an early question that was asked for the dying declarations was whether the declarant had serious injuries. This required only a "yes" or "no" answer. On examining the matrix, it became apparent that sometimes a "yes" answer went with admissibility, but sometimes it did not.
Therefore the question was reworded so that the user was given a choice of three types of injuries which vary according to severity. On changing the necessary cells in the matrix, it became apparent that a type 1 injury (multiple wounds, heavy bleeding, rapid death) is almost always admissible. A type 2 or 3 injury may or may not be admissible depending on other circumstances. This kind of revision must be repeated until as many of the cases as possible become explainable. Cases which cannot be fitted into the matrix are perhaps wrongly decided.71 After the last revision, the profile sheet can be finalized, the database can be set up, and the wording of the questions in the system can be settled.

g) The database - The database contains two kinds of files. The first is a case file and contains the briefs of all the cases. Briefs prepared by students under supervision are used rather than full texts, simply because the latter would take so long to enter into the database. In addition, in evidence cases the evidentiary issue may be only a small part of a very long case. For the HRA, briefs tend to provide fairly full accounts of the facts of the cases and a short summary of the law according to the judge. Students are not required to quote rigorously the actual words of the judge; nevertheless, they tend to do so. The second file is a profile sheet file. There is one such file per exception and one profile sheet per case. The profile sheet file contains fields corresponding to the questions asked by the system. The fields are filled with the answers for the particular case. Each profile sheet has a reference number and the name of the case so that it can be cross-referenced to the large case file and the brief can be found. It is the brief that is to be displayed for the user, not the profile sheet.

h) Problems - The most vexing problem in relation to case research has been a dearth of facts in the case reports. This makes some cases quite useless.
These can be ignored depending on the jurisdiction, but appeal cases cannot be so treated. For example, one of the cases on dying declarations is R. v. Debortoli, a case of a dying declaration decided on appeal by the Supreme Court of Canada.72 This case was impossible to fit into the matrix because of a lack of facts in the judgment, which gave a very distorted view of the case. On retrieving the Crown brief of the trial case from the B.C. Provincial Archives, many more facts were found which could be used to bring the case into line with the others in the matrix. Selectivity of written judgments with respect to facts may be a real problem. One hopes that judges always record the facts which influence their decisions, but there is no solid basis for believing that they do so. With respect to this problem, the old English hearsay rule cases tend to be much more skeletal in their facts. This varies with the reporter.73 More modern cases with written judgments tend at least to appear more complete. If a case such as R. v. Debortoli has no value as a precedent, possibly it could be ignored and left out of the database. However, this particular case cannot be so treated since it is a decision of Canada's highest court. Therefore, a note will be included with it in the database showing the facts obtainable from the Crown brief. Another problem in case research is that admissibility of evidence problems tend to be reported in isolation, usually because they are decided in a voir dire. Thus, there is rarely any means of finding out what other evidence was present in the case. Yet, if there is other evidence already admitted on which a judge or jury may render a decision in the case, hearsay evidence will be excluded, even if it satisfies an exception, because its trustworthiness cannot be tested.74 This follows from the best evidence principle, discussed below in Chapter 3.
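The test applied at the revision stage described above, whether all cases sharing a pattern of answers share a result, can be mechanized directly from the matrix. The rows below are invented stand-ins, not entries from Table 1:

```python
from collections import defaultdict

# Each row of the matrix: (answers to the coded questions, result).
ROWS = [
    (("1", "1", "N"), "GUILTY"),
    (("1", "1", "N"), "GUILTY"),
    (("2", "2", "N"), "NOT G"),
    (("2", "2", "N"), "GUILTY"),   # conflicts with the row above
]

def inconsistent_patterns(rows):
    """Return the answer patterns whose cases do not all share one
    result; these signal questions that need rewording or splitting."""
    results_by_pattern = defaultdict(set)
    for answers, result in rows:
        results_by_pattern[answers].add(result)
    return [pattern for pattern, results in results_by_pattern.items()
            if len(results) > 1]
```

Each pattern flagged by this check corresponds to a question that, like the original "serious injuries" yes/no question, must be refined until same-fact cases fall together.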
Although the FLEX methodology was not followed precisely, most of the steps were satisfied in roughly the same order as they are set out. Ideally one would begin the exercise with an existing data base of cases. Nevertheless, lack of cases turned out not to delay development of the system. In reality, case analysis and development of the system must go hand in hand. One problem with working from an existing database is that it would not have been created with expert systems purposes in mind and might not have been useful. In normal case analysis the emphasis is on judges' statements of the law. For expert systems purposes it is the facts which are of interest.

Chapter 3
THE HEARSAY RULE ADVISOR

1. Theory of the Hearsay Rule Advisor

A hearsay statement is a statement offered in evidence to prove the truth of the matter asserted but made otherwise than in testimony at the proceeding in which it is offered. A statement is an oral or a recorded assertion and includes conduct that could reasonably be taken to be intended as an assertion.75 As a general rule, subject to many exceptions, hearsay evidence cannot be used in court. This is partly because the evidence is not made under oath in circumstances where the trier of fact can observe the witness and assess her or his credibility (i.e. it is an out-of-court statement). More importantly, because a hearsay statement is not made in court, there is no opportunity to test its reliability or trustworthiness. The adversarial system assumes that the ideal means of testing evidence is cross-examination in court by opposing counsel.76 The HRA is explained in flowchart form in Figures 1 through 6. These figures follow the system through its consultation with the user showing the questions asked and the conclusions drawn. Figure 1 shows an overview of the whole system; Figures 2 through 6 show each part of the system in more detail. The internal structure of the system is explained diagrammatically in Appendix A.
Appendix B contains the knowledge base for the system. The first part of the HRA (Fig. 2) deals with the definition of hearsay and answers the question: "Is this statement hearsay or not?" As can be seen from the definition quoted above, the purpose for which the statement is to be used in court determines the answer to this question. In other words, the statement may be used for one or more of three purposes: as proof of the truth of its contents, as proof that it was made, or as proof of a fact that can be implied from its contents. This issue is central to the legal concept of relevance. Evidence which is logically probative is relevant. Relevance of a piece of evidence also involves its relationship to other evidence in the case. In addition, materiality may be included within a concept of relevance.77

Fig. 1. Overview of the Hearsay Rule Advisor [flowchart]

Fig. 2. Preliminary Questions and Definition of Hearsay [flowchart]

Fig. 3. Dying Declarations Exception [flowchart]
Fig. 4. Declarations Against Interest Exception [flowchart]

Fig. 5. Business Documents Exception [flowchart]

Fig. 6. Declarations in Course of Duty Exception [flowchart]
I  CONCLUDE Event = Declarations in Course of Duty ^„ [If evidence is written then use Perceive = Business Documents otherwise] Declarant had personal knowledge? I Statement made as regular part of job? I Declarant had experience? Others relied on statement? . i , CONCLUDE | Perceive g Declarations in Course of Duty j 7 Record made contemporaneously with matters reported? i CONCLUDE I Believe = Declarations In Course of Duty No evidence to show tampering in record? I No motive to misrepresent? +  CONCLUDE I Intend = Declarations In Course of Duty CONCLUDEi Exception = Declarations in Course of Duty This part of the system simply uses the three purposes set out in the preceding paragraph as questions for the user to determine the use intended for the evidence (these are the three questions marked with * in Fig. 2). It seems relatively straight forward. Nevertheless, the dividing line between using evidence as proof of the truth of its contents, and using it only to show that it was made, is controversial and elusive. For this reason, the system includes examples of situations where a statement is used only to show that it was made. The examples may be perused if the user wishes. A further problem exists with determining whether evidence is to be used for a fact that can be inferred from it. Whether or not this is a hearsay use is undecided. It has become clear that this part of the system will never be satisfactory until a deep structure analysis of relevance is carried out. The definition of hearsay turns out not to be a discrete area of the law but leads directly and unavoidably into relevance. Thus, work on this part of the system has been abandoned in favour of concentrating on the exceptions. The second part of the HRA (Figs. 3-6) deals with those situations in which evidence which is to be used for a hearsay purpose is nevertheless admissible. 
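The three starred questions reduce to a single test, which can be sketched as follows. This is a Python illustration only; the HRA itself is expressed as M.1 rules, and the function and parameter names here are invented:

```python
# Illustrative sketch only -- the HRA is written in M.1 rules, not Python,
# and these names are invented for the illustration.
def rule_applies(for_truth_of_contents, only_to_show_it_was_made, for_implied_fact):
    """Decide whether the intended use of the statement is a hearsay use.

    Use merely to show the statement was made is a non-hearsay purpose;
    use for the truth of the contents is a hearsay purpose. Use for a
    fact implied from the statement is treated as hearsay here, although
    the case law leaves that point undecided.
    """
    if only_to_show_it_was_made and not (for_truth_of_contents or for_implied_fact):
        return False  # rule does not apply
    return for_truth_of_contents or for_implied_fact
```

For instance, "My husband shot me," offered to prove that the husband fired the shot, is used for the truth of its contents, so `rule_applies(True, False, False)` returns `True` and the consultation proceeds to the exceptions.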
Reliability is the key to the examination of these exceptions, an approach accepted by other authorities as well.78 First, however, an older, also popular, approach should be noted. Wigmore determined that two principles governed the admissibility of these exceptional hearsay declarations, namely, necessity and a circumstantial guarantee of trustworthiness. According to Wigmore, a circumstantial guarantee of trustworthiness is the practical equivalent of cross-examination, that is,

... the probability of accuracy and trustworthiness of statement is practically sufficient, if not equivalent to that of statements tested in the conventional manner.79

Wigmore goes on to find that there are three reasons why the exceptional circumstances should be sufficient substitutes:

a. Where the circumstances are such that a sincere and accurate statement would naturally be uttered, and no plan of falsification be formed;
b. Where, even though a desire to falsify might present itself, other considerations such as the danger of easy detection or the fear of punishment would probably counteract its force;
c. Where the statement was made under such conditions of publicity that an error, if it had occurred, would probably have been detected and corrected.80

These three reasons all suggest that the main concern is that the declarant had a motive to tell the truth or no opportunity to lie. In other words, the declarant was sincere. Wigmore's first principle - necessity - "implies that since we shall lose the benefit of the evidence entirely unless we accept it untested, there is thus a greater or less necessity for receiving it."81 Of course, where a declarant is dead or unavailable, this necessity principle would come into play. It is also involved where the original hearsay evidence is for some reason deemed more valuable than having it repeated later on the stand (for example, spontaneous declarations and business documents).
In these cases the necessity is less; perhaps it is really efficiency or convenience. What Wigmore fails to emphasize is that the necessity for using this evidence arises because there is no other evidence, or none better; in other words, the hearsay evidence is the best evidence available.82 This is evident in some of the passages that Wigmore himself cites from cases to support his two principles.83 Wigmore's second principle is roughly equivalent to a reliability analysis. His first principle applies where the unavailability of the declarant is required for an exception to apply (e.g. dying declarations). However, the necessity principle as a best evidence principle can keep reliable evidence out. Analysis of the case law shows that the vast majority of dying declarations cases can be explained using solely the reliability analysis, but a small handful of cases that exclude hearsay evidence that seems reliable remains. Some of these raise an implication that there was better evidence already before the judge when the hearsay statement came up for consideration. For example, in one case, a dying declaration which certainly should have been admissible on a reliability analysis was kept out, but the accused was convicted anyway, thus raising the probability that there was other, better evidence already admitted before the dying declaration was submitted.84 It is hard to test a best evidence principle in the decided cases because it requires knowledge of the other evidence in the case, something that is rarely available in the case reports. Nevertheless, it is a simple matter to ask the user about it, and it can be tied into the system by implications drawn in decided cases where evidence is kept out but the decision goes in favour of the side which wanted it admitted anyway. The goal of the rule against hearsay evidence is truth.
Normally, hearsay evidence is excluded because it would have a negative value in discovering the truth, that is, excluding it has a positive value for truthfinding. However, in exceptional circumstances, the evidence has a positive value greater than the value of excluding it. These are circumstances where it is the only evidence, or the best evidence, available and it passes a certain threshold test for reliability. Thus, in questionable situations, the choice of admission or exclusion of the evidence is made so as to maximize the chances of finding the truth. Since the rule and its exceptions exist to maximize truthfinding, this goal is now built into the HRA. Consideration of the effect of a decision on truthfinding will help solve hypothetical cases used to test the system. More importantly, it will allow anticipation of the development of new exceptions by facilitating inclusion of a basket exception to take care of situations which do not fit the traditional exceptions but which have an appearance of reliability. Examples of this kind of basket exception are found in the United States Federal Rules of Evidence.85 Section 804(b) reads as follows:

(b) Hearsay exceptions. The following are not excluded by the hearsay rule if the declarant is unavailable as a witness:

(5) Other exceptions. A statement not specifically covered by any of the foregoing exceptions but having equivalent circumstantial guarantees of trustworthiness, if the court determines that (A) the statement is offered as evidence of a material fact; (B) the statement is more probative on the point for which it is offered than any other evidence which the proponent can procure through reasonable efforts; and (C) the general purposes of these rules and the interests of justice will best be served by admission of the statement into evidence.
However, a statement may not be admitted under this exception unless the proponent of it makes known to the adverse party sufficiently in advance of the trial or hearing to provide the adverse party with a fair opportunity to prepare to meet it, his intention to offer the statement and the particulars of it, including the name and address of the declarant.

Hearsay evidence (in fact any testimonial evidence) can have one or more of four infirmities:

1. The witness failed to correctly perceive what she or he reported.
2. The witness did not correctly remember or believe what she or he saw.
3. The witness did not intend to correctly report it.
4. The words the witness used are ambiguous in meaning.86

When the witness is in court, cross-examination will expose these infirmities. However, in a hearsay situation the declarant is not in court, so there is no direct way to test the evidence. Nevertheless, in certain circumstances the law has accepted hearsay statements. In such cases, it is because there is evidence to show:

1. That the declarant correctly perceived the incident reported in the hearsay statement (perceive),
2. That she or he accurately believed and remembered what she or he saw at the time of communicating the belief (believe),
3. That she or he intended to communicate that belief honestly when making the hearsay declaration (intend).

Ambiguity must be omitted because there is no means of finding out what the exact words of the statement were, since natural language processing is not available. Where perception, belief and intention to tell the truth are all present, the evidence meets a threshold test of reliability. Once admitted, further assessment of how reliable it actually is goes to weight, and is for the trier of fact to determine.87 Case analysis confirms that the reported facts can be viewed as going to one or more of perception, belief or intention. Perception and belief each tend to involve only one major question.
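The reduction from four infirmities to three testable conditions amounts to a simple conjunction. A minimal sketch in Python (for illustration only; the actual M.1 knowledge base groups many case-specific questions under each condition):

```python
# Minimal sketch: the threshold test of reliability is the conjunction of
# the three conditions. Ambiguity is omitted, as in the HRA, because the
# declarant's exact words cannot be analysed without natural language
# processing.
def meets_threshold(perceive, believe, intend):
    # perceive: the declarant correctly perceived the incident reported
    # believe:  she or he accurately believed and remembered it when speaking
    # intend:   she or he intended to communicate that belief honestly
    return perceive and believe and intend
```

Passing the threshold goes only to admissibility; how reliable the evidence actually is remains a question of weight for the trier of fact.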
Perception depends mainly on whether the declarant had personal knowledge of the matters contained in the statement. Belief depends mainly on whether the declarant made the statement during the time when the circumstances set out in the statement existed. Most facts selected by the court as significant go towards intention. The majority of inadmissible cases are such because of a lack of intention. Rarely is a hearsay statement inadmissible because of problems with perception or belief. This confirms Wigmore's emphasis on motive to tell the truth or lack of opportunity to lie and the analysis based on Cross which showed the importance of the subjective element. The approach of the HRA is to take the exceptions as paradigmatic cases of reliability. The user is first questioned to find out which of the paradigms his case most closely matches. In other words, the event that caused the hearsay declaration is determined. "Event" refers to the set of circumstances surrounding the making of the statement. This may be either an occurrence that triggers the making of the statement (e.g. dying declarations), or an existing state of affairs (e.g. declarations against interest or in course of duty). Once the event is determined, the user is presented with further facts, tailored to fit the event, which indicate that the conditions of perception, belief and intention are satisfied. The consultation must be keyed to the event in order for the system to appear intelligent. In other words, having determined that the event is a dying declaration, it makes no sense to ask a question from the business documents example such as: "Did the declarant have a duty to record the information in the statement?" Keying the questions to the event also allows an assessment of the whole story for its credibility.88 From a larger perspective, the problem is simply one of classification.
The exceptions are set out in a classification scheme, and the user's case is fitted into the scheme under the appropriate exception. Thus, the knowledge in the system consists of a set of potential facts the user may have. As questions are answered, the potential set of facts is narrowed to the actual set. These are linked together to allow the system to draw conclusions as to perception, belief and intention. When all three are found to exist the system gives the user a conclusion of admissibility. As can be seen from Figures 3 through 6, the system draws conclusions as to event, perceive, believe and intend in that order. In some instances certain questions help to satisfy more than one condition. For example, a single question from the event portion of the dying declarations exception also satisfies the belief condition (Fig. 3). The declarations against interest exception (Fig. 4) has more of these multi-use questions. In other cases, the same questions will help satisfy the same conditions across different exceptions. For example, all the exceptions require that the hearsay statement reflect facts rather than opinion, and that the declarant would have been competent to testify if called to the stand. The business documents exception (Fig. 5) and the declarations in course of duty exception (Fig. 6) show considerable overlap when the hearsay statement involves a written record. If the statement is written, the system will first consult the business documents exception. If it can satisfy perceive but fails on believe or intend, and if the declarant is not available to testify, it will go on to consider the declarations in course of duty exception. Perceive will have already been satisfied so the system will skip straight to a consideration of believe. If the evidence is not written the business documents exception will not apply but the declarations in course of duty exception may be useful. 
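The fallback between the two written-evidence exceptions can be sketched as follows (a Python illustration with invented dictionary keys; the HRA itself expresses this flow in M.1 rules):

```python
# Illustrative sketch of the overlap between the two written-evidence
# exceptions; the dictionary keys are invented for the illustration.
def written_evidence_exception(answers):
    # Try the business documents exception first.
    if answers["bd_perceive"] and answers["bd_believe"] and answers["bd_intend"]:
        return "business documents"
    # If Perceive held but Believe or Intend failed, and the declarant is
    # unavailable, fall through to declarations in course of duty,
    # reusing Perceive and skipping straight to Believe and Intend.
    if answers["bd_perceive"] and not answers["declarant_available"]:
        if answers["duty_believe"] and answers["duty_intend"]:
            return "declarations in course of duty"
    return None  # dead end: no exception matched
```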
Thus the system is provided with an alternate set of questions to determine whether perceive is satisfied. The more multi-use questions there are in the system, the fewer dead-end paths there will be. For example, as the system presently exists, most of the questions in the dying declarations exception apply only to that exception. Thus, if the user's case fails to match the exception on one of the last questions in the list, a dead end is reached. Most of the questions asked earlier in the consultation generated no information that would be useful in considering another exception. If the user's facts fit one of these traditional exceptions to the hearsay rule, that exception name is saved by the system to be used in accessing a database of cases. If no traditional exception is found but the evidence is reliable, it may be possible to access a traditional exception that is somewhat analogous to the user's facts and recommend arguing for a broadening of this exception or, alternatively, it may be possible to argue for creating a new exception. This would be recommended if the evidence satisfies the threshold level of reliability and thus has a positive value for truthfinding. (This part of the system has not yet been implemented.) The hearsay system does make use of heuristic rules and will make use of more in future. These rules help to classify the user's facts into the examples provided by the traditional exceptions to the hearsay rules. For instance, preliminary questions ask the user whether there was one declarant or more than one, whether the trial is civil or criminal, and whether the evidence is oral, written or conduct. Three of the heuristic rules generated from these questions are as follows:

If there was more than one declarant and the evidence is written then the exception to consider is the business documents exception.
If there was one declarant and the trial is civil and the evidence is written then the exceptions to consider are declarations against interest, business documents, and declarations in course of duty.

If there was one declarant and the trial is criminal and the evidence is oral then the exception to consider is dying declarations.

There are two heuristic rules underlying the whole system which can be stated as follows:

1. Hearsay evidence is not admissible in court unless it is the best evidence available and it passes a test for a minimum level of reliability.
2. Where there is evidence to show that there was perception, belief and intention to tell the truth on the part of the declarant, a conclusion that the hearsay statement is reliable may be drawn.

This reliability analysis promotes flexibility in the system, that is, it makes the system easier to change when the law changes. Grouping facts under headings such as perceive, believe, intend, and event allows for modular design of the system. If one condition for admission of evidence under a particular exception changes, that condition is easily found and need be altered in only a very few places. In addition, visualizing the system as a rather straightforward classification scheme makes it easier to explain to others and, hopefully, easier for them to understand. Classification schemes are not unique to law. There is one problem with this kind of analysis. Although case law may have found that evidence produced in certain circumstances is reliable, some of these circumstances intuitively seem to have nothing to do with reliability. This is most clearly seen with the dying declarations exception. The usual justification presented for believing the declarant told the truth is that dying persons would not wish to go to their Maker with a lie on their lips.89 Therefore it is crucial to show that the particular declarant knew she or he was dying.
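The three classification heuristics quoted above amount to a mapping from the preliminary answers to a list of candidate exceptions, sketched here in Python (only the three quoted rules are encoded; the function and value names are invented):

```python
# Sketch of the three heuristic classification rules quoted above; no
# other rules are encoded, so unmatched combinations return an empty list.
def exceptions_to_consider(declarants, trial, evidence):
    if declarants == "more-than-one" and evidence == "written":
        return ["business documents"]
    if declarants == "one" and trial == "civil" and evidence == "written":
        return ["declarations against interest",
                "business documents",
                "declarations in course of duty"]
    if declarants == "one" and trial == "criminal" and evidence == "oral":
        return ["dying declarations"]
    return []
```

In this sketch, as in the system, the preliminary answers narrow the search before any exception-specific questions are asked.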
However, it has rarely been important in the cases to show that the declarant believed in God and an afterlife.90 Given that most dying declaration cases come from the nineteenth century and very early twentieth century, it may have been a reasonable justification at one time, but it hardly seems sensible now. Criticism has also been levelled at spontaneous declarations. Supposedly the declaration is made so quickly in such traumatic circumstances that the declarant has no time to formulate a lie. However, since the circumstances are traumatic, perhaps the declarant did not perceive what was happening accurately, but simply blurted out the first thing that came to mind whether it actually expressed part of the event or not.91 The common law appears to assume that in the case of a dying declaration there is a connection between a settled, hopeless expectation of death and an intention to tell the truth. However irrational this may seem, there is nothing in the cases to suggest that the assumption is actually incorrect and that there is a deeper, hidden justification for the exception.92 Wigmore suggests that "common sense and experience" have brought the exceptions to light as examples of reliable hearsay evidence.93 Evidently common sense and experience may change with time and modern views may be quite different from nineteenth century beliefs. Nevertheless, the law is slow to catch up. It should be noted that this implementation of a reliability analysis omits consideration of the ambiguity problem. Since natural language processing is not available, it is not possible to assess the declarant's words for their meaning. Wherever possible the HRA avoids referring to the actual words, but, on occasion, the user must be asked for an assessment of the meaning of the statement. This has been a particular problem in the dying declaration cases where the declarant's statements as to a belief that she or he is dying are very important.
The cases often make a fine distinction between the statements "I'm dying" and "I think I am dying" on the basis that the former expresses an absolute belief, while the latter expresses a qualified or less than absolute belief. It is very difficult to design a question so that the user can draw this distinction accurately. This will introduce a source of error in the system.

2. Sample Consultations

a) Example - definition and dying declaration

The following fact situation is used in the sample consultation that follows:94

The accused, Mr. M, is charged with the murder of his wife. A neighbour heard a shot in the M house and went to investigate. She found Mrs. M by the swimming pool. Mrs. M said to the neighbour, "Get the ambulance quick. I'm dying. My husband shot me." The neighbour called the ambulance. Mrs. M died on the way to the hospital. There was only a single bullet wound and little external bleeding but the neighbour will testify that she believed Mrs. M was dying.

In the following consultation the computer's questions are labelled with 'Q' and a number. The user's answers are capitalized.

Q1 Is this evidence a statement by a party (including vicarious statements) offered by an adverse party or a statement by the accused offered by the Crown?
yes no
NO

Q2 Was the statement made by one person, more than one person or is the maker of the statement unknown?
1. one-person 2. more-than-one-person 3. maker-unknown
1. ONE-PERSON

Q3 Is the person who made the statement a woman or a man?
1. she 2. he
1. SHE

Q4 What name should I use to refer to this person?
"Mrs. M"

Q5 Would you like me to help you decide whether or not you have a hearsay problem? If not, I will assume that you do have a hearsay problem and will proceed to examine the exceptions to the hearsay rule.
yes no
YES

Please think about the proposition you want to establish in your case. Consider what it is that you want to prove.

Q6 What type of evidence do you have?
1. oral assertion 2. written or recorded assertion including computer printouts and tape recordings 3. assertive conduct 4. non-assertive oral, written or recorded statement or conduct
1. ORAL ASSERTION

Q7 Are you able to produce Mrs. M in court to testify?
yes no
NO

Recall the proposition you are trying to establish and please keep it in mind when you answer the next questions.

Q8 To prove the proposition, do you wish to use the statement to show that it was made (a non-hearsay purpose)? (Select the 3rd option to see examples of cases where evidence that would otherwise be hearsay is admissible for this non-hearsay purpose.)
1. yes 2. no 3. examples
NO

Q9 To prove the proposition, do you need to use the statement for the truth of its contents?
yes no
YES

Q10 As proof of your proposition, do you need to use the statement for an inference that can be drawn from it?
yes no
NO

I conclude that you wish to use your evidence for a hearsay purpose.

A hearsay statement is a statement offered in evidence to prove the truth of the matter asserted but made otherwise than in testimony at the proceeding in which it is offered. — Uniform Evidence Act, Report of the Federal-Provincial Task Force on Uniform Rules of Evidence, 1982.

Q11 Earlier in this consultation, you told me that Mrs. M is not available to testify in court. Why is she unavailable?
1. dead 2. other reason
1. DEAD

Q12 What type of proceedings are you involved in?
1. civil 2. criminal
2. CRIMINAL

Q13 Was the death of Mrs. M the subject of the charge?
yes no
YES

Q14 Was the statement made after the injuries occurred?
yes no
YES

Q15 Does the statement identify the person who caused Mrs. M's death?
yes no
YES

Q16 If Mrs. M were called in court to testify would she be competent to testify?
yes no
YES

Q17 Does the statement reflect facts (as opposed to being opinion)?
yes no
YES

Q18 Is there anything to suggest that Mrs. M might not actually have seen what happened (e.g. she was shot in the back)?
yes no
NO

Q19 Into which category would you place the injuries:
1. fatal wound(s) with heavy external bleeding 2. fatal wound(s) with little external bleeding 3. all other injuries - e.g. drowning, poisoning, wounds from a fist fight, complications from an illegal operation
2. FATAL WOUND WITH LITTLE EXTERNAL BLEEDING

Q20 Did Mrs. M believe she was dying?
yes no
YES

Q21 Did Mrs. M qualify this belief in any way so as to indicate that it was not an absolute belief? For example, she said, "I think I am dying."
yes no
NO

Q22 Did a doctor tell her that she was dying?
yes no
NO

Q23 Did people around her believe she was dying?
yes no
YES

These last questions have helped me decide that the declarant had a settled, hopeless expectation of death when the statement was made. Case law requires that this be shown in order for the dying declaration exception to apply. I have determined that the dying declaration exception will apply in your case.

These questions still require some refinement in their wording. Also, the conclusions will be made more detailed and elaborate. As well, on these facts the system would go on to also consider the spontaneous declarations exception. The final step is to add case authority for the decision. This will be cases that match the user's facts and cases that have similar facts but reached the opposite conclusion.

b) Example - business documents

As an additional example, consider the records produced in hospitals with respect to patients. Nurses' written notes in one of these records were the hearsay evidence to be admitted in the case of Ares v. Venner.95 This was a malpractice suit. The nurses' notes recorded symptoms which should have alerted the defendant doctor to complications in his treatment of the plaintiff. All of the nurses who made notes could have been called to testify. The following consultation shows how the HRA would handle this problem.

Q1 Is this evidence a statement by a party (including vicarious statements) offered by an adverse party or a statement by the accused offered by the Crown?
yes no
NO

Q2 Was the statement made by one person, more than one person or is the maker of the statement unknown?
1. one-person 2. more-than-one-person 3. maker-unknown
2. MORE-THAN-ONE-PERSON

From this point on I shall refer to the makers of the evidence as 'the declarants.'

Q3 Would you like me to help you decide whether or not you have a hearsay problem? If not, I will assume that you do have a hearsay problem and will proceed to examine the exceptions to the hearsay rule.
yes no
NO

The next step is to determine whether an exception to the hearsay rule will apply in your case and allow your evidence to be admitted.

Q4 What type of evidence do you have?
1. oral 2. written or recorded including computer printouts and tape recordings 3. conduct
2. WRITTEN

Q5 Are you able to produce the declarants in court to testify?
yes no
YES

Q6 What type of proceedings are you involved in?
1. civil 2. criminal
1. CIVIL

Q7 Was the statement made as a normal part of the function of the declarants in the business organization with which they are connected?
yes no
YES

Q8 Was it part of the declarants' functions in the organization to record the information in the statement?
yes no
YES

Q9 Did the declarants themselves actually perform or observe the actions reported in the statement?
yes no
YES

Q10 Were the declarants and any other persons who passed on information competent to testify?
yes no
YES

Q11 Was the statement relied upon in the day to day affairs of this (or another) business?
yes no
YES

Q12 Was the report made contemporaneously with the things reported?
yes no
YES

Q13 Was the statement made before the controversy arose?
yes no
YES

Q14 Does the action you wish to use this evidence for fall under federal or provincial (B.C.) jurisdiction?
1. federal 2. provincial
2. PROVINCIAL

I have decided that your evidence is admissible under the business documents exception as implemented by section 48 of the British Columbia Evidence Act.

Q15 Were the declarants simply compiling the record for regular business purposes or were they actually conducting an investigation into historical facts?
1. business purposes 2. investigation
1. BUSINESS PURPOSES

I have decided that your evidence is also admissible under the common law business documents exception as set out in Ares v. Venner.

Chapter 4 SELECTED ASPECTS OF EXPERT SYSTEMS IN LAW

1. Nervous Shock Advisor and Hearsay Rule Advisor Compared

The NSA and the HRA both take a fact-oriented, deep structure approach to their respective domains.96 This demonstrates that the method works well for case-based law and applies across different legal domains. Beyond this, however, there are significant differences between the two systems in details. With respect to theory, the domain of the NSA is an area in which there are conflicting legal rules, namely, remoteness in negligence law. The domain of the HRA, however, involves no such conflict. In addition, the goals of the law of nervous shock, and torts law in general, are the goals of prevention of harm and maximizing of personal freedom. In hard cases and hypotheticals, these goals need to be ordered and their priorities determined. In contrast, the HRA only involves the one goal of truth. Certainly, these differences must have influenced the discussions between the knowledge engineer and the domain expert. As well, the difference in goals is now built into the systems. Considering deep structure, "deep" is much closer to the surface for the HRA than for the NSA. That is, commentators and judges are willing to mention reliability in hearsay law. In the law of nervous shock, discussion tends to concern not deep structure, but remoteness. One author does suggest that the hearsay rule has a deeper structure yet.97
Nevertheless, the work on the hearsay cases did not reveal any such structure that might better explain the decisions. The systems also differ somewhat in their appearance to the user. The HRA appears to be asking about much more detailed facts than the NSA. This is natural because the NSA covers, if not the whole trial, then at least a major issue. The HRA only deals with one single piece of evidence out of all the evidence that is used to establish the elements of the cause of action in, for example, nervous shock. Since natural language processing is not available, the HRA faces certain problems which do not confront the NSA. The NSA concerns actions and relationships. However, the HRA is concerned with statements and words and these are far more difficult to characterize. For example, the NSA asks questions such as: "Did your client see the other person at hospital before the injuries were treated or the dead body cleaned up?"98 In contrast, examples of the kind of questions which cause problems for the HRA are the following: "Did the declarant believe she was dying?" and "Did she qualify this belief in any way so as to indicate that it was not an absolute belief?" Both these questions rely on the user for an evaluation of the meaning of the declarant's statement. This introduces a source of error that the NSA does not have to deal with. The form of conclusions for these two systems is quite different. For example, the NSA answers the question: "Does this user have a cause of action in nervous shock?" For the law to recognize a cause of action, the plaintiff must establish certain elements of her or his case. Thus it is reasonable and helpful for the NSA to present its conclusions in terms of a breakdown of the elements of the cause of action. The HRA answers the question: "Is this piece of evidence admissible in court?" 
Therefore, it seems most useful for the program to present its conclusions in terms of a summary of the conditions which must be met for the evidence to be admissible, together with an indication of which ones were met on the user's facts and which were missing (if any).99 Beyond these differences the two systems appear similar because they both use M.1. Thus, both interact with the user in the same question and answer format and provide the same kind of explanations for their reasoning. Both rely on if-then rules and back-chaining. The structure of both can be represented as a tree-like diagram. The differences between the NSA and HRA are a direct result of the differences between the law of torts and evidence law. Otherwise, the same theoretical approach and methodology have been successful for both systems.

2. Legal Expert Systems and Legal Theory

Given the basic problem that legal theorists all disagree on the nature and description of the law, one may adopt either of two attitudes to its solution. One approach is simply to choose a particular theory and implement that. As mentioned earlier in this paper, to be suitable as a basis for a legal expert system, that theory must provide a rule-governed basis for predicting the outcomes of all the cases that the system is to address. This is the approach taken in developing both the NSA and the HRA. In the alternative, one may search for the points on which most of the legal theorists agree and build a theory on which to base an expert system out of those points. This theory also must provide for solution of cases in a rule-governed way. This is the approach taken by the Oxford Project in Scottish Divorce Law. Susskind decides that the shared view among legal theorists is that clear cases are rule-governed. Solution of hard cases becomes the contentious issue. According to Susskind ...
[T]heorists do seem to agree on the forms of legal argument that are both possible and desirable in the clearest of cases, although this unanimity may not be apparent from the literature because 'hard cases' and not 'crystal clear cases' have invariably been jurists' object of study. If there is such a concurrence of approach in relation to legal reasoning as well as to legal theory in general, then it is a model culled from that harmony that should be implemented in expert systems in law. If there is not, and if these conflicts affect the expert system enterprise, then a model that clashes as little as possible with the widely accepted theories should be developed.100

Elsewhere he states that an expert system based on a contentious legal theory may be, for that reason alone, unacceptable.101

Having decided that 'clear cases are rule-governed' is the consensus, Susskind then goes on to develop the form of rules that should be used in the expert system. For the Oxford Project, these appear as legal rules derived from statutes and cases. They are what Hart has called "primary rules of obligation."102 As is usual in positivist theory, the descriptions of these rules do not include their goals.103

The first thing to point out is that Susskind's consensus with respect to clear cases neglects the position of many legal realists. Perhaps no such consensus really exists. Note, however, that Susskind specifies that the vast majority of the jurisprudential writing he surveyed was made up of British writings of the last 25 to 30 years.104 This sampling is biased towards positivism. Secondly, a system based on this consensus would be of little practical use. There are few legal domains which are so complex for the average practitioner that she or he would need expert help to work on a clear case. This is especially true of the law of evidence since all lawyers are familiar with evidentiary rules.
Therefore, a system dealing with any part of the law of evidence will make no contribution unless it can solve hard cases.105 Thirdly, by accepting that an expert system cannot solve hard cases, one must implicitly accept that hard cases are not decided in a rule-governed way. This is itself a contentious theory. Fourthly, having decided that the consensus points to rules deciding clear cases, the form of legal rules implemented in the Oxford Project is itself controversial since it ignores goals. Not all legal theorists would agree with that position.106 Lastly, an expert system which implements a contentious theory and purports to solve hard cases is not, per se, unacceptable. If the system can account for most or all of the decided cases, that is some proof of the value of the underlying theory. Essentially, if an expert system based on a contentious theory works, then some contribution is made to our understanding of legal reasoning. Great discoveries are never made by following the tried and true and avoiding the contentious. As much as jurisprudence can contribute to the development of legal expert systems, expert systems can contribute to jurisprudence.

According to Susskind, it is what he calls law-statements which should form the core of an expert system.107 Law-statements are single sentences or groups of sentences which state what the law is. They are derived from law-formulations found in printed case reports and legislation. "Law-formulations" refers to what would be familiar to lawyers as primary legal materials.108 Thus a law-statement would be a summary of a statutory provision, or it would be the ratio decidendi of a leading precedent or a summary of the ratios of a group of cases. For example, the statute law-formulation:

In an action for divorce the court may grant decree of divorce if, but only if, it is established in accordance with the following provisions of this Act that the marriage has broken down irretrievably.
becomes the statute law-statement:

If and only if it is established that the marriage has broken down irretrievably, then the court may grant decree of divorce.109 (original emphasis)

The latter rule is in an appropriate form for processing by an inference engine in an expert system. This approach is what has been characterized above as the doctrine-based approach. That is, the rules which end up in the system are recognizable as legal rules. The system focusses on the law and, overall, tends to explain what is the law. The first part of the HRA, dealing with the definition of hearsay, also takes the doctrine-based approach. It uses a proposed legal definition of hearsay and the legal concept of relevance.

Some problems with the doctrine-based approach have been referred to earlier but it is worth setting them out in a more detailed fashion. The major problem is that such a system cannot solve hard cases. A case is hard because "... [it] doesn't fall under an existing rule of law, or [it] appears to fall under two rules, the application of which would lead to differing or opposing solutions one from the other, or [it] falls clearly under a rule of law, the application of which would produce an irrational result."110 Thus, in a hard case, legal rules are of little use in predicting the outcome.

One might go some way towards resolving these problems by providing the system with meta-rules which would control the order in which the legal rules are consulted and applied. However, that would be of no use where the problem is due to a lack of an applicable legal rule. Further, where two legal rules apply and appear to dictate different outcomes, the decision ultimately made by the judge is very much dependent on the facts of the individual case. It may not be possible to predict accurately in advance of knowing any facts which rule should govern.
Finally, meta-rules would be of no use where the application of the legal rule leads to an irrational result unless the system can be provided with some means of evaluating whether the result is or is not irrational.

Susskind rightly recognizes that his own (doctrine-based) approach cannot solve hard cases.111 In addition, he is correct to point out that a legal domain which contains conflicting legal rules is unsuitable for an expert system of the type he describes.112 This form of legal rule consistently ignores the goal or purpose of the rule. Susskind completely rejects the notion that the purpose of the rule is of any use to an expert system. This is because he feels it would involve finding out the intentions of the makers of the legal rule and also sociological inquiries which could be very time-consuming and expensive.113 Nevertheless, according to Summers, at least three arguments favour the teleological description of legal rules:

1) The law usually cannot be understood without knowing its goals. Conversely, goals by themselves usually have little meaning until a means exists to implement them.

2) Descriptions which refer to goals "are more faithful to the reality of law as a human artifact ... We make laws to accomplish ends. Thus, if a form of law is so ill adapted to its goals that it cannot reliably serve them, we are inclined at least to say it is much less a form of law and, sometimes even to say it is not law at all. ... [W]hen we think of law we ordinarily have in mind something that is at least minimally effective for its purpose."

3) The method of interpretation of the law effectively implements its goals. "In general, the language of a legal precept cannot be appropriately interpreted if divorced from all conceptions of the goals that this precept is to serve."114

Coval and Smith have developed a theory of rights which includes a thesis that legal goals and their ordering can be inferred from the law.
They also cite case authority that recognizes this thesis.115 Thus goals can be considered highly relevant in describing legal rules.

There is a certain practical problem with a system based on legal rules. Although statutory provisions do provide authoritative wording of their rules, cases do not. Discovering the ratio decidendi of a case can be quite difficult.116 Nevertheless, in most cases there is little controversy.117

A further problem with the doctrine-based approach is that it will produce a system containing several layers of interpretation. Every legal rule must be interpreted in progressively more detail until it is clear enough to be applied to facts. For example, consider the legal rule given above: "If and only if it is established that the marriage has broken down irretrievably, then the court may grant decree of divorce." This will be the top-level rule in the expert system. There must be a second-level rule which tells the system when a marriage has broken down irretrievably. This provides several circumstances such as adultery. Now a third level must be present to explain what the law considers adultery to be. More levels may be necessary to develop this concept. Ultimately, the last level must contain the information about specific facts.118 This approach produces a potentially very large system. As well, for any given set of facts, the system may have to progress through every level. Even with heuristic meta-rules to limit the search, the system produced will be huge, cumbersome and inefficient.

The contrasting approach to the doctrine-based approach is to consider the problem from the point of view of facts. Such a system would not explain what is the law. Instead it would determine the facts of the particular case and then decide the legal consequences that flow from those facts by comparing them to the decided cases.
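The comparison of a user's facts to the decided cases can be pictured as a simple feature-matching sketch. This is an illustration only: the cases, feature names, and matching rule below are invented for the example and are much cruder than the classification the HRA actually performs.

```python
# Illustrative sketch: classify a fact situation by finding the decided
# cases that share the most features with it. Cases and features are
# hypothetical, not drawn from the HRA's database.

CASES = [
    {"name": "Case A", "available": False, "written": False, "criminal": True,  "admitted": True},
    {"name": "Case B", "available": False, "written": False, "criminal": True,  "admitted": False},
    {"name": "Case C", "available": True,  "written": True,  "criminal": False, "admitted": True},
]

FEATURES = ["available", "written", "criminal"]

def closest_cases(facts):
    """Return the subset of decided cases sharing the most features with
    the user's facts; their outcomes suggest the likely consequence."""
    def overlap(case):
        return sum(case[f] == facts[f] for f in FEATURES)
    best = max(overlap(c) for c in CASES)
    return [c for c in CASES if overlap(c) == best]

matches = closest_cases({"available": False, "written": False, "criminal": True})
print([c["name"] for c in matches])  # → ['Case A', 'Case B']
```

The set of all cases is here progressively narrowed to the smallest subset matching the user's facts, which is the classification process the following paragraphs describe; the goals of the legal domain would, in a real system, determine which features count as relevant similarities.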
At the heart of such a system is the doctrine of precedent, which can be stated as the following modus ponens rule:

If it is the case that in any judgment made in regard to a particular situation, that a particular person is or is not legally obligated to do a particular act, then it logically follows that anyone in a relevantly similar situation is or is not legally obligated to do the same act. Thus each case or judgment which holds that a particular person has a legal obligation to do a particular act, instances a rule of law which applies to all other persons in a relevantly similar situation.119

In other words, like cases are decided alike. Since like cases are decided alike, the problem becomes one of deciding when cases are alike. This can be viewed as a classification problem. That is, all the previously decided cases make up a set which can be progressively broken down into smaller subsets. The user's facts can be matched against the subsets to find the cases to which they are closest. The legal consequences of the user's facts are then the results in the cases in the smallest subset containing those facts. The goals or purposes of the law in question help provide criteria for determining when cases are similar to each other, that is, when they are relevantly like each other.120

For the HRA, the total set of cases is the set of all cases dealing with exceptions to the hearsay rule. This set can be broken down into smaller subsets depending on how many declarants there were, whether the declarants are available to testify or not, whether the statement is oral or written, and whether the action is civil or criminal. The smallest subsets are the sets made up of the individual exceptions to the rule, which further truthfinding. To some extent, this bears a resemblance to the approach taken by L.
Thorne McCarty in his TAXMAN project.121 It is more similar to the approach taken by Hafner, who used a classification system to deal with the law of negotiable instruments.122 This approach clearly emphasizes the facts of the decided cases. In the hearsay cases, all the facts that have to do with the evidence in question have been considered for the case analysis. It has even been necessary in one case to go behind the judgment to the record in search of facts. This action is controversial. At least with respect to finding the ratio decidendi of a case, one author has argued that it is only the facts which the judge considered material that should be used.123 It has also been pointed out that the judge makes a selection of a set of material facts out of a myriad of possible combinations.124 Nevertheless, with hearsay cases it has proved to be impossible to reconcile the decisions unless all available facts are considered. It may even be necessary to draw reasonable inferences as to other facts (as, for instance, the implication that there was other evidence in the case on the same issue). Since, for purposes of expert systems, all facts must be regarded as important in order to explain the majority of hearsay cases, perhaps judges are influenced by more facts than they emphasize in their judgments.

L. Thorne McCarty has argued that "the most critical task in the development of an intelligent legal information system, either for document retrieval or for expert advice, is the construction of a conceptual model of the relevant legal domain."126 Unfortunately, he does not define exactly what he means by this, but chooses instead to use his own TAXMAN system as an example of a deep conceptual model in a legal domain.126 The word "conceptual" is not very useful. There are strictly legal (doctrinal) concepts as well as other kinds. A legal concept may be no more suitable for incorporation into an expert system than a legal rule.
For example, the definition of hearsay requires an analysis of relevance. The legal concept of relevance can be used in an expert system but it does not produce a system that is really an expert. Such a system cannot solve hard cases. Relevance must be further developed into a deep structure, fact-oriented analysis.

Ideally, the doctrine-based approach and the fact-oriented approach should yield the same result for a given set of facts in clear cases. However, the former cannot solve hard cases. Further, unless a statutory area of law is very small and self-contained, the doctrine-based approach will yield a huge and cumbersome system that will take a very long time to build. The fact-oriented approach is especially suitable for case law, which may show two conflicting legal rules being used by two distinct lines of cases. Added to this is the frequent lack of clear statement of any legal rule, together with the problem of determining the ratio of a case. The doctrine-based approach is impossible in this situation.

The purpose of these two approaches is quite different. The doctrine-based approach yields a system which primarily explains or teaches the law. It requires legal expertise to build it, but this is a widely applicable kind of legal expertise. It involves expertise in finding the ratio decidendi of a case and in statutory interpretation. It could also require expertise in legal reasoning and jurisprudence. By contrast, the fact-oriented approach is probably less useful as a teacher, as its aim is to give a specific answer supported by case authority. Building it involves highly specialized legal expertise. The former system performs essentially the same function as some teachers of law. It says "The law of x is y". The latter system performs the function of the experienced lawyer who responds to facts given by an inexperienced friend with: "It sounds to me like you've got a case of x."
The latter system, if it supports its opinion with case authority, is by far the better research tool.

Having set out the advantages of the fact-oriented approach to this point, it should be noted that it too is subject to some limitations and qualifications. The main sources of hard cases in the law are open texture of legal concepts and rules and vagueness of legal words. Open texture was introduced to jurisprudence by H. L. A. Hart and his explanation remains the clearest:

If we are to communicate with each other at all, and if, as in the most elementary form of law, we are to express our intentions that a certain type of behaviour be regulated by rules, then the general words we use ... must have some standard instance in which no doubts are felt about its application. There must be a core of settled meaning, but there will be, as well, a penumbra of debatable cases in which words are neither obviously applicable nor obviously ruled out. These cases will each have some features in common with the standard case; they will lack others or be accompanied by features not present in the standard case. ... We may call the problems which arise outside the hard core of standard instances or settled meaning 'problems of the penumbra'; they are always with us whether in relation to such trivial things as the regulation of the use of the public park or in relation to the multidimensional generalities of a constitution.127

MacCormick suggests that vagueness is something over and above open texture and gives the example of the legal term "reasonableness", a term which any lawyer would agree imposes a vague legal standard.128 Even though a legal rule contains no such vague word as reasonableness, it will still have open texture. According to Susskind, these two problems have the same consequences for expert systems and he classifies them together as semantic indeterminacy.129 It can be seen from Hart's description that any rule or concept is beset with open texture problems.
Likewise, any word, whether a legal term or not, may be vague. Vagueness of legal words is something which can be at least partly avoided. For example, if a statute allows an exception to the hearsay rule where a document has been created in the "usual and ordinary course of business," it is preferable to analyse the cases to find a concept or set of facts which can be expressed more precisely and which means the same thing.130 The more precisely formulated concept is much easier to use in an expert system and is probably something that the user can be asked about if necessary.

Nevertheless, vagueness can be found even in words used in an ordinary sense and in the questions which the system asks the user. This problem can be minimized with careful phrasing of questions and use of words. As well, the system will indirectly indicate that there is a problem if a user reaches an incorrect result because of giving a wrong answer to a question that seemed vague. The cases that the system produces in the end as being on point with the user's facts will be clearly wrong. Thus the user will be alerted to a problem. This is rather unsatisfactory; nevertheless, it is the only means available to capture this error.

Open texture is also a problem for expert systems. Both the NSA and the HRA are based on deep structure heuristic rules. Like any legal rule or concept, these also have a core and a penumbra. For example, on the basis of its deep structure rule, the Nervous Shock Advisor will tell the user that a mother who directly witnesses an accident to her child has a cause of action for nervous shock with a certainty factor of 100% (assuming there was negligence by the party causing the accident, there were symptoms of nervous shock and damages can be proved).
In contrast, an uncle in the identical situation has a cause of action with a certainty factor of only 60%.131 Clearly, the mother who sees the accident comes within the core of the rule while the uncle falls in the penumbra.

As another example, if a dying declarant states to a witness "I am dying" or "I shall go" and is actually dying at the time of making the statement, the HRA will conclude that this declarant had the required settled hopeless expectation of death to make the dying declaration exception apply. However, using the same circumstances, if the declarant states "I think I am dying," the system will conclude that it is less likely that the declarant had a settled hopeless expectation of death than in the first case.132 In the latter case, the user's facts fall some distance into the penumbra of the concept "settled, hopeless expectation of death."

Open texture can be dealt with in the development of these systems. The expert must make decisions about facts which fall in the penumbra of the deep structure rule or concept. This is done by analysing all the cases and attaching some form of probability to the possible outcomes. For example, in the dying declarations exception, type 1 injuries together with one or more other indicators that the declarant believed she or he was dying led to admissibility in 11 cases but not in 2 cases.133 Thus where there are type 1 injuries and one other circumstance as the only indications of the declarant's belief, the declaration will be admissible with a certainty factor of 85%. If the sample of cases is large enough it may be possible to do some kind of statistical analysis to generate numbers.134 Alternatively, the expert may be able to assign certainty factors on the basis of her or his knowledge and observation of the cases. The certainty factors can be used to alert the user that her or his case falls within the penumbra of the rule.
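One simple way a certainty factor could be derived from case counts is as the proportion of decided cases in the relevant subset that admitted the evidence, which is consistent with the 85% figure above (11 admissible out of 13 decided cases). The thesis does not state the exact formula, so this proportion-based estimate is an assumption offered only for illustration.

```python
# Sketch of a proportion-based certainty factor. This formula is an
# assumption for illustration; the thesis does not specify how the 85%
# figure was computed.

def certainty_factor(admitted: int, excluded: int) -> int:
    """Percentage of decided cases in this subset admitting the evidence."""
    return round(100 * admitted / (admitted + excluded))

print(certainty_factor(11, 2))  # → 85
```

A statistical analysis over a larger sample, or the expert's own judgment, could replace this naive ratio, as the text notes.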
Further problems of the penumbra are dealt with by posing hypothetical cases to the system and adjusting it to produce the result the expert deems most likely correct. This should be the result that orders the goals of that legal domain in the same way they are ordered for all the cases in the database. For the HRA, the appropriate result would be the one which furthers truthfinding.

Susskind concludes that open texture and vagueness presently preclude the development of expert systems that can solve hard cases.135 However, the NSA and HRA demonstrate that these problems can be minimized using the deep structure approach. This approach focusses on facts and concepts which can be formulated much more precisely than legal standards and rules. Also, certain problems of open texture can be clearly signaled to the user by qualifying the opinion given with certainty factors. Experts do purport to give opinions even in hard cases. They may qualify their opinions and, sometimes, they may even be wrong. They are not expected to be accurate all the time; they are consulted because they are right more often than non-experts.

3. Case Based Law and Statutes

Most of the expert systems in law that have been built to date deal with statutory law and they have seen little success.136 In contrast, the NSA has proved highly successful in case-based law. Thus there may be differences between case-based law and statute that make the former easier to computerize. Hart gives the following illustration to distinguish between legislation on the one hand and case precedent on the other:

One father before going to church says to his son, 'Every man and boy must take off his hat on entering a church.' Another baring his head as he enters the church says, 'Look: this is the right way to behave on such occasions.'137

This very simple example shows that legislation lays down a rule in more or less general language while precedent sets an example for future conduct.
In general, when using a statute in court, the lawyer must argue for a certain specific interpretation of the general rule. On the contrary, when using favourable precedent the lawyer must argue for a more general interpretation of the specific precedent which also includes his own case.138 Another way of looking at this is to consider the approach to case law as being basically experimental while the approach to statute law is essentially deductive.139

Summers, in his book on Fuller's work, finds the following differences between precedent and statute:

1. Statutory texts rarely include justifications or arguments whereas cases do.

2. A great deal of statutory law rests on compromises and is thus not likely to be as principled as case law.

3. Case law is usually retrospective and oriented to facts that have already occurred whereas statutory law often looks mainly to the future.

4. Unlike case law, statutes have explicit authoritative form.140

Other differences to add to this list are:

5. The goals or purposes served by legislation may be quite different than those served by case law.141

6. Statutory law generally undergoes many more extensive changes than case law. This is something to be wary of when choosing to computerize an area of law. The law should be relatively stable and not change significantly over the life of the project since it takes years to build an expert system, not months.

Perhaps it is the fourth item above that has proved to be the major stumbling block in computerizing statute law. Summers and Fuller have more to say about this:

In the common law, a judge is not required, as he is in the interpretation of statutes, to try to adhere even in a general way to the language of a precedent ... A precedent ... is something that has purportedly been justified either by reference to previous cases or by direct appeal to substantive reasons, or both.
The rule applied and the reasons purporting to justify it are ordinarily 'two aspects of a single reasoning process.' The rule is frequently stated by the deciding court more than once and not always in the same terms. In addition, a judge may misstate what he is actually deciding, and may even fail to articulate his reasons clearly. This is a far cry, indeed, from the 'statutory mode,' in which there is a set and singular form of words with which the court must work. Common law is unwritten law. And it follows that nothing approaching the so-called plain-meaning rule, one basic method of statutory interpretation, could even be remotely credible as an approach to the application and interpretation of precedent. The application of a precedent is inherently something that must be argued for.142 (original emphasis)

The approach which is taken to statutes is one of interpretation, and the plain meaning of the words of the statute taken in context appears to be the primary interpretational rule. Added to this, where necessary, is consideration of legislative intention or purpose. As well, judges may have some leeway in interpretation to avoid absurd consequences.143 Thus the authoritative form of a statute leads lawyers to be always concerned with the words of the statute and their interpretation.

In contrast, case law provides no such authoritative wording of any rule. The ratio decidendi of a case presents a major problem of discovering and formulating a rule rather than interpreting words. As has been often observed, the ratio of a case can be a very slippery and elusive creature indeed.144 Perhaps it is the case that these differences encourage most lawyers, positivists or realists, to feel bound by statutory words, but presented with a great deal more flexibility with case law.
This may be why legal expert systems that deal with statutory law tend to reproduce all the legal rules explicitly as they are spelled out in the legislation, whereas the NSA and HRA, dealing exclusively with case law, contain no such explicit statements of legal rules.

The work on the HRA raised an interesting problem of combining statute and case law in a single system. Unlike the Oxford Project, the HRA was not begun with a consideration of a statute, which would constrain a later analysis of case law. Instead it was begun with a consideration of case law. Specifically, the common law exceptions to the hearsay rule were analyzed as examples of a kind of reliability. In due course, it became necessary to consider the statutory business documents exceptions found in s. 30 of the Canada Evidence Act145 and s. 48 of the British Columbia Evidence Act.146 Neither of these sections mentions reliability, but the approach to that point, of necessity, constrained analysis of the statutes. They had to be made to fit within the existing framework. Thus, it was necessary to step away from the strict language of the statute and consider it instead as setting out another example of reliability.

McCarty has taken a similar approach in TAXMAN. That is, TAXMAN captures the user's facts, a particular corporate transaction, and compares them to statutory provisions which are stored in exactly the same words and concepts.147 Thus the statutory provision is treated as if it sets out an example of a corporate transaction, the same way as we have used the statutory business documents exceptions as an example of reliability. Carole Hafner also takes this approach in her work on negotiable instruments.148 One writer maintains that a court's interpretation of statutes is essentially a matter of classification.149 This experience suggests that statutory provisions can be computerized in the same way as case law.
This requires the builders of an expert system to consider that the user's fact situation must be classified into some category of factual situations which the law recognizes. The categories of fact situations can be gleaned from case law fairly easily, and from statutory provisions with somewhat more difficulty. As was noted above, case law proceeds naturally on an example by example basis whereas statutes operate through general provisions. Nevertheless, since there are substantial differences between case law and statutory law, legal domains which have a substantial statutory component may be best treated in ways other than the deep structure approach.

4. Legal Expertise and the Expert

It is clear from Susskind's book that legal expertise exists in at least two different general forms. First there is the expertise of the nature of law and legal reasoning (jurisprudence), the nature of statute and case law, the construction of statutes, and the discovery of the ratio decidendi. This is a general kind of legal expertise, applicable to all areas of law. Secondly, there is expertise in particular areas of law, the legal consequences of actions, and the procedure of obtaining remedies or imposing sanctions.

The first kind of expertise, the generally applicable kind, is what can be clearly seen in the Oxford Project on Scottish Divorce Law. Because it cannot solve hard cases, it cannot be said to be expert in the Scottish Law of Divorce itself. The first kind of expertise has been used in the development of the NSA and HRA, but the systems themselves embody the second kind of expertise, specifically, expertise in the law of negligence or in the law of evidence. Any expert who possesses the latter, specialized expertise will also have some of the former, generalized expertise. The latter presupposes the former.
Susskind recommends that the expert only be involved with the last stage of building the system, the stage he calls "tuning."150 That is, the legal knowledge engineer himself inputs all the basic knowledge and the expert is only brought in to correct mistakes and refine the system. Of course, this is fine if the system being built is to be a faithful copy of the law as it exists in the case reports, statute books, and textbooks. However, if one takes a deep structure approach this will be impossible unless the expert has already developed that approach and written about it clearly enough that a knowledge engineer could take it straight from the book.

In the case of the hearsay expert system, the domain expert has been fully involved from the beginning. She selected certain favoured authors from the many who have written about the hearsay rule, synthesized their work, and added a great deal of her own knowledge to produce the present system. Most of the debriefing time has been spent in intensive discussions of law, the hearsay rule, goals, and purposes. When the HRA is finished, it will be a high-quality, accurate and useful system. This would not have been the case had the law of hearsay as it is laid out in existing textbooks merely been copied into the knowledge base. Without the expert's input, this would have been the only option.

Nevertheless, although it is clearly preferable to have the expert involved from the beginning, this may not be necessary if one is an experienced legal knowledge engineer. In other words, one reason the expert had to be involved so intensely with the HRA was the inexperience of the knowledge engineer. An experienced knowledge engineer could venture into another area without the close supervision of an expert, if that expert preferred to take on the role of fine tuner. Nevertheless, a much larger role is open to the expert and is to be preferred.
Throughout this discussion the expert is always referred to in the singular. The NSA has only one domain expert, as does the hearsay system. The latter has also relied on a second expert, not in hearsay but in jurisprudence and legal theory. Thus one expert system reflects the knowledge of only one domain expert. This expert may use other experts, for instance, writers in the field, but she or he always has the final say. The reason for relying on a single domain expert is that, presumably, experts will rarely agree on many matters. In a case of controversy, it becomes a very difficult decision as to which expert's knowledge should be built into the system. Hoffman suggests that this reasoning assumes that disagreements are pervasive and, further, that the cause of the pervasive disagreement is uncertain expert knowledge. Therefore, "... in a given domain, one obviously runs the risk of generating a system that is wholly idiosyncratic at best and based on trivial knowledge at worst."151 According to Hoffman, such a domain is probably not suitable for expert systems work and the points of disagreement require further research before such work is undertaken.152 After the discussion above on legal theory, one is left with the impression that disagreement in jurisprudence is indeed pervasive. Nevertheless, the deep structure rules on remoteness developed for the NSA can account for over 90% of the decided cases on remoteness of damages, so the NSA cannot be called idiosyncratic.153 Also, since it produces only the cases that are relevant to the user's facts, it performs in much more than a trivial manner.
With regard to the HRA, reliability or intention to tell the truth is well accepted as the requirement for admissibility of hearsay evidence.154 Whatever might be the problems of getting legal theorists to agree on how hard cases should be dealt with, the case analysis and the resulting heuristic rules used by the NSA and HRA are the subject of agreement. Although Hoffman is correct to point out the need for further research in areas of disagreement, the success of the NSA indicates there is no need to wait for these differences to be resolved. Further, there is no suggestion that these differences will be resolved in the near future and, in the meantime, the amount of information lawyers must cope with grows at an alarming rate. Trustworthy, intelligent, automatic legal research tools are needed now. The longer the delay in developing expert systems, the greater will be the problems in creating databases. As mentioned above, the hearsay system has benefited from the expertise in legal theory of Professor J. C. Smith. This suggests the possibility of a team of experts being involved, each dealing with a different aspect of the project. Although it has been of great benefit to the hearsay project, this would not always be the case. For instance, had Richard Susskind been the expert in legal theory rather than Professor Smith, there would have been little chance of coming up with the system as it now exists. In the first place, Susskind would have encouraged the use solely of legal rules without including any consideration of goals. In the second place, there might very well have been an irreconcilable conflict between the legal theory expert and the domain expert. Every expert will approach the domain from some jurisprudential background. A domain expert with a realist orientation would find it impossible to work with the positivist Susskind.
Thus, to use more than one expert in building an expert system, one must ensure that they all have a similar jurisprudential bias. 5. The Knowledge Engineer Deedman suggests that practicing lawyers may be ideally suited to being knowledge engineers because of their skill at interviewing people, including expert witnesses.155 However, when a lawyer interviews an expert witness, the ultimate goal is to use certain testimony by that expert to a client's advantage in a case over which the lawyer has control. Interviewing an expert with the goal of faithfully reproducing her or his knowledge, so it can be used as a reference by other people in the same field, is quite a different thing. In addition, the lawyer as legal knowledge engineer may bring a bias to the process and add to or distort the expert's knowledge. This would be especially true for evidence law, where every lawyer would claim to know something of the domain. Nevertheless, a lawyer may still be the best legal knowledge engineer. If legal knowledge and legal reasoning are significantly different from other kinds of knowledge, as is claimed,156 then the lawyer will be able to understand and communicate with the legal expert where the non-legally trained may not. Knowledge engineering requires two quite separate skills: 1) interviewing or debriefing the expert, and 2) using that expert's knowledge in building a computer system that can solve the kinds of problems the expert is called upon to solve. This latter function seems rather like that of a ghost writer. There is some literature on how knowledge engineers should go about their task, but, on the whole, it is not very useful.157 Although the process of debriefing the expert for the HRA was begun with a structured approach somewhat like that recommended (that is, use of example problems), this was abandoned fairly early on as neither party to the exercise seemed very comfortable with it.
It proved much more useful to have an informal, unstructured discussion centered around a topic chosen in advance. It is hard to recommend any particular method because it seems that it is highly dependent on the personalities involved in the process. Although it is helpful to start with a planned method, so that time is not wasted wandering aimlessly at the start, the relationship between the two parties will at some point settle out to a procedure that suits the particular situation. It follows from this that training as a legal knowledge engineer with regard to debriefing the expert is not possible, and that learning by experience is the only way. Of course, training with regard to the computational aspects of knowledge engineering would be very useful. Nevertheless, building the HRA has generated a few points which it might be helpful to pass on to future legal knowledge engineers:
1. The goals of the system should be defined at the beginning of the debriefing sessions.
2. The subject area should be divided up into small components and prioritized. Whatever area of law is picked for computerization is bound to turn out to be too large to handle in the time allotted.
3. Regular meetings should be held to keep momentum going for both parties.
4. The material to be covered in the meeting should be decided beforehand so that both parties can come prepared.
5. It should be possible to get some sort of system up and running on the computer early so that both parties have the morale boost of seeing their efforts bear fruit.
6. Nevertheless, above all, flexibility must be maintained and the system must be changed should the expert so desire.
7. Meetings should be kept to one to one and a half hours since they are very intensive sessions and can be quite tiring.
8. Remember the expert always has the final say and must make all judgment calls since it is her or his expertise going into the system, not that of the knowledge engineer.
6.
The Inference Engine The inference engine is the computer software which interacts with the user and which reasons in accordance with the rules in the knowledge base to reach a solution for the user's case. The "distinguishing characteristic of the overwhelming majority of the [existing] systems is their dependence on deductive inference procedures."158 As an example, consider M.1, the software used to build the HRA. The knowledge base of an M.1 system consists of a series of questions and if-then rules. The program identifies the top-level goal for the system and then searches through the rules until it finds a rule which has that top-level goal as its conclusion. It then takes the premiss of that rule as the next goal to search for and it repeats the procedure. This continues until it encounters a premiss (sub-goal) for which there is no rule but there is a question. At this point it asks the user the appropriate question. This allows it to finally satisfy that chain of reasoning and to return to the next premiss that must be satisfied. This continues until all premisses (sub-goals) are satisfied and the system can make a final determination of the top-level goal. This process is called back-chaining. In his book, Susskind gives a thorough analysis of the place of deductive logic in legal reasoning.160 He also compares the thought processes through which lawyers proceed when giving legal advice to back-chaining and to forward-chaining.161 (Forward-chaining is the reverse of back-chaining, i.e., a forward-chaining system reasons from premisses to conclusions, not vice versa.) Deedman also compares lawyers' reasoning with back-chaining.162 Presumably this is done because of a perceived need to justify the method of reasoning chosen for the computer by comparing it to actual human thought processes. However, this is a mistake.
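Whatever its relation to human thought, the back-chaining procedure described above is easy to state computationally. The following sketch illustrates the general technique only; it is not M.1 itself, and the rule format, the back_chain function, and the tiny hearsay-flavoured knowledge base are all invented for the example.

```python
# A minimal back-chaining interpreter in the style described above.
# Each goal maps to a list of alternative rules, each rule being a
# list of premisses; goals concluded by no rule become questions.
RULES = {
    "admissible": [["is_hearsay", "falls_within_exception"]],
    "falls_within_exception": [["is_dying_declaration"],
                               ["is_business_record"]],
}

def back_chain(goal, answer):
    """Establish `goal` by finding a rule that concludes it and
    pursuing that rule's premisses as sub-goals in turn.  When no
    rule concludes a sub-goal, `answer` is consulted (in a real
    consultation, this is where the user would be asked a question)."""
    if goal not in RULES:
        return answer(goal)
    for premisses in RULES[goal]:
        if all(back_chain(p, answer) for p in premisses):
            return True
    return False

# Canned yes/no answers stand in for an interactive user.
facts = {"is_hearsay": True,
         "is_dying_declaration": True,
         "is_business_record": False}
print(back_chain("admissible", lambda q: facts[q]))  # prints True
```

A forward-chaining system would instead start from the answered facts and apply rules until the top-level goal was (or could not be) concluded; the knowledge base itself would be unchanged.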
According to Roger Schank, expert systems are not a model of cognitive processes.163 Therefore, there is no need to compare their reasoning processes with those of human beings. As a simple example, consider the problem of finding the median of a set of numbers. A human being would consult an elementary statistics book to obtain a formula and do a calculation on the numbers.164 However, a more accurate way to find the median is to sort through the numbers to find the number in the middle. This sort procedure involves partitioning the set of numbers and comparing them to each other. For a large set of numbers this would be a long and repetitive process that a human would not undertake. In contrast, this is a task well-suited for a computer.165 The solution reached is the same, the underlying theory is the same, but the mechanics of solving the problem differ. In this example, the computer does not model human reasoning. There is work being done in artificial intelligence which does attempt to model human cognitive processes. In the legal field, this includes work on legal argument, computer learning, and reasoning by analogy.166 This research is of general application whereas expert systems work is limited strictly to the legal domains involved. Expert systems work should be kept distinct from this more generally applicable work in artificial intelligence. In the case of the NSA and HRA, the systems really work by case-based reasoning by analogy and (for the latter) by drawing inferences from evidence to facts in issue. The computer accomplishes this task in the best way it can, not necessarily in the same way a human can do it. As far as expert systems are concerned, it is human judgment in the building of the system that is important. "...
[I]n every interaction with these systems there must be some ultimate human judgment without which the system is unable to offer any guidance."167 As long as the answer is what the expert would give, it does not matter how it is arrived at. It follows from this that there is no need to build one's own inference engine, nor to select a commercial shell on the basis of how close it seems to come to what is imagined to be legal reasoning. A rule-based, deductive commercial shell such as M.1 is quite sufficient for legal expert systems. There are better criteria than mode of reasoning to use in selecting a commercial shell. For instance, two aspects of building an expert system which have proved problematical with M.1 are control of the screen display and connection with an external database. It would be wonderful if the knowledge engineer could have the ability to tailor the screen display to suit her or his own purposes and to be as friendly as possible for the user. Likewise, it would be very helpful if the shell included links to external databases. Using these criteria, it is clear that M.1 has limitations. Its great advantage is its low cost and the short time needed to learn to use it. Also, a knowledge base which M.1 can use can be created with a standard word processing program. A simple prototype system can be up and running quickly. However, M.1 gives the knowledge engineer no control whatsoever over the screen display and it has proved rather difficult (though not impossible) to connect it to an external database. Its reliance on deductive reasoning and back-chaining is not a problem since these computational aspects of the system have nothing to tell us about legal reasoning. 7. The User Interface The decision as to who will be the user of a legal expert system and what background this person will have should be made at the beginning of the whole process. For the NSA and HRA the assumption is that the user will be a legally trained person.
On that basis, a certain level of knowledge may be presupposed. The task of building the system is thus constrained and kept to much more manageable limits. The HRA assumes that the user can properly organize the evidence in her or his case and will recognize some legal terms. If this were not so, there would be tremendous problems in organizing the case through the computer system. The system would become huge and require years of effort to build. Further, Susskind suggests that users must be aware of general principles of law that cannot be computerized.168 Thus users must have some legal training. He states that expert systems of this kind "... are for use by general legal practitioners: they will use the systems as powerful research tools which will be capable of drawing legal conclusions in specialized areas with which these users have no familiarity."169 There is some choice available in terms of user interfaces, that is, how the program accepts input from the user.170 Some legal expert systems require the user to enter information in a restricted subset of English or in some kind of computer language. Neither of these options is useful because of the need for the user to learn a great deal just to use the system, particularly in the latter case. A user should be able to sit right down at the computer and begin working with it with no preliminary effort. All the time spent at the computer should be fruitful. The ideal option would be to have the user type in the facts in English, but until natural language processing becomes practical, that option is unavailable. Thus, present legal expert systems must use an interactive format in which the program asks questions of the user and the user answers with "yes", "no" or a choice from a specified list. This is the format which M.1 uses. It seems to work well and might only run into problems if the user has to answer a very long set of questions, thus getting tired and bored before the end.
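The restricted question-and-answer format just described can be sketched as a simple prompting loop. The ask function, the question wording, and the re-prompting behaviour are invented for illustration; this is not a reproduction of M.1's actual interface.

```python
def ask(question, choices=("yes", "no"), get_input=input):
    """Put a question to the user and accept only an answer drawn
    from the specified list of choices, re-asking otherwise."""
    prompt = f"{question} ({'/'.join(choices)}) "
    while True:
        reply = get_input(prompt).strip().lower()
        if reply in choices:
            return reply

# Simulated session: the first reply is invalid, so the question
# is silently re-asked and the second reply is accepted.
replies = iter(["maybe", "yes"])
answer = ask("Is the declarant available to testify?",
             get_input=lambda prompt: next(replies))
print(answer)  # prints yes
```

The same loop serves for choices from a specified list by passing, say, choices=("dying declaration", "business record") instead of the yes/no default.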
Both the NSA and HRA work quickly and ask relatively few questions. The expert system not only has to gather information from the user, it also must display its conclusions. Ideally, the program should allow the expert and the knowledge engineer to design the form of conclusions to suit their domain. The form and content of conclusions will vary with the legal domain being computerized and with the preferences of the expert. Most expert systems have built into them some means, however rudimentary, of explaining their reasoning. That is, they can explain to the user why they have asked a particular question. This is called transparency of the system.171 Transparency is achieved either by reproducing the rule the system is considering at the time or by providing a prepared text. M.1 will respond to the question "why?" by either of these means. For the NSA and HRA, prepared texts have been substituted for the explanations the systems give. The rules in the knowledge base are not suitable for presentation to a user since they cannot be easily understood. Friendliness of the system is an important aim of this work. For legal expert systems, the program should also produce legal authority for any conclusions that it draws. Lawyers would rightly be reluctant to accept the results if they had no idea how those results were achieved. Bare conclusions are not persuasive in court. Case authority can be produced throughout the consultation in response to the user's demands, or it can be placed at the end of the system to be produced in summary form. For the NSA and HRA, legal authority is made available to the user at the end of the consultation and she or he is given a choice of cases to view. All the cases in the domains have been researched, and these would make far too long and complicated texts if they were all stuffed into the explanations. 8.
Databases Databases for legal expert systems are absolutely necessary, because such systems must support their conclusions with legal authority. Existing commercial legal databases provide full text case reports or headnotes.172 Unfortunately, headnotes are not useful for expert systems because they do not contain enough facts. For example, cases which are important for evidence law very often are mostly concerned with another substantive legal issue. The evidence points get short shrift in the case report and only one sentence in the headnote. Thus the headnote is quite insufficient. It also seems unhelpful to produce the full text of a case for a user when the system only uses a very small portion of it. When a case is of interest only for a single issue, a summary of that case with respect to that issue alone is far more useful than the headnote or the full text. Thus, even if access to a commercial legal database through expert systems were available, it would probably be of little use. Therefore, a database, which collects cases that are relevant to the particular domain and are summarized from the point of view of that domain, must be created for an expert system. This requires a tremendous amount of work researching cases and entering them in a database. An automatic information scanner might help if it could copy handwritten case summaries. Also, if work is being done in another legal domain where full texts are indeed useful, and there is storage available for vast amounts of data, automatic entry of full texts might be possible. But until advances are made in storage capacity for personal computers and in automated reading of texts, database construction must rely on student researchers to brief cases and to enter them in the computer. A large part of the actual cost of a legal expert system will be spent on creating a database.
Finally, at some point as yet far in the future, the ultimate goal in the best of all worlds would be a computer which could read and interpret the cases itself.173 If possible, databases which are capable of standing alone should be built. Thus, if the knowledge base in the system itself requires a major overhaul, the database can still be used independently. For the HRA, the intention is for the case file (not the profile sheet files) to be available as a stand-alone database. 9. Evaluation of the Expert System There are a number of possible features of the system to look at when considering its evaluation. First, the system will present an appearance of intelligence during the consultation with the user. The questions will appear in appropriate order and context. For instance, if the system has already found out that the hearsay declaration under consideration is a dying declaration it makes no sense for it to ask the user about business documents. Secondly, the system will have a minimum number of dead-end questions; questions which end up giving little or no useful information if the user answers in the negative are minimized. Thus, chains of questions should be as short as possible and questions should be as widely applicable as possible. So, for the hearsay system, one question should do for as many exceptions as possible. Thirdly, the system will be fairly easily changed. Not only should the database be built in such a manner as to simplify the addition of new cases, but also the knowledge base should be easily adapted if there is a change in the law. Fourthly, the system can be compared with test cases. For the hearsay system, a random 20% of the total cases researched for each exception have been kept out of consideration in building the system. These cases can be used as test cases. The system evaluates the test case and the result is compared with the actual result in the case. Fifthly, the system can be tested with hypothetical cases.
In such a case, the correct answer to which the system's answer should be compared is the answer the expert would give to that hypothetical. Finally, acceptance and use by the legal community must be the ultimate test and perhaps the most valuable. The system will not be accepted if, over time, it appears to be unreliable. It will also not be used if users find it too difficult or confusing. Thus, the final acid test must be acceptance by the community. Chapter 5 CONCLUSION 1. Review At this point the questions set out in the first chapter can be answered. For the first question, admissibility of hearsay evidence can be fairly accurately predicted on the basis of an evaluation of its reliability and consideration of whether or not it is the best evidence. Reliability of a hearsay statement is found where there are circumstances to show that there was perception, belief and intention to tell the truth on the part of the declarant with respect to the matters reported in the statement. The success of this theory in explaining the cases suggests that judges actually may be doing a subconscious evaluation of reliability. For the second question, this theory of reliability can indeed be used as the basis for an expert system on hearsay evidence. The result is a system which classifies hearsay statements into classes containing examples of reliability. This yields a simple, efficient, straightforward system that can solve hard cases. Susskind sets out five criteria for choosing an area of law for computerization. These are as follows: 1. In so far as any area of law is self-contained, it is desirable that the chosen domain be relatively autonomous, the sources being limited in number and reasonably well defined. ... 2. The chosen legal domain must be one whose problems do indeed require expertise, and not simply brief research, for their resolution. ... 3. Intensive coverage of a small legal domain is preferable to superficial coverage of an extensive area of law. ...
4. A domain in which there is agreement amongst experts over its scope and content is to be preferred to one in which there is no such consensus. ... In this connection, it should be added that legal domains in which there seem to be many conflicting legal rules — valid rules that for the same set of circumstances seem to dictate diverging legal consequences — are unsuitable domains of application. 5. Legal domains whose problems for their resolution require the use of a great deal of 'common-sense' knowledge are unsuitable for expert systems work.174 Susskind's first point is well taken, as this is exactly what caused problems with the first part of the HRA. The definition of hearsay is not autonomous but leads directly to a consideration of relevance. The second point, of course, is necessary if one goal of the system is to advise lawyers. It follows that the system must be built in an area in which lawyers do need advice. Nevertheless, if the goal is to test a theory of law, this may not be a concern. The third point is also necessary if the system is to be useful to lawyers. Lawyers would more likely find a deep system of use than a shallow one. The former exhibits the expertise they need to consult whereas the latter does not. Also, for purposes of testing particular theories and concepts, a shallow system could never give adequate evaluation of the broad and generalized theory that must lie behind it. The fourth point is controversial. A deep structure analysis can resolve conflicting lines of legal authority. As for the fifth point, it may be quite possible to find some explanation of what appear to be common sense terms through a detailed deep structure analysis of the cases. To Susskind's criteria should be added a sixth: The area chosen to be computerized should be relatively stable and changing slowly (as, for example, most case law).
It takes two or more years to build an expert system and it would be tragic to spend the time building the system only to have the law undergo such a complete change that all the work is invalidated. 2. Overview of Expert Systems in Case-Based Law An expert system in case-based law will focus on a very narrow domain in law in order that it can display expertise and be built within a reasonable time frame. The field of law chosen will be changing only slowly if at all. Changes in the law may instantly render a large portion of a legal expert system obsolete. The system will rely on a deep structure model which accounts for a maximum set of case results and will take a fact-oriented approach. The opposite approach, the doctrine-based approach, produces a cumbersome, inefficient system which cannot solve hard cases. These systems will function primarily as intelligent research tools; thus, they must back up their conclusions with legal authority. Interaction with the user will be handled primarily with a question and answer format as this is the only reasonable and user-friendly option available given the present state of computer technology. The system will provide explanations of its reasoning during the consultation. Without this, it would not appear friendly to the end user and may not be acceptable at all, since users cannot evaluate the reliability of the system if they cannot understand its reasoning. 3. Issues for Future Consideration There are several issues that remain to be addressed in this context. Evaluation of these systems has been briefly touched on above but it requires more study. In particular, how these systems should be made available to the legal community in order to obtain an evaluation needs consideration. Also, once the system is available for use, perhaps records should be kept of who uses the system, the advice the system gives and the ultimate disposition of the case. This would provide ongoing evaluation.
It also would be useful to consider some problems with respect to databases, for example: whether summaries of cases alone are adequate, whether judges should be rigorously quoted, and when cases can be disregarded because of insufficient facts in the case report or because they appear to be wrongly decided. Another potential issue with respect to these systems is liability. If the systems are considered and used in the same way as legal textbooks there would be no liability. However, these systems do purport to render legal advice. That feature, together with the tremendous time savings in research which they make available, suggests that lawyers may rely on these systems to a greater extent than if they were only textbooks. There will always be the pressure of economy, to save time and factor inputs, especially when these are in quite short supply. There will also be the even greater pressure of man's inclination to accept seemingly concrete answers and to avoid responsibility for hard decisions, especially when escape from such pressure appears to be found in computer-based answers and decisions. The computer is thus easily converted into an anxiety-relieving deus ex machina, whose utterances are susceptible of being interpreted as those of Allah.175 If the system is mistaken, could its builders be open to a malpractice suit?176 Conversely, if the system proves to be highly reliable and an individual lawyer does not consult it, could that lawyer be liable in negligence for not preparing the best case possible for a client?177 A third issue to consider is the effect of this technology on the development of the law. Because these systems give a kind of legal advice, if they are extensively consulted and relied upon they could have a unifying impact on the development of the law. If the NSA is made widely available and remains the only such system, then the law of nervous shock in British Columbia becomes the law of nervous shock according to Professor J.C. Smith.
Although in any legal domain only a small handful of experts dominate the legal textbooks, still the expert system can greatly magnify the impact of a single expert. As these systems are presently just being built, a unique opportunity exists to observe the impact of this technology on the development of the law. In conclusion, expert systems are a viable prospect for law. Not only do they greatly facilitate legal research for practicing lawyers, but they also afford legal theorists unique opportunities to test their theories. Samek maintains that [t]he intuitive hunch of a good lawyer can never be embodied in a computer programme, but it can in many cases be exercised more effectively with its help. ... Artificial intelligence can greatly enhance natural intelligence as long as it is not misused to replace it.178 Back in 1966, Lord Bowden had the following to say in the House of Lords: The work is just beginning. The tasks will be vexatious; they will be difficult; they will be time-consuming. I can assure any man who tries to do it that there will be moments of despair when he will decide that the prospect is hopeless; but I can also assure him that this is the common lot of people who use computers, and in the end these machines' extreme speed and fantastic ability to reason and to process data will come to his aid. I feel certain that this is the kind of enterprise in which the application of this vastly powerful technique to an ancient and traditional society will be extremely advantageous. These words are still applicable today. Future developments should be encouraged and will be eagerly awaited. NOTES Chapter 1. INTRODUCTION TO EXPERT SYSTEMS IN LAW 1 Paul Harmon and David King, Expert Systems, (New York: John Wiley and Sons, 1985), 32-33. 2 Ibid., 5. 3 Constantin V. Negoita, Expert Systems and Fuzzy Systems (Menlo Park, Calif.: The Benjamin/Cummings Publishing Company, Inc., 1985), 1.
4 Richard Susskind, Expert Systems in Law (Oxford: Clarendon Press, 1987), 9. 5 Harmon and King, Expert Systems, 260. 6 Ibid. 7 Susskind, Expert Systems in Law, 91. 8 Negoita, Expert Systems and Fuzzy Systems, 69. 9 Lawrence Tribe, "Triangulating Hearsay," Harvard Law Review 87 (1974): 957. This device lays out the factors which make hearsay evidence unreliable on a simple geometric construct. Using it, the exceptions can be shown to have characteristics that reduce the importance of those factors or remove them entirely thus making the evidence seem more reliable. 10 See Harmon and King, Expert Systems, 263. 11 Also see ibid., 261. 12 Susskind, Expert Systems in Law, 13. 13 Ibid., 19. 14 Some of the successes and partial successes to note follow: According to Michael Heather, James Sprowl's ABF Processor has been used for drafting of wills. Heather also mentions a British system which operated for a short time in pilot projects but which did not last because of the problems of continually updating it. (Michael A. Heather, "A Demand-Driven Model for the Design of a 'Half-Intelligent' Common Law Information Retrieval System," in Computing Power and Legal Reasoning, ed. Charles Walter, [St. Paul, Minn.: West Publishing Co., 1985], 75.) Sprowl's report of his work can be found in James A. Sprowl, "The Automated Assembly of Legal Documents," in Computer Science and Law, ed. Bryan Niblett, (Cambridge, England: Cambridge University Press, 1980), 195-205. Richard Susskind's own Latent Damage Advisor is now being marketed in Britain (personal communication from Richard E. Susskind to Professor J. C. Smith, 1988). Of course, Nervous Shock Advisor (Vancouver, B.C.: University of British Columbia, Law and Computers Project) built by Deedman and Smith of the Law Faculty is also a great success. 15 Michael G. Dyer and Margot Flowers, "Toward Automating Legal Expertise," in Computing Power and Legal Reasoning, ed. Charles Walter, (St.
Paul, Minn.: West Publishing Co., 1985), 50. 16 Susskind, Expert Systems in Law, 19. The other enterprises to which he refers are the most famous of expert systems - MYCIN, PROSPECTOR and DENDRAL. All three have been well received. They were built at Stanford University between 1965 and 1980. See Harmon and King, Expert Systems, 15-21, 134-135, 145-150 for descriptions of these three systems. Also see Bruce G. Buchanan and Edward H. Shortliffe, eds., Rule-Based Expert Systems (Reading, Mass.: Addison-Wesley Publishing Company, 1984). 17 Susskind, Expert Systems in Law, 74. 18 Michael A. Heather, "Future Generation Computer Systems in the Service of the Law," in Automated Analysis of Legal Texts, ed. Antonio A. Martino and Fiorenza Socci Natali, (New York: Elsevier Science Publishing Co., 1986), 647. 19 Heather, "A Demand-Driven Model," 75. 20 See G. C. Deedman, "Building Rule-Based Expert Systems in Case-Based Law," (LLM Thesis, University of British Columbia, 1987), 17 and R. A. Samek, "Language, Communication, Computers and the Law," Dalhousie Law Journal 9 (1985): 211. 21 Dyer and Flowers, "Toward Automating Legal Expertise," 50. 22 Jurgen W. Goebel and Reinhard Schmalz, "Problems of Applying Legal Expert System in Legal Practice," in Automated Analysis of Legal Texts, ed. Antonio A. Martino and Fiorenza Socci Natali, (New York: Elsevier Science Publication Co., 1986), 614. 23 Anne von der Leith Gardner, An Artificial Intelligence Approach to Legal Reasoning, (Cambridge, Mass.: MIT Press, 1987). The computer system built by Gardner solves clear cases in contract law. 24 Harmon and King, Expert Systems, 257. 25 Susskind, Expert Systems in Law, 19, 48-49. 26 Towne v. Eisner (1918), 245 U.S. 418, 425; 38 Sup. Ct. 158; 62 L. Ed. 372. 27 Webster's Third New International Dictionary, s. v. "heuristic." 28 Aurel David, "Heuristics," in Automated Analysis of Legal Texts, ed. Antonio A. 
Martino and Fiorenza Socci Natali, (New York: Elsevier Scientific Publications Co., 1986), 63. 29 See for example, Susskind, Expert Systems in Law, 163-203 and Gardner, Artificial Intelligence Approach to Legal Reasoning, 30-31. Also see Chaim Perelman, Justice, Law and Argument, (Dordrecht, Holland: D. Reidel Publishing Company, 1980), 125-135; Edward H. Levi, An Introduction to Legal Reasoning, (Chicago: University of Chicago Press, 1948), 8-9; D. H. J. Hermann, "Legal Reasoning as Argumentation," Northern Kentucky Law Review 12 (1985): 469. 30 MYCIN diagnoses bacterial diseases of the blood and recommends treatments. The MYCIN project is described in Buchanan and Shortliffe, Rule-Based Expert Systems. 31 The Oxford Project on the Scottish Law of Divorce is reported by Susskind in his book, Expert Systems in Law. 32 Randall Davis and Jonathan J. King, "The Origin of Rule-Based Systems in AI," in Bruce G. Buchanan and Edward H. Shortliffe, eds., Rule-Based Expert Systems (Reading, Mass.: Addison-Wesley Publishing Company, 1984), 48. 34 Susskind, Expert Systems in Law, 217. 35 This work is explained in Deedman, "Building Rule-Based Expert Systems" and J. C. Smith and G. C. Deedman, "The Application of Expert Systems Technology to Case-Based Law," in The First International Conference on Artificial Intelligence and Law, Proceedings of the Conference (Boston, Mass.: The Center for Law and Computer Science, Northeastern University, 1987), 84-93. 36 Susskind, Expert Systems in Law, 16-18. To those listed by Susskind should be added his own Latent Damage Advisor, now available commercially in England from Butterworths (personal communication by R. Susskind to Prof. J.C. Smith) and EVIDENT (see Jay Liebowitz and Janet S. Zeide, "EVIDENT: An Expert System Prototype for Helping the Law Student Learn Admissibility of Evidence Under the Federal Rules," Computers in Education 11 (1987): 113.).
Since a great deal of work is being done in this area, there may well be others in existence. 37 Susskind, Expert Systems in Law, 20. 38 Ibid., 44. 39 See Ronald Dworkin, Taking Rights Seriously (Cambridge, Mass.: Harvard University Press, 1977), 15-16. 40 Ibid., 17. 41 For an excellent review of the development of American legal realism see R. S. Summers, "Pragmatic Instrumentalism in Twentieth Century American Legal Thought," Cornell Law Review 66 (1981): 861. Note that the modern version of realism, critical legal studies, holds that legal decisions are never rule-governed. For example, see J. W. Van Doren, "Impasse: Is there a Beyond?" Western State University Law Review 13 (1986): 493-512. 42 Summers, "Pragmatic Instrumentalism," 863. 43 An example of a principle is 'a man shall not profit from his own wrong.' This principle was applied in the famous case of Riggs v. Palmer (1889), 115 N.Y. 506, 22 N.E. 188 (C.A.). See Dworkin, Taking Rights Seriously, 23-39. 44 Deedman, "Building Rule-Based Expert Systems," 33-34. 45 Ibid., 38-39. 46 Susskind, Expert Systems in Law, 246. Also see T. W. Bechtler, "American legal realism revaluated," in Law in a Social Context, ed. T. W. Bechtler, (The Netherlands: Kluwer, 1978), 25. In describing rule skepticism, Bechtler states: "Rule sceptics distinguished three kinds of rules: the basic distinction is between normative rules for doing something, and empirical rules of doing something. A judicial decision can be considered as a fact and further divided into the rules which the judge announces in the written decision and the rule which is implied in what the judge does. Someone like a lawyer engaged in predicting future decisions will achieve the best result in extrapolating from the rules which underlie what the judge does and not what he says. ..." Also see Note, "The Theoretical Foundation of the Hearsay Rules," Harvard Law Review 93 (1980): 1811. 47 S. C. Coval and J. C.
Smith, Law and its Presuppositions, (London: Routledge & Kegan Paul, 1986). 48 Ibid., 28. 49 Ibid., 45. 50 Ibid., 46-53. In contrast, Hart would choose the a goal (form) over the b goal (substance) while Fuller would make the opposite choice. See H. L. A. Hart, "Positivism and the Separation of Law and Morals" in Essays in Jurisprudence and Philosophy, ed. H. L. A. Hart, (Oxford: Clarendon Press, 1983), 49-87 and Lon Fuller, "Positivism and Fidelity to Law," Harvard Law Review 71 (1958): 630-672. 51 Coval and Smith, Law and its Presuppositions, 52. 52 They are to be distinguished from the goals considered by both the positivists and the realists. See Susskind, Expert Systems in Law, 241; Dworkin, Taking Rights Seriously, 169-171; Summers, "Pragmatic Instrumentalism," 863. Also see Coval and Smith, Law and its Presuppositions, 41-61 and R. S. Summers, "Two Types of Substantive Reasons: The Core of a Theory of Common-Law Justification," Cornell Law Review 63 (1978): 707-788. 53 Deedman, "Building Rule-Based Expert Systems," 74. 54 Ibid., 15. 55 Ibid., 76. Chapter 2. BUILDING THE HEARSAY RULE ADVISOR 56 In early 1988, Teknowledge gave up its expert system shell work and no longer supports M.1. 57 Ashton-Tate is located in Torrance, California. 58 This program is being written by Keith MacCrimmon. 59 Deedman, "Building Rule-Based Expert Systems," 77-78. 60 Tribe, "Triangulating Hearsay," and Richard D. Friedman, "Route Analysis of Credibility and Hearsay," Yale Law Journal 96 (1987): 690-701. 61 Ibid., and Sir Rupert Cross and Colin Tapper, Cross on Evidence, 6th ed. (London: Butterworths, 1985), 457-458. Also see Note, "Theoretical Foundation of Hearsay," 1787, 1794-1799. 62 See Cross and Tapper, Cross on Evidence, 565-570 for the conditions for admissibility of declarations against interest. 63 Marvin Minsky, The Society of Mind, (New York: Simon and Schuster, 1986).
64 For instance, the dying declarations and spontaneous declarations exceptions are often viewed as applying to the same set of facts. See R. v. Mulligan (1973), 23 C.R. (N.S.) 1 (Ont. S.C.) and R. v. Presley, [1976] 2 W.W.R. 258 (B.C. S.C.). As another example, the Supreme Court of Canada substantially changed the common law business documents exception in Ares v. Venner, [1970] S.C.R. 608, 14 D.L.R. (3d) 4. The Ontario High Court expanded it further in Setak Computers v. Burroughs Business Machines (1977), 15 O.R. (2d) 750 (Ont. H.C.). 65 See Figs. 1-6 set out in Chapter 3 below at pp. 31-36 and the discussion that accompanies them at pp. 42-43. 66 Cross and Tapper, Cross on Evidence, 565-570, 572-574, 576-578, 587-589. 67 For example, the presence of multiple external injuries with heavy bleeding which would be obviously fatal in the eyes of an observer may found an inference that the victim of an attack knew he was dying. See R. v. Woodcock (1789), 1 Leach 500; 168 E.R. 352 and R. v. Mulligan (1973), 23 C.R. (N.S.) 1 (Ont. S.C.) but see contra R. v. Morgan (1875), 14 Cox C.C. 337 (Kent Lent Assizes) where the reporter states that the judges were reluctant to draw an inference that the victim knew he was dying from very serious injuries alone. In Morgan, the victim had a serious throat wound which bled heavily, and he died of his injuries within fifteen minutes. These cases are included in Table 1, pp. 25-26 below. Woodcock is case no. 31, Mulligan is case no. 22, and Morgan is case no. 21. 68 John Henry Wigmore, Evidence in Trials at Common Law, vol. 5, revised by James H. Chadbourn (Boston, Mass.: Little, Brown and Company, 1974). 69 David Binder and Paul Bergman, Fact Investigation, (St. Paul, Minn.: West Publishing Co., 1984), 82-89, 90, 98. 70 Robert B. McCall, Fundamental Statistics for Psychology, (New York: Harcourt Brace Jovanovich, Inc., 1975), 167-169, Table J. at 366-367. 71 See step 11 of FLEX methodology at p. 17 above. 72 R. v. Debortoli, [1926] S.C.R. 492.
73 See Arthur Goodhart, "Determining the Ratio Decidendi of a Case," Yale Law Journal 40 (1930): 170. For example, compare the case report of less than half a page for Errington and Others' Case (1838), 2 Lew. C.C. 148, 168 E.R. 1110 with the pages-long report in R. v. Howell (1845), 1 Cox C.C. 151 (Crown Case Reserved). 74 Of all the dying declaration cases, only two indicated what other evidence there was, namely, R. v. Woods (1897), 5 B.C.R. 585 (B.C. Ct. of Crim. App.) and The King v. MacNeill (1945), 18 M.P.R. 175 (P.E.I. S.C.). Chapter 3 THE HEARSAY RULE ADVISOR 75 Both these definitions are taken from the proposed Uniform Evidence Act, s. 1. This Act can be found in Uniform Law Conference, Report of the Federal-Provincial Task Force on Uniform Rules of Evidence (Toronto: The Carswell Company Limited, 1982), App. 4. The hearsay rule has numerous exceptions. The ones mentioned in this thesis are as follows: Dying declarations - a dying murder victim makes a statement as to the culprit. The rationale of this exception is that a dying person would not wish to meet her or his Maker with a lie on her or his lips. Declarations against interest - a person makes a hearsay declaration that could or will cause her or him a loss, usually financial. It is thought that most persons would not acknowledge a fact which is against their interest (for instance, an admission of negligence in a motor vehicle accident thus exposing themselves to legal liability) unless it is true. Declarations in course of duty - an old common law exception for hearsay statements created by employees in connection with their jobs. The rationale for this exception may be that there is too much of a chance of getting caught if the employee is not truthful. Business documents - an exception for written records routinely produced and used by businesses in the conduct of their everyday affairs. Businesses would ensure the accuracy of these records because of their reliance on them.
Spontaneous declarations - statements that are made in quick response to a startling event, usually traumatic. A common example is statements by the victims of violent crimes as to the perpetrator. The usual justification for this exception is that the declarant has no time to fabricate an untrue story. See Cross and Tapper, Cross on Evidence and Wigmore, Evidence. 76 Ibid., 124-125 and Wigmore, Evidence, 3-10. Note that, besides the reason given, there may have also existed a distrust of the jury's ability to evaluate hearsay evidence at the time the rule developed. See Cross and Tapper, Cross on Evidence, 457. 77 Uniform Law Conference, Report on Uniform Rules of Evidence, 61 and James B. Thayer, Preliminary Treatise on Evidence, (Boston: Little, Brown and Company, 1898), 254-265. 78 Uniform Law Conference, Report on Uniform Rules of Evidence, 125, Tribe, "Triangulating Hearsay," 958. Also see Setak Computers v. Burroughs Business Machines (1977), 15 O.R. (2d) 750 (Ont. H.C.) at 762. 79 Wigmore, Evidence, 253. 80 Ibid., 254. 81 Ibid., 253. 82 See Federal Rules of Evidence for United States Courts and Magistrates, (St. Paul, Minn.: West Publishing Co., 1975), 106, 138, ss. 803.24 and 804.5. These two American exceptions to the hearsay rule preserve this best evidence principle. 83 Wigmore, Evidence, 252-253. 84 R. v. Osman (1881), 15 Cox C.C. 1 (Chester Assizes). 85 Federal Rules of Evidence, ss. 803.24 and 804.5. Both sections are similar. The first applies where the declarant's availability is immaterial; the second applies where the declarant must be unavailable. 86 Tribe, "Triangulating Hearsay," 958-961; Friedman, "Route Analysis," 685; Cross and Tapper, Cross on Evidence, 457-458; Uniform Law Conference, Report on Uniform Rules of Evidence, 125. 87 See Nembhard v. The Queen, [1982] 1 All E.R. 183 (P.C.). 88 On credibility of a story see Binder and Bergman, Fact Investigation, 136-145. 89 R. v. Osman (1881), 15 Cox C.C. 1 (Chester Assizes), 3. 90
Exceptions are two cases involving children: R. v. Pike (1829), 3 C. & P. 598, 172 E.R. 562 (Shropshire Assizes) and R. v. Perkins (1840), 9 C. & P. 395, 173 E.R. 884. 91 Note, "Theoretical Foundation of the Hearsay Rules," 1809. Spontaneous declarations are referred to by the author as "excited utterances." 92 There may be a recent shift in dying declarations cases toward considering them as a type of spontaneous declaration. In modern cases, the declarations tend to qualify both as dying declarations and as spontaneous declarations and both exceptions are considered. See R. v. Mulligan (1973), 23 C.R. (N.S.) 1 (Ont. S.C.), R. v. Presley, [1976] 2 W.W.R. 258 (B.C. S.C.), and R. v. Moase (1988), unreported (B.C. S.C., Vancouver Registry X018818, personal communication from Ravi Hira, Crown counsel, August 1988). Modern medicine probably ensures that a victim of an assault will survive to testify if she or he makes it as far as the hospital. Thus the majority of dying declarations will be made within minutes of the actual injury. This makes them seem like spontaneous declarations. Note, however, that spontaneous declarations have also been challenged as to their rationale. 93 Wigmore, Evidence, 253. 94 R. v. Moase, unreported, B.C. S.C. Vancouver Registry X018818. 95 Ares v. Venner, [1970] S.C.R. 608, 14 D.L.R. (3d) 4. Chapter 4 SELECTED ASPECTS OF EXPERT SYSTEMS IN LAW 96 Compare the account in Deedman, "Building Rule-Based Expert Systems," with this account. 97 Note, "Theoretical Foundation of the Hearsay Rules," 1811. This author suggests the judges may be "motivated by a desire, perhaps subconscious, to feel that the system to which they have devoted their energy is worthy of society's acceptance as a system of justice." As a result, they may subconsciously use the hearsay rule to shelter the system from embarrassment. 98 Nervous Shock Advisor. Note that this is one of the easier questions to answer. Like the HRA, the NSA uses its share of vague words.
For instance: "Was the other person so permanently disfigured, deformed or otherwise altered by the harm they suffered that their appearance would be shocking to the average person?" (emphasis added). 99 The form of conclusions remains to be finalized for the HRA. 100 Susskind, Expert Systems in Law, 27. 101 Ibid., 154, 249. 102 H. L. A. Hart, The Concept of Law (Oxford: Clarendon Press, 1961), 89-96. 103 Susskind, Expert Systems in Law, 241, expresses the positivist's distrust of purpose. 104 Susskind, Expert Systems in Law, 28. 105 But see Susskind, Expert Systems in Law, 237-245 where he assesses the "Argument from Unimportance" and attempts to rebut it. 106 See pp. 58-59 below. 107 Susskind, Expert Systems in Law, 38. Susskind also includes statute and case-law derivations, statute and case-law predictions, and heuristic rules in his system. These are all explained in Expert Systems in Law, 80-110. 108 Douglass T. MacEllven, Legal Research Handbook, 2d ed., (Toronto: Butterworths, 1986), 2. 109 Susskind, Expert Systems in Law, 83. 110 J. C. Smith, Legal Obligation, (Toronto: University of Toronto Press, 1976), 2. 111 Susskind, Expert Systems in Law, 102. 112 Ibid., 55. 113 See ibid., 241. 114 Summers, "Pragmatic Instrumentalism," 887. Also see J. W. Harris, Legal Philosophies (London: Butterworths, 1980), 129. 115 S. C. Coval and J. C. Smith, "Rights, Goals, and Hard Cases," Law and Philosophy 1 (1982): 465-468. 116 See Harris, Legal Philosophies, 165-168 and Susskind, Expert Systems in Law, 85. Also see Goodhart, "Ratio Decidendi" and Julius Stone, "The Ratio of the Ratio Decidendi," Modern Law Review 22 (1959): 597-620. 117 Hart, The Concept of Law, 131. 118 This is one interpretation of the rules provided by Susskind as illustrations of the Oxford Project in Appendix II to his book. Susskind, Expert Systems in Law, 262-269. 119 Deedman, "Building Rule-Based Expert Systems," 50 and Smith, Legal Obligation, 89.
120 Deedman, "Building Rule-Based Expert Systems," 51 and see Hart, The Concept of Law, 124. 121 L. Thorne McCarty, "Intelligent Legal Information Systems," in Data Processing and the Law, ed. Colin Campbell, (London: Sweet and Maxwell, 1984); id., "The TAXMAN Project," in Computer Science and Law, ed. Bryan Niblett, (Cambridge, England: Cambridge University Press, 1980), 23-43; id., "Reflections on TAXMAN," Harvard Law Review 90 (1977): 837-893. 122 Carole D. Hafner, An Information Retrieval System Based on a Computer Model of Legal Knowledge (Ann Arbor, Mich.: UMI Research Press, 1981), 118-120. 123 Goodhart, "Ratio Decidendi," 170. 124 Ibid., 169 and Susskind, Expert Systems in Law, 182. Also see Gardner, Artificial Intelligence Approach to Legal Reasoning, 48. 125 L. Thorne McCarty, "Intelligent Legal Information Systems," 126. 126 It is worth quoting from his paper at some length to try to understand what he is referring to. What is our purpose in building a conceptual model of a legal domain? Basically, we are looking for a language in which we can describe legal cases. At a minimum, we want to be able to describe the "facts" of a case at a comfortable level of abstraction, and we want to be able to describe the "law" in our chosen area of application, where the "law" consists of a system of "rules" and "concepts" which specify the rights and the obligations of the various parties. What language should we use? It is well known that different languages incorporate different sets of assumptions about the structure of the world. We are looking here for language which is rich enough to express the important facts about a particular legal world, and yet abstract enough to suppress the irrelevant detail. The purposes of our conceptual model, then, are to specify exactly which of these details should be expressed, and which should be suppressed, and how.
To build a model along these lines, we first examine a legal domain, such as corporate tax law, and we identify the conceptual objects and the conceptual relationships which occur in that domain. ... [McCarty, "Intelligent Legal Information Systems," 128] McCarty's use of the word "language" partly refers to the fact that TAXMAN is an original program and does not use a commercial shell. Nevertheless, what seems to come out in the above quote, and is evident in the rest of his paper, is that TAXMAN focusses first on facts and then on their legal consequences. In other words, his system is another example of a fact-oriented approach. 127 Hart, "Positivism and the Separation of Law and Morals," 63-64. 128 Neil D. MacCormick, H. L. A. Hart, (London: Edward Arnold (Publishers) Ltd., 1981), 125. 129 Susskind, Expert Systems in Law, 188. 130 For instance, see questions Q7 through Q11 above at pp. 51-52. 131 Nervous Shock Advisor, Vancouver, B.C.: University of British Columbia, Law and Computers Project. 132 At present, the difference between the two is expressed by the system as "The evidence is admissible" versus "The evidence is probably admissible." 133 See Table 1 above at pp. 25-26. 134 This would turn the system into a prediction program, perhaps similar to those mentioned by Gardner as being a product of work in jurimetrics. Gardner, Artificial Intelligence Approach to Legal Reasoning, 75-76. 135 Susskind, Expert Systems in Law, 192-193. 136 See Chapter 1 above at p. 8 and note 36. 137 Hart, The Concept of Law, 121. 138 This is not always so. For instance, in some statutes, it might be useful to treat them essentially as facts of paradigm cases as it seems McCarty has done with corporate taxation in TAXMAN and as the HRA does with business documents. Conversely, the authoritative case precedent may itself contain such a clear statement of the general rule that it can be dealt with in a manner similar to a statute. 139 Joseph J.
Spengler, "Machine-Made Justice: Some Implications," in Jurimetrics, ed. Hans W. Baade, (New York: Basic Books, Inc., 1963), 47-48. Note, however, that Levi thinks this is not strictly correct. Levi, Legal Reasoning, 27-28. 140 R. S. Summers, Lon L. Fuller, (London: Edward Arnold, 1984), 95-96. 141 See Deedman, "Building Rule-Based Expert Systems," 37. 142 Summers, Fuller, 112. 143 See Harris, Legal Philosophies, 150. 144 See Susskind, Expert Systems in Law, 85 and Harris, Legal Philosophies, 165-168. 145 Canada Evidence Act, R.S.C. 1970, c. E-10 as am., s. 30. 146 Evidence Act, R.S.B.C. 1979, c. 116 as am., s. 48. 147 See McCarty, "Intelligent Legal Information Systems." 148 Hafner, An Information Retrieval System. 149 Levi, Legal Reasoning, 28. 150 Susskind, Expert Systems in Law, 60. 151 See Robert R. Hoffman, "The Problem of Extracting the Knowledge of Experts from the Perspective of Experimental Psychology," AI Magazine (Summer 1987): 60. 152 Ibid. 153 Deedman, "Building Rule-Based Expert Systems," 75. 154 See Chapter 3 above. 155 Deedman, "Building Rule-Based Expert Systems," 7-8. 156 See Chapter 1 above at pp. 3-4. 157 For example, see David S. Prerau, "Knowledge Acquisition in the Development of a Large Expert System," AI Magazine (Summer 1987): 43-51 and Hoffman, "Extracting the Knowledge of Experts." 158 Susskind, Expert Systems in Law, 25. 159 See Harmon and King, Expert Systems, 53-57, 257. 160 Susskind, Expert Systems in Law, 163-253. 161 Ibid., 208-213. 162 Deedman, "Building Rule-Based Expert Systems," 83-84. 163 Roger Schank and Peter Childers, The Cognitive Computer, (Reading, Mass.: Addison-Wesley, 1984), 33. The authors state: "[Expert systems] are products designed to respond to demands from particular industries, and are not a test of anyone's theory of human cognitive processes." 164 For instance, see Robert B. McCall, Fundamental Statistics for Psychology, 46 for a formula with which to calculate the median of a set of numbers.
165 For a sorting algorithm that can be used to find the median of a set of numbers, see Niklaus Wirth, Algorithms + Data Structures = Programs (Englewood Cliffs, New Jersey: Prentice-Hall, Inc., 1976), 82-84. 166 See, for example, Edwina L. Rissland, "Argument Moves and Hypotheticals," in Computing Power and Legal Reasoning, ed. Charles Walter, (St. Paul, Minn.: West Publishing Co., 1985), 129-143; Heather, "A Demand-Driven Model;" Kevin D. Ashley, "Reasoning by Analogy," in Computing Power and Legal Reasoning, ed. Charles Walter, (St. Paul, Minn.: West Publishing Co., 1985), 105-127. Also see the many other papers included in the following books: Colin Campbell, ed., Data Processing and the Law, (London: Sweet and Maxwell, 1984); C. Ciampi, ed., Artificial Intelligence and Legal Information Systems, (New York: North-Holland Pub. Co., 1982); Antonio A. Martino and Fiorenza Socci Natali, eds., Automated Analysis of Legal Texts, (New York: Elsevier Science Publishing Company, Inc., 1986). 167 Susskind, Expert Systems in Law, 175-176. 168 Ibid., 12, 174. 169 Ibid., 59. 170 See ibid., 64-68. 171 Ibid., 9. 172 See for example QUIC/LAW, Kingston, Ont.: QL Systems Ltd. 173 See Hafner, An Information Retrieval System, 123. Chapter 5 CONCLUSION 174 Susskind, Expert Systems in Law, 53-55. 175 Spengler, "Machine-Made Justice," 43. 176 See Marshall S. Willick, "Professional Malpractice and the Unauthorized Practice of Professions," in Computing Power and Legal Reasoning, ed. Charles Walter, (St. Paul, Minn.: West Publishing Co., 1985), 841-845. 177 See ibid., 829-831, 833-841. 178 Samek, "Language, Communication, Computers and the Law," 210. 179 United Kingdom, Hansard Parliamentary Debates (House of Lords) 5th ser., vol. 273, 1966, col. 721. BIBLIOGRAPHY BOOKS Binder, David, and Paul Bergman. Fact Investigation: From Hypothesis to Proof. St. Paul, Minn.: West Publishing Co., 1984. Bing, Jon, and T. Harvold. Legal Decisions and Information Systems.
Oslo, Norway: Universitetsforlaget, 1977. Born, R., ed. Artificial Intelligence: The Case Against. London and Sydney: Croom Helm, 1987. Buchanan, Bruce G., and E. H. Shortliffe, eds. Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project. Menlo Park, California: Addison-Wesley Publishing Co., 1985. Campbell, Colin, ed. Data Processing and the Law. London: Sweet & Maxwell, 1984. Ciampi, C., ed. Artificial Intelligence and Legal Information Systems. New York: North-Holland Pub. Co., 1982. Coval, S. C., and J. C. Smith. Law and its Presuppositions. London: Routledge & Kegan Paul, 1986. Cross, Sir Rupert, and Colin Tapper. Cross on Evidence, 6th ed. London: Butterworths, 1985. Dworkin, Ronald. Taking Rights Seriously. Cambridge, Mass.: Harvard University Press, 1977. Ewart, J. Douglas. Documentary Evidence in Canada. Toronto: Carswell Legal Publications, 1984. Gardner, Anne v. d. L. An Artificial Intelligence Approach to Legal Reasoning. Cambridge, Mass.: The MIT Press, 1987. Hafner, Carole D. An Information Retrieval System Based on a Computer Model of Legal Knowledge. Ann Arbor, Mich.: UMI Research Press, 1981. Harmon, Paul, and David King. Expert Systems: Artificial Intelligence in Business. New York: John Wiley & Sons, 1985. Harris, J. W. Legal Philosophies. London: Butterworths, 1980. Hart, H. L. A. Essays in Jurisprudence and Philosophy. Oxford: Clarendon Press, 1983. Hart, H. L. A. The Concept of Law. Oxford: Clarendon Press, 1961. Levi, Edward H. An Introduction to Legal Reasoning. Chicago: University of Chicago Press, 1948. MacCormick, Neil. H. L. A. Hart. London: Edward Arnold (Publishers) Ltd., 1981. MacEllven, Douglass T. Legal Research Handbook, 2nd ed. Toronto: Butterworths, 1986. Martino, Antonio A., and Fiorenza Socci Natali, eds. Automated Analysis of Legal Texts: Logic, Informatics, Law. New York: Elsevier Science Publishing Company, Inc., 1986. McCall, R. B. Fundamental Statistics for Psychology, 2nd ed.
New York: Harcourt Brace Jovanovich, Inc., 1975. Minsky, Marvin. The Society of Mind. New York: Simon and Schuster, 1986. Negoita, Constantin V. Expert Systems and Fuzzy Systems. Menlo Park, Calif.: The Benjamin/Cummings Publishing Company, Inc., 1985. Niblett, Bryan, ed. Computer Science and Law. Cambridge, England: Cambridge University Press, 1980. Perelman, Chaim. Justice, Law and Argument. Dordrecht, Holland: D. Reidel Publishing Company, 1980. Smith, J. C. Legal Obligation. Toronto, Ont.: University of Toronto Press, 1976. Summers, Robert S. Lon L. Fuller. London: Edward Arnold, 1984. Susskind, Richard E. Expert Systems in Law: A Jurisprudential Inquiry. Oxford: Clarendon Press, 1987. Thayer, James B. Preliminary Treatise on Evidence at the Common Law. Boston: Little, Brown and Company, 1898. Uniform Law Conference of Canada. Report of the Federal/Provincial Task Force on Uniform Rules of Evidence. Toronto: Carswell Legal Publications, 1982. Walter, Charles, ed. Computing Power and Legal Reasoning. St. Paul, Minn.: West Publishing Co., 1985. Waterman, Don A. A Guide to Expert Systems. Don Mills, Ontario: Addison-Wesley Publishing Company, 1986. Wigmore, John H. Evidence in Trials at Common Law, vol. 5. Revised by J. H. Chadbourn. Boston: Little, Brown and Company, 1974. Wirth, Niklaus. Algorithms + Data Structures = Programs. Englewood Cliffs, New Jersey: Prentice-Hall, Inc., 1976. ARTICLES Ashley, Kevin D. "Reasoning by Analogy: A Survey of Selected AI Research with Implications for Legal Expert Systems." In Computing Power and Legal Reasoning, edited by Charles Walter, 105-127. St. Paul, Minn.: West Publishing Co., 1985. Bechtler, T. W. "American Legal Realism Revaluated." In Law in a Social Context, edited by T. W. Bechtler, 5-48. The Netherlands: Kluwer, 1978. Bing, Jon. "Legal Norms, Discretionary Rules and Computer Programs." In Computer Science and Law, edited by Bryan Niblett, 119-136. Cambridge, England: Cambridge University Press, 1980.
Buchanan, Bruce G., and Thomas E. Headrick. "Some Speculation About Artificial Intelligence and Legal Reasoning." Stanford Law Review 23 (1970): 40-62. Coval, S. C. and J. C. Smith. "Rights, Goals, and Hard Cases." Law and Philosophy 1 (1982): 451-480. David, Aurel. "Heuristics." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 63-68. New York: Elsevier Scientific Publications Co., 1986. Davis, Randall, and Jonathan J. King. "The Origin of Rule-Based Systems in AI." In Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, edited by B. G. Buchanan and E. H. Shortliffe, 20-52. Don Mills, Ont.: Addison-Wesley Publishing Company, 1984. Deedman, G. C. "Building Rule-Based Expert Systems in Case-Based Law." LL.M. Thesis, University of British Columbia, 1987. Dyer, M. G., and Margot Flowers. "Toward Automating Legal Expertise." In Computing Power and Legal Reasoning, edited by Charles Walter, 49-68. St. Paul, Minn.: West Publishing Co., 1985. Friedman, Richard D. "Route Analysis of Credibility and Hearsay." Yale Law Journal 96 (1987): 667-742. Fuller, Lon L. "Positivism and Fidelity to Law - A Reply to Professor Hart." Harvard Law Review 71 (1958): 630-672. Gardner, Anne v. d. L. "Overview of an Artificial Intelligence Approach to Legal Reasoning." In Computing Power and Legal Reasoning, edited by Charles Walter, 247-274. St. Paul, Minn.: West Publishing Co., 1985. Goebel, Jurgen G., and Reinhard Schmalz. "Problems of Applying Legal Expert System in Legal Practice." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 613-623. New York: Elsevier Science Publication Co., 1986. Goodhart, Arthur. "Determining the Ratio Decidendi of a Case." Yale Law Journal 40 (1930): 161-183. Gray, Grayfred. "Workshop Report." In Computing Power and Legal Reasoning, edited by Charles Walter, 621-626. St. Paul, Minn.: West Publishing Co., 1985. Hart, H. L. A.
"Positivism and the Separation of Law and Morals." In Essays in Jurisprudence and Philosophy, 49-87. Oxford: Clarendon Press, 1983. Heather, Michael A. "Future Generation Computer Systems in the Service of the Law." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 643-660. New York: Elsevier Science Publishing Co., 1986. . "A Demand-Driven Model for a Half-Intelligent System." In Computing Power and Legal Reasoning, edited by Charles Walter, 69-103. St. Paul, Minn.: West Publishing Co., 1985. Hermann, Donald H. J. "Legal Reasoning as Argumentation." Northern Kentucky Law Review 12 (1985): 467-510. Hoffman, Robert R. "The Problem of Extracting the Knowledge of Experts from the Perspective of Experimental Psychology." Al Magazine Summer (1987): 53-67. Liebowitz, J., and J. S. Zeide. "EVIDENT: An Expert System Prototype for Helping the Law Student Learn Admissibility of Evidence Under the Federal Rules." Computers in Education 11 (1987) 113 - 120. McCarty, L. T. "Intelligent Legal Information Systems: Problems and Prospects." In Data Processing and the Law, edited by Colin Campbell, 125-151. London: Sweet and Maxwell, 1984. . "The TAXMAN Project: Towards a Cognitive Theory of Legal Argument." In Computer Science and Law, edited by Bryan Niblett, 23-43. Cambridge, England: Cambridge University Press, 1980. . "Reflections on TAXMAN: An Experiment in Artificial Intelligence and Legal Reasoning." Harvard Law Review 90 (1977): 837 - 893. Note, "The Theoretical Foundation of the Hearsay Rules." Harvard Law Review 93 (1980): 1786-1815. Peterson, M. A., and D. A. Waterman. "An Expert Systems Approach to Evaluating Product Liability Cases." In Computing Power and Legal Reasoning, edited by Charles Walter, 627-659. St. Paul, Minn.: West Publishing Co., 1985. Philipps, L. "Using an Expert System in Testing Legal Rules." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 703 710. 
New York: Elsevier Scientific Publications Co., 1986.
Prerau, David S. "Knowledge Acquisition in the Development of a Large Expert System." AI Magazine (Summer 1987): 43-51.
Reisinger, L. "Approach to a Theory of 'Soft Algorithms'." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 761-722. New York: Elsevier Scientific Publications Co., 1986.
Rissland, Edwina L. "Argument Moves and Hypotheticals." In Computing Power and Legal Reasoning, edited by Charles Walter, 129-143. St. Paul, Minn.: West Publishing Co., 1985.
Samek, R. S. "Language, Communication, Computers and the Law." Dalhousie Law Journal 9 (1985): 196-227.
Smith, J. C., and G. C. Deedman. "The Application of Expert Systems Technology to Case-Based Law." In Proceedings of the First International Conference on Artificial Intelligence and Law, 84-93. Boston, Mass.: The Center for Law and Computer Science, Northeastern University, 1987.
Spengler, Joseph J. "Machine-Made Justice: Some Implications." In Jurimetrics, edited by Hans W. Baade, 36-52. New York: Basic Books, Inc., 1963.
Sprowl, James A. "The Automated Assembly of Legal Documents." In Computer Science and Law, edited by Bryan Niblett, 195-205. Cambridge, England: Cambridge University Press, 1980.
Stone, Julius. "The Ratio of the Ratio Decidendi." Modern Law Review 22 (1959): 597-620.
Summers, Robert S. "Pragmatic Instrumentalism in Twentieth Century American Legal Thought - A Synthesis and Critique of our Dominant General Theory About Law and its Use." Cornell Law Review 66 (1981): 861-948.
Summers, Robert S. "Two Types of Substantive Reasons: The Core of a Theory of Common Law Justification." Cornell Law Review 63 (1978): 707-788.
Susskind, Richard E. "Expert Systems in Law: A Jurisprudential Approach to Artificial Intelligence and Legal Reasoning." Modern Law Review 49 (1986): 168-194.
Thorne, J., and M. Ugarte. "Using Purposes of the Law in Legal Expert Systems." In Automated Analysis of Legal Texts, edited by A. A.
Martino and F. Socci Natali, 711-716. New York: Elsevier Scientific Publications Co., 1986.
Tribe, Laurence. "Triangulating Hearsay." Harvard Law Review 87 (1974): 957-974.
Twining, William. "Talk About Realism." New York University Law Review 60 (1985): 329-384.
United Kingdom, Hansard Parliamentary Debates (House of Lords), 5th ser., vol. 273, 1966.
Van Doren, J. W. "Impasse: Is There a Beyond?" Western State University Law Review 13 (1986): 493-512.
Walter, C. F. "Teleogenic Behavior Based on Granular Control." In Automated Analysis of Legal Texts, edited by A. A. Martino and F. Socci Natali, 717-733. New York: Elsevier Scientific Publications Co., 1986.
Willick, Marshall S. "Professional Malpractice and the Unauthorized Practice of Professions: Some Legal and Ethical Aspects of the Use of Computers as Decision-Aids." In Computing Power and Legal Reasoning, edited by Charles Walter, 817-863. St. Paul, Minn.: West Publishing Company, 1985.

COMPUTER PROGRAMS

dBase III Plus ver. 1.1. Torrance, California: Ashton-Tate.
Nervous Shock Advisor. Vancouver, B.C.: University of British Columbia, Law and Computers Project.
M.1. Palo Alto, Calif.: Teknowledge, Inc.
QUIC/LAW. Kingston, Ont.: QL Systems, Ltd.

GLOSSARY

Algorithm - a systematic method or means of solving a problem which guarantees a correct result.

Backward chaining (back chaining) - a control strategy in an expert system where the inference engine moves from consideration of conclusions to consideration of premisses. It begins by checking the top-level rule in the system to see if it is true. If the rule is not satisfied, then the premisses of that rule become the new goals and the system searches for the rules that contain those goals as conclusions. The process is repeated until a conclusion is satisfied (either by asking a user for information or retrieving a stored value).
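The backward-chaining strategy defined above can be sketched in a few lines of Python. This is an illustrative toy only, not the M.1 shell or the HRA knowledge base: the rule table and the 'prove' function are invented for the example, with goal names borrowed from the HRA diagrams and 'facts' standing in for values the user has already supplied.

```python
# Toy backward chainer (illustrative; names invented for this sketch).
# Each goal maps to a list of alternative premiss lists; a goal is
# satisfied if it is a stored fact, or if every premiss of some rule
# concluding that goal can itself be proved.
RULES = {
    "consultation-over": [["begin-consultation", "definition-applies",
                           "exception-applies"]],
    "definition-applies": [["hearsay"]],
    "hearsay": [["statement", "rule-applies"]],
}

def prove(goal, facts):
    """Back-chain from 'goal' toward premisses, as in the glossary entry."""
    if goal in facts:                       # a stored value satisfies the goal
        return True
    for premisses in RULES.get(goal, []):   # rules whose conclusion is the goal
        if all(prove(p, facts) for p in premisses):
            return True
    return False
```

For instance, prove("hearsay", {"statement", "rule-applies"}) succeeds because the premisses of the 'hearsay' rule are stored facts, while prove("hearsay", set()) fails. A real shell such as M.1 would ask the user a question at this point instead of simply failing.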
Consultation - the interaction of an expert system with a user from the first questions through to the solution given by the system.

Domain - the field of knowledge or area of expertise that is computerized in an expert system.

Expert system - an intelligent computer program that can solve problems which would require significant expertise for a human being to solve.

Flexibility - the ease with which a completed knowledge base can be changed to accommodate changes in the domain.

Forward chaining - a control strategy in an expert system where the inference engine begins by checking the premisses of rules. Where premisses are satisfied, conclusions can be drawn and new rules considered based on the new information available. The process continues until a goal is reached or there are no more rules to consider.

Heuristic - a rule of thumb or other simplification that limits search in large problem spaces. It helps a problem solver to select the most useful knowledge out of all the knowledge that she or he might bring to bear on the problem.

HRA - Hearsay Rule Advisor - an expert system that advises lawyers on admissibility of hearsay evidence in court, built in 1987 and 1988 by S. J. Blackman and M. T. MacCrimmon of the Law Faculty, University of British Columbia.

Inference engine - the part of the expert system that contains the inference and control strategies of the system. The inference engine interacts with the user both to ask for information and to provide conclusions, and searches through the knowledge base drawing inferences, satisfying goals and producing conclusions.

Interface - the portion of a computer program which interacts with a person or with external programs. Expert systems usually have an interface for the user and an interface for the knowledge engineer. They also may have interfaces to connect them with external files such as databases.
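The 'Forward chaining' entry above can likewise be sketched in Python. Again this is an invented toy, not shell code: each rule is a (premisses, conclusion) pair, and the set of facts grows until no rule can fire.

```python
# Toy forward chainer (illustrative; names invented for this sketch).
# Rules fire whenever all their premisses are present among the facts,
# adding their conclusion; the loop repeats until nothing new is derived.
RULES = [
    ({"statement", "rule-applies"}, "hearsay"),
    ({"hearsay"}, "definition-applies"),
]

def forward_chain(facts):
    """Derive every conclusion reachable from the starting facts."""
    facts = set(facts)
    changed = True
    while changed:                          # stop when no rule adds anything
        changed = False
        for premisses, conclusion in RULES:
            if premisses <= facts and conclusion not in facts:
                facts.add(conclusion)       # premisses satisfied: draw it
                changed = True
    return facts
```

Starting from {"statement", "rule-applies"}, the first rule adds "hearsay" and the second then adds "definition-applies". Note the contrast with the backward chainer: here inference is driven by the available data rather than by a top-level goal, which is why M.1 (and hence the HRA) uses backward chaining to keep its questions focused.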
Knowledge base - the portion of an expert system that consists of the expert's knowledge expressed as facts and rules. The rules are usually modus ponens or 'if x, then y' rules.

Knowledge engineer - the person who interviews the domain expert, organizes the knowledge obtained, and then creates the knowledge base for the expert system.

M.1 - a commercial expert system shell descended from MYCIN. It consists of a backward chaining inference engine and a user interface.

Meta-rule - a rule about rules, that is, a second or higher order rule which tells something about the first or lower order rules.

MYCIN - the most famous expert system to date. Its domain is bacterial infections of the blood. It was built at Stanford University between 1970 and 1980.

NSA - Nervous Shock Advisor - an expert system that advises lawyers about the law of nervous shock, built in 1986 and 1987 by G. C. Deedman and J. C. Smith of the Law Faculty, University of British Columbia.

Shell - a computer program, used in developing expert systems, which consists of only the inference engine and any necessary interfaces.

Transparency - the expert system's ability to explain its reasoning during a consultation with a user.

User - the person who consults the expert system for advice, also called the end-user.

APPENDIX A

Structure of the Hearsay Rule Advisor

Figures A-1 to A-8 show the structure of the Hearsay Rule Advisor. Names in boxes are expression names. The top-level goal is 'consultation-over.' The top-level rule indicates that premisses 'begin-consultation,' 'definition-applies,' and 'exception-applies' must be satisfied before a value can be determined for the top-level goal. 'Begin-consultation' is searched and this leads first to a display, then to a premiss of 'admissions.' A question is asked of the user to determine a value for the expression 'admissions.' Once satisfied, the system moves on to consider 'definition-applies.'
This sub-goal takes it to the premiss (new subgoal) 'hearsay,' which in turn leads to 'proposition,' and eventually to questions for the user. When that chain is satisfied, the system returns to 'hearsay' and checks for more unsatisfied premisses. It finds 'statement,' takes that as its next sub-goal, and then proceeds down the new chain. The whole system is searched in this way until the top-level goal is satisfied. The system moves through the diagrams from top to bottom, then left to right, and from the earlier diagram to the later one.

[Goal-tree diagrams not reproducible here; figure captions follow.]

Fig. A-1. Overview
Fig. A-2. Definition of Hearsay
Fig. A-3. Dying Declarations - event, perceive, believe
Fig. A-4. Dying Declarations - intend
Fig. A-5. Declarations Against Interest - event, perceive
Fig. A-6. Declarations Against Interest - believe, intend
Fig. A-7. Business Documents
Fig. A-8. Declarations in Course of Duty

APPENDIX B

Knowledge Base of the Hearsay Rule Advisor

/* THE HEARSAY RULE ADVISOR © 1988 S. J. Blackman and M. T. MacCrimmon */

/* BEGINNING OF THE KNOWLEDGE BASE */

/* This knowledge base decides whether or not the user's evidence is hearsay and whether an exception applies so that it will be admissible in court. The knowledge base is divided into four parts.
Part 1 contains the top-level rules that control the whole consultation. It also contains some introductory and concluding messages. Part 2 contains the rules and questions that ask the user about his/her evidence and make decisions about whether or not it is hearsay. The questions in this part of the knowledge base are arranged in alphabetical order by their expression names. Part 3 contains the high-level exception control rules, that is, those rules which decide which exception applies, if any. Part 4 contains the questions and rules that ask the user about the circumstances surrounding the creation of his/her piece of evidence in order to decide which exception, if any, applies. */

initialdata = [consultation-over].

/* *** PART 1 *** */

rule-def001: if display([nl, nl, '\t Welcome to the Hearsay Rule Adviser.', nl, nl, '\t This system will ask you questions about your ', nl, '\t evidence and then advise you whether or not it is ', nl, '\t hearsay. A hearsay statement is a statement by a ', nl, '\t person made other than while testifying as a ', nl, '\t witness at the proceeding that is offered in ', nl, '\t evidence to prove the truth of the matter asserted.', nl, nl, '**This system does not deal with any other exclusionary rule such as privilege or the opinion rule or character evidence. It also does not address the issue of whether a voir dire is necessary to determine the admissibility of a statement.**', nl, nl, '**The traditional common law exceptions to the rule against hearsay that are presently included in this system are the dying declarations exception, the declarations against proprietary or pecuniary interest exception, the declarations in course of duty exception, and the business documents exception. The statutory business documents exceptions are also included. More material is constantly being added.**', nl, nl]) then begin-consultation.
rule-def100: if begin-consultation and admissions = yes and display([nl, 'This system does not deal with admissions and confessions. You might wish to start your research with Kaufmann on Confessions or Sopinka and Lederman on Admissions.', nl, 'The consultation is over.', nl, nl, 'Thank you for consulting the hearsay rule adviser.', nl, nl]) then consultation-over.

explanation(rule-def100) = [nl, 'This system does not deal with admissions and confessions by parties. Note that vicarious statements include statements by employees, agents and partners. However, the declarations in course of duty exception to the hearsay rule is included in this system.', nl].

rule-def002: if begin-consultation and admissions = no and advice-given and display([nl, 'The consultation is over.', nl, nl, 'Thank you for consulting the hearsay rule adviser.', nl, nl]) then consultation-over.

/* ADMISSIONS */

question(admissions) = 'Is this evidence a statement by a party (including vicarious statements) offered by an adverse party or a statement by the accused offered by the Crown?'.
legalvals(admissions) = [yes, no].
automaticmenu(admissions).

/* ADVICE-GIVEN */

rule-def003: if definition-applies and exception-applies is known then advice-given.
rule-def004: if definition-applies = no then advice-given.
rule-def005: if definition-applies = maybe then advice-given.

/* *** PART 2 - DEFINING HEARSAY *** */

/* This sequence of rules decides whether or not the user has a hearsay problem. It also contains the high level rules which satisfy the expressions definition-applies and exception-applies (see rule-def003 above). */

/* ASSERTS-A-FACT */

rule-def006: if evidence = oral or evidence = written or evidence = conduct then asserts-a-fact = yes.
rule-def007: if evidence = non-assertion then asserts-a-fact = no.

/* DECLARANT-TESTIFY */

question(declarant-testify(NAME)) = ['Are you able to produce ', NAME, ' in court to testify?'].
legalvals(declarant-testify(NAME)) = [yes,no].
automaticmenu(declarant-testify(NAME)).

rule-def008: if sex = SEX and name = NAME and declarant-testify(NAME) = yes then declarant-testify = yes.

explanation(rule-def008) = [nl, 'For many of the common law exceptions to the rule against hearsay it is crucial that the declarant not be available to testify in court. In-court testimony is usually preferable to hearsay evidence.', nl].

rule-def009: if sex = SEX and name = NAME and declarant-testify(NAME) = no then declarant-testify = no.

/* DEFINITION-APPLIES */

rule-def010: if hearsay then definition-applies.
rule-def011: if hearsay = no or relevance-problem then definition-applies = no.
rule-def012: if hearsay = maybe then definition-applies = maybe.

/* EXCEPTION-APPLIES */

rule-def013: if exception is known and not cached(exception = none) then exception-applies.
rule-def014: if exception = none then exception-applies = no.

/* HEARSAY */

rule-def015: if proposition = no then hearsay.

rule-def016: if statement is known and non-hearsay = no and rule-applies and display([nl, 'Your evidence is hearsay and is not admissible unless it fits within an exception to the hearsay rule.', nl, nl]) then hearsay.

rule-def017: if statement is known and non-hearsay = yes and rule-applies and display([nl, 'Your evidence is admissible only for the non-hearsay purpose you have indicated. It is not admissible for the hearsay purpose unless it fits within an exception to the hearsay rule.', nl, nl]) then hearsay.

rule-def018: if statement is known and non-hearsay = yes and rule-applies and relevant-for-fact-implied and display([nl, 'Your evidence is admissible only for the non-hearsay purpose you have indicated. It is not admissible for the hearsay purpose unless it fits within an exception to the hearsay rule. Note that an assertion used for an implication that can be drawn from it may also be hearsay. Please tune in later for further information on this.', nl, nl]) then hearsay.
rule-def019: if statement is known and non-hearsay = no and rule-applies and relevant-for-fact-implied and display([nl, 'Your evidence is hearsay and is not admissible unless it fits within an exception to the hearsay rule. Note that an assertion used for an implication which may be drawn from it may also be hearsay. Please be patient. The parts of the knowledge base that will deal with implications are still being built.', nl, nl]) then hearsay.

rule-def020: if asserts-a-fact = no and non-hearsay = no and display([nl, 'Non-assertive conduct may be hearsay. This part of the knowledge base is still under construction.', nl, nl]) then hearsay = maybe.

rule-def021: if asserts-a-fact = no and non-hearsay = yes and display([nl, 'Your evidence is admissible for the non-hearsay purpose you have indicated. Note that non-assertive conduct may still be hearsay. This part of the knowledge base is being built. Tune in at a later date.', nl, nl]) then hearsay = maybe.

rule-def022: if statement is known and non-hearsay = no and not cached(rule-applies) and relevant-for-fact-implied and display([nl, 'Your evidence may be admissible but note that an assertion offered for its implication may be hearsay. This part of the knowledge base is still under construction.', nl, nl]) then hearsay = maybe.

rule-def023: if statement is known and non-hearsay = yes and not cached(rule-applies) and relevant-for-fact-implied = yes and display([nl, 'Your evidence is admissible for the non-hearsay purpose you have indicated. Note that an assertion offered for its implication may be hearsay. This part of the knowledge base is still under construction.', nl, nl]) then hearsay = maybe.

rule-def024: if statement is known and non-hearsay = yes and not cached(rule-applies) and display([nl, 'Your evidence is admissible for the non-hearsay purpose you have indicated.', nl, nl]) then hearsay = no.
/* HEARSAY-PROBLEM */

question(hearsay-problem) = [nl, 'Would you like me to help you decide whether or not you have a hearsay problem? If not, I will assume that you do have a hearsay problem and will proceed to examine the exceptions to the hearsay rule.'].
legalvals(hearsay-problem) = [yes,no].
automaticmenu(hearsay-problem).

/* MAKER-UNKNOWN */

question(maker-unknown) = 'Was the statement made by more than one person or is the maker of the statement unknown?'.
legalvals(maker-unknown) = [one-person, more-than-one-person, maker-unknown].
automaticmenu(maker-unknown).
enumeratedanswers(maker-unknown).

/* NON-HEARSAY */

rule-def025: if relevant-to-show-made = yes then non-hearsay = yes.

explanation(rule-def025) = [nl, 'The concept of relevance is central to a determination of whether or not you wish to use your evidence for a hearsay purpose, therefore I am trying to find out just how you will be using this evidence in court.', nl].

rule-def026: if relevant-to-show-made = no then non-hearsay = no.

/* PROPOSITION */

rule-def027: if sex = SEX and name = NAME and hearsay-problem = yes and display([nl, 'Please think about the proposition you want to establish in your case. Consider what it is that you want to prove.', nl, nl]) then proposition.

explanation(rule-def027) = [nl, 'I am prepared to ask you questions to help you determine whether or not you wish to use your evidence for a hearsay purpose. Note that all you may need to do is use it for a non-hearsay purpose and thus get it before the court. However, if you do not wish to consult this part of the program, I am prepared to proceed immediately with the exceptions to the rule against hearsay.', nl].

rule-def028: if sex = SEX and name = NAME and hearsay-problem = no and display([nl, 'The next step is to determine whether an exception to the hearsay rule will apply in your case and allow your evidence to be admitted.', nl, nl]) and evidence is known and declarant-testify is known then proposition = no.
/* RELEVANCE-PROBLEM */

rule-def029: if statement is known and relevant-to-show-made = no and relevant-for-fact-asserted = no and relevant-for-fact-implied = no and display([nl, 'The way you have answered these questions indicates to me that your evidence may not be relevant at all, in which case you will not be using it. Please rethink your problem.', nl, nl]) then relevance-problem.

/* RELEVANT-FOR-FACT-ASSERTED */

question(relevant-for-fact-asserted) = 'To prove the proposition, do you need to use the statement for the truth of its contents?'.
legalvals(relevant-for-fact-asserted) = [yes,no].
automaticmenu(relevant-for-fact-asserted).

/* RELEVANT-FOR-FACT-IMPLIED */

question(relevant-for-fact-implied) = 'As proof of your proposition, do you need to use the statement for an inference that can be drawn from it?'.
legalvals(relevant-for-fact-implied) = [yes,no].
automaticmenu(relevant-for-fact-implied).

/* RELEVANT-TO-SHOW-MADE */

question(relevant-to-show-made) = 'To prove the proposition, do you wish to use the statement to show that it was made (a non-hearsay purpose)? (Select the 3rd option to see examples of cases where evidence that would otherwise be hearsay is admissible for this non-hearsay purpose.)'.
legalvals(relevant-to-show-made) = [yes,no,examples].
automaticmenu(relevant-to-show-made).
enumeratedanswers(relevant-to-show-made).

presupposition(examples-cycle-N) = (relevant-to-show-made = examples).

rule-def030: if cycle-1-is-complete and repeat-question = yes then non-hearsay = yes.
rule-def031: if cycle-1-is-complete and repeat-question = no then non-hearsay = no.
rule-def032: if examples-over-cycle-N and user-happy-with-selection-N then cycle-N-is-complete.
rule-def033: if nextcycle-to-M = N and cycle-N-is-complete then cycle-M-is-complete.
rule-def034: if M+1 = N then nextcycle-to-M = N.
rule-def035: if M>1 and M-1 = N then previouscycle-to-M = N.

multivalued(examples-cycle-N).

question(examples-cycle-N) = 'Choose one of the following: 1.
defamation 2. state of mind of recipient of statement 3. state of mind of declarant of statement 4. words which have legal significance in themselves 5. words clarifying ambiguous actions 6. conduct that has legal significance 7. conduct that is relevant to the issues in the case'.

legalvals(examples-cycle-N) = number.

rule-def036: if examples-cycle-N = 4 and display([ 'Sandhu is suing Black for breach of contract to supply computer disks. Black denies that he agreed to supply the computer disks. Sandhu calls Baker as a witness. She will testify that in response to Sandhu"s statement, "I would like 100 computer disks at $1.00 a disk", Black said, "O.K. I will deliver them next Thursday." ', nl, nl, 'Under the substantive law of contract, the very speaking of the words constituting an offer and an acceptance has legal significance. When the statements are offered by Sandhu to prove the existence of an oral contract, the statements are not hearsay.', nl, nl]) then examples-over-cycle-N.

rule-def037: if examples-cycle-N = 5 and display([ 'A witness proposes to testify that she saw Auntie Mame give a diamond necklace to Lisa last year, saying, "I would like you to have this before I die." At trial, the issue is whether Auntie Mame made a gift of the necklace to her niece Lisa.', nl, nl, 'In this case, proof of a gift requires that Auntie Mame hand the necklace to Lisa with the intention of transferring title to her. The issue, once the act of handing over the property is shown, is whether the owner of the property intended to transfer title. The statement explains the legal significance of the ambiguous act of handing over the property. (Auntie Mame could have been only lending the necklace.)', nl, nl]) then examples-over-cycle-N.

rule-def038: if examples-cycle-N = 3 and display([ 'At a murder trial one of the issues is whether the deceased committed suicide.
The Crown calls a friend of the deceased who testifies that two days before the deceased died, he said, "I am really looking forward to my trip to Hawaii." ', nl, nl, 'Evidence of a person"s happy state of mind two days before he died is the basis of an inference that the happy state of mind continued until the time of his death. Evidence of a happy state of mind at the time of death makes the proposition that the deceased committed suicide less likely than it would be in the absence of the evidence. The declarant/deceased is not asserting that he is happy or that he does not intend to commit suicide. The statement implies the declarant"s state of mind and is not offered for the truth of the matter asserted. Statements which imply the declarant"s existing (as compared to past) state of mind, when offered as evidence of that state of mind, are not hearsay.', nl, nl]) then examples-over-cycle-N.

rule-def039: if examples-cycle-N = 2 and display([ 'Smith is charged with the murder of Chou. Chou dies as a result of a stab wound inflicted by Smith during a fist fight in Smith"s home. Smith and Chou are business partners. Baker testifies that two days before the fight, Chou yelled at Smith: "You have been cheating me. I will get you for this." Smith"s counsel is advancing an argument of self defence on behalf of Smith.', nl, nl, 'The statement is hearsay if it is offered for the truth of Chou"s assertion that Smith is cheating him. But the statement is not hearsay if it is offered as the basis of an inference that Smith was afraid of Chou (an instance of "effect upon hearer" use of the statement). Smith"s fear of Chou may be inferred from the fact that Chou threatened Smith. The truth of the inference of Smith"s fear of Chou does not depend on the truth of the fact asserted.', nl, nl]) then examples-over-cycle-N.

rule-def040: if (examples-cycle-N = 1 or examples-cycle-N = 6 or examples-cycle-N = 7) and display([ 'Please be patient.
I do not yet have any examples to show you.', nl, nl]) then examples-over-cycle-N.

question(choose-another-cycle-N) = 'Do you wish to choose another example from the list?'.
legalvals(choose-another-cycle-N) = [yes,no].
automaticmenu(choose-another-cycle-N).

rule-def041: if choose-another-cycle-N = no then user-happy-with-selection-N.

/* REPEAT-THE-QUESTION */

question(repeat-question) = 'Do you wish to use the statement to show that it was made (a non-hearsay purpose)?'.
legalvals(repeat-question) = [yes,no].
automaticmenu(repeat-question).

/* RULE-APPLIES */

rule-def042: if relevant-for-fact-asserted = yes and relevant-for-fact-implied is known and display([nl, 'I conclude that you wish to use your evidence for a hearsay purpose.', nl, nl, 'A hearsay statement is a statement offered in evidence to prove the truth of the matter asserted but made otherwise than in testimony at the proceeding in which it is offered. - Uniform Evidence Act, Report of the Federal-Provincial Task Force on Uniform Rules of Evidence, 1982.', nl, nl]) then rule-applies.

explanation(rule-def042) = [nl, 'The concept of relevance is central to a determination of whether or not you wish to use your evidence for a hearsay purpose, therefore I am trying to find out just how you will be using this evidence in court.', nl].

/* SEX-AND-NAME */

rule-sex001: if maker-unknown = neither and ask-sex = SX and SX == he then sex = [he,him].
rule-sex002: if maker-unknown = neither and SX == she then sex = [she,her].

rule-sex003: if maker-unknown = several-declarants and display([nl, 'From this point on I shall refer to the makers of the evidence as "the declarants."', nl]) then sex = [they,them].
question(ask-sex) = Ts the person who made the statement a woman or a man?'. legalvals(ask-sex) = [she,he]. automaticmenu(ask-sex). enumeratedanswers(ask-sex). rule-sex005: if maker-unknown = more-than-one-person then name = 'the declarants'. rule-sex006: if maker-unknown = maker-unknown then name = 'the declarant'. question(name) = ['What name should I use to refer to this person? (Please enclose it in quotes)']. legalvals(name) = LIST. /* STATEMENT */ rule-def043: if proposition and asserts-a-fact = yes and declarant-testify is known and display([nl, nl, 'Recall the proposition you are trying to establish and please keep it in mind when you answer the next questions.', nl, nl]) then statement. /* EVIDENCE */ presupposition(type-of -evidence) = (hearsay-problem = yes). question(type-of-evidence) = 'What type of evidence do you have? 1. oral assertion 2. written or recorded assertion including computer printouts and tape recordings 3. assertive conduct 4. non-assertive oral, written or recorded statement or conduct'. 129 legalvals(type-of-evidence) = number. presupposition(type-of-evidence-for-exceptions) = (hearsay-problem = no). question(type-of-evidence-for-exceptions) = 'What type of evidence do you have? 1. oral 2. written or recorded including computer printouts and tape recordings 3. conduct'. legalvals(type-of-evidence-for-exceptions) = number. rule-def044: if type-of-evidence = 1 or type-of-evidence-for-exceptions = 1 then evidence = oral. rule-def045: if type-of-evidence = 2 or type-of-evidence-for-exceptions = 2 then evidence = written. rule-def046: if type-of-evidence = 3 or type-of-evidence-for-exceptions = 3 then evidence = conduct. rule-def047: if type-of-evidence = 4 then evidence = non-assertion. 130 p *** PARX 3 EXCEPTION CONTROL RULES *** */ /* The following rules are the control rules which decide whether an exception applies and if so which one it is. */ /* EXCEPTION */ multivalued(exception). 
rule-excOOl: if declarant-dead and event-dying-declaration and perceive-dying-declaration and believe-dying-declaration and intend-dying-declaration and display([nl, T have determined that the dying declaration exception will apply in your case.', nl,nl]) then exception = dying-declaration. rule-exc002: if declarant-dead and not(event-dying-declaration) and event-against-interest and perceive-against-interest and believe-against-interest and intend-against-interest and display([nl, T have decided that the declarations against pecuniary or proprietary interest will apply in your case.', nl,nl]) then exception = against-interest. rule-exc003: if not(event-dying-declaration) and event-business-docs and perceive-business-docs-yes-no and believe-business-docs-yes and intend-business-docs-no-'BC-48' and display([nl, T have decided that your evidence is admissible under the business documents exception as implemented by section 48 of the British Columbia Evidence act.', nl,nl]) then exception = business-documentsl. rule-exc004: if not(event-dying-declaration) and event-business-docs and perceive-business-docs-no-no and believe-business-docs-no and intend-business-docs-yes-'Canada-30' and 131 display([nl, T have decided that your evidence is admissible under the business documents exception as implemented by section 30 of the Canada Evidence act.', nl,nl]) then exception = business-documents2. rule-exc005: if not(event-dying-declaration) and event-business-docs and perceive-business-docs-yes-yes and believe-business-docs-yes and intend-business-docs-no-none and display([nl, T have decided that your evidence is admissible under the common law business documents exception as set out in Ares v. Venner.', nl,nl]) then exception = business-documents3. 
rule-exc006: if declarant-dead and
    not(event-dying-declaration) and
    (evidence = oral or evidence = conduct) and
    event-duty and
    perceive-duty and
    believe-duty and
    intend-duty and
    display([nl, 'I have decided that your evidence is admissible under the declarations in course of duty exception.', nl, nl])
    then exception = duty-oral-conduct.

rule-exc007: if declarant-dead and
    not(event-dying-declaration) and
    evidence = written and
    event-duty and
    (perceive-business-docs-yes-no or perceive-business-docs-yes-yes or perceive-business-docs-no-no) and
    believe-duty and
    intend-duty and
    display([nl, 'I have decided that your evidence is admissible under the declarations in course of duty exception.', nl, nl])
    then exception = duty-written.

rule-exc008: if cached(exception is unknown) and
    display([nl, 'None of the exceptions that are available so far in this system apply in your case. Please be patient, we are adding material all the time.', nl,nl])
    then exception = none.

/*rule-exc008: if declarant-dead and
    cached(exception is unknown) and
    (event-dying-declaration and
        (not(perceive-dying-declaration) or not(believe-dying-declaration) or not(intend-dying-declaration))) or
    (event-duty and
        (not(perceive-duty) or not(believe-duty) or not(intend-duty))) or
    (event-against-interest and
        (not(perceive-against-interest) or not(believe-against-interest) or not(intend-against-interest))) and
    display([nl, 'In my opinion none of the exceptions that are so far available in this system apply in your case. Please be patient, we are adding material all the time.', nl,nl])
    then exception = none.*/

/* *** PART 4 EXCEPTION RULES AND QUESTIONS *** */

/* The following rules and questions determine the information needed to satisfy the exception control rules in part 3 immediately above. Each exception contains four elements that must be satisfied in order for the exception to apply. These four elements are: event, perception, belief and intention.
This matches a theory worked out by the authors of this knowledge base to explain the exceptions to the rule against hearsay. The theory is set out in the text of this LL.M. thesis. The rules which determine these four elements are set out immediately below, followed by all the rules and questions which satisfy these elements, arranged in alphabetical order. */

/* EVENT */

rule-event001: if sex = SEX and
    name = NAME and
    type-of-trial = criminal and
    declarant-dead and
    homicide(NAME) and
    declarant-dying and
    concerns-cause-of-death(NAME)
    then event-dying-declaration.

rule-event002: if declarant-dead and
    in-course-of-duty
    then event-duty.

rule-event003: if declarant-dead and
    against-interest and
    facts
    then event-against-interest.

rule-event004: if evidence = written and
    sex = SEX and
    name = NAME and
    normal-function(SEX,NAME)
    then event-business-docs.

/* PERCEIVE */

rule-perceive001: if sex = SEX and
    name = NAME and
    event-dying-declaration and
    competent = yes and
    facts and
    personal-knowledge-dydec(SEX,NAME) = no
    then perceive-dying-declaration.

rule-perceive002: if sex = SEX and
    name = NAME and
    event-dying-declaration and
    (competent = no or facts = no or personal-knowledge-dydec(SEX,NAME) = yes)
    then perceive-not-dying-declaration.

rule-perceive003: if sex = SEX and
    name = NAME and
    event-duty and
    personal-knowledge(SEX,NAME) and
    regular-part-of-job and
    experience is known and
    others-relied is known
    then perceive-duty.

rule-perceive004: if sex = SEX and
    name = NAME and
    event-duty and
    (personal-knowledge(SEX,NAME) = no or regular-part-of-job = no)
    then perceive-not-duty.

rule-perceive005: if sex = SEX and
    name = NAME and
    event-against-interest and
    facts and
    personal-knowledge(SEX,NAME) and
    loss-at-same-time(NAME) and
    declarant-knew(NAME)
    then perceive-against-interest.
rule-perceive006: if sex = SEX and
    name = NAME and
    event-against-interest and
    (facts = no or personal-knowledge(SEX,NAME) = no or loss-at-same-time(NAME) = no or declarant-knew(NAME) = no)
    then perceive-not-against-interest.

rule-perceive007: if sex = SEX and
    name = NAME and
    (NEED FTR == no or function-to-record(NAME)) and
    answer-personal-knowledge-NEED PK and
    competent-to-testify(NAME) and
    relied-upon is known
    then perceive-business-docs-NEED FTR-NEED PK.

/* BELIEVE */

rule-believe001: if sex = SEX and
    name = NAME and
    event-dying-declaration and
    declarant-dying
    then believe-dying-declaration.

rule-believe003: if event-duty and
    contemporaneous
    then believe-duty.

rule-believe004: if event-duty and
    contemporaneous = no
    then believe-not-duty.

rule-believe005: if sex = SEX and
    name = NAME and
    loss-at-same-time(NAME) and
    declarant-knew(NAME)
    then believe-against-interest.

rule-believe006: if sex = SEX and
    name = NAME and
    (loss-at-same-time(NAME) = no or declarant-knew(NAME) = no)
    then believe-not-against-interest.

rule-believe007: if NEED C == no or contemporaneous
    then believe-business-docs-NEED C.

/* INTEND */

rule-intend001: if sex = SEX and
    name = NAME and
    event-dying-declaration and
    (settled-hopeless-expectation or settled-hopeless-expectation = maybe)
    then intend-dying-declaration.

rule-intend002: if sex = SEX and
    name = NAME and
    event-dying-declaration and
    settled-hopeless-expectation = no
    then intend-not-dying-declaration.

rule-intend003: if sex = SEX and
    name = NAME and
    event-duty and
    others-relied is known and
    regular-part-of-job is known and
    tampering is sought and
    not cached(tampering = yes) and
    motive-to-misrepresent(NAME) = no
    then intend-duty.

rule-intend004: if sex = SEX and
    name = NAME and
    event-duty and
    (cached(tampering) or cached(motive-to-misrepresent(NAME)))
    then intend-not-duty.
rule-intend005: if sex = SEX and
    name = NAME and
    event-against-interest and
    declarant-knew(NAME) and
    (collateral-part(NAME) = no or on-the-whole = injurious)
    then intend-against-interest.

rule-intend006: if sex = SEX and
    name = NAME and
    event-against-interest and
    (declarant-knew(NAME) or collateral-part(NAME) = no or cached(on-the-whole = injurious))
    then intend-not-against-interest.

rule-intend007: if before-controversy and
    sex = SEX and
    name = NAME and
    answer-compiling-intent-NEED CI and
    (NEEDED EA == none or (evidence-act = EA and NEEDED EA == EA))
    then intend-business-docs-NEED CI-NEEDED EA.

/* **** */

/* The following questions all pertain solely to the declarations against interest exception. */

/* AGAINST-INTEREST */

question(existing-legal-obligation(NAME)) = ['Does the statement acknowledge an existing legal obligation owed by ', NAME, '? (eg. an i.o.u. or a statement of partnership in an insolvent firm)'].
legalvals(existing-legal-obligation(NAME)) = [yes, no].
automaticmenu(existing-legal-obligation(NAME)).

question(forego-legal-benefit(NAME)) = ['Does the statement forego a legal benefit that ', NAME, ' might have sought? (eg. it forgives a debt)'].
legalvals(forego-legal-benefit(NAME)) = [yes, no].
automaticmenu(forego-legal-benefit(NAME)).

question(limited-interest-in-property(NAME)) = ['Does the statement acknowledge a limited interest in property under circumstances where it appears ', NAME, ' had a greater interest?'].
legalvals(limited-interest-in-property(NAME)) = [yes, no].
automaticmenu(limited-interest-in-property(NAME)).

question(expose-to-liability(NAME)) = ['Could the statement expose ', NAME, ' to liability in a possible legal action?'].
legalvals(expose-to-liability(NAME)) = [yes, no].
automaticmenu(expose-to-liability(NAME)).

question(other-financial-loss(NAME)) = ['Could the statement result in any other kind of financial loss to ', NAME, '?'].
legalvals(other-financial-loss(NAME)) = [yes, no].
automaticmenu(other-financial-loss(NAME)).
rule-againstint001: if sex = SEX and
    name = NAME and
    (existing-legal-obligation(NAME) or
     forego-legal-benefit(NAME) or
     limited-interest-in-property(NAME) or
     expose-to-liability(NAME) or
     other-financial-loss(NAME))
    then against-interest.

rule-againstint002: if sex = SEX and
    name = NAME and
    existing-legal-obligation(NAME) = no and
    forego-legal-benefit(NAME) = no and
    limited-interest-in-property(NAME) = no and
    expose-to-liability(NAME) = no and
    other-financial-loss(NAME) = no
    then against-interest = no.

/* COLLATERAL-PART */

question(collateral-part(NAME)) = ['Is part of the statement actually of benefit to ', NAME, '?'].
legalvals(collateral-part(NAME)) = [yes,no].
automaticmenu(collateral-part(NAME)).

/* DECLARANT-KNEW */

question(declarant-knew(NAME)) = ['Did ', NAME, ' know that the statement could result in this loss?'].
legalvals(declarant-knew(NAME)) = [yes,no].
automaticmenu(declarant-knew(NAME)).

/* LOSS-AT-SAME-TIME */

question(loss-at-same-time(NAME)) = ['Would the facts stated actually have caused a loss to ', NAME, ' at the time the statement was made?'].
legalvals(loss-at-same-time(NAME)) = [yes,no].
automaticmenu(loss-at-same-time(NAME)).

/* ON-THE-WHOLE */

question(on-the-whole) = 'On the whole, is the statement more beneficial to the declarant or more injurious?'.
legalvals(on-the-whole) = [beneficial,injurious].
automaticmenu(on-the-whole).

/* **** */

/* The following questions all pertain solely to the business documents exceptions. */

/* ANSWER-COMPILING-INTENT */

rule-aci001: if sex = SEX and
    name = NAME and
    (NEED CI == no or compiling-intent(SEX,NAME))
    then answer-compiling-intent-NEED CI.

/* BEFORE-CONTROVERSY */

question(before-controversy) = 'Was the statement made before the controversy arose?'.
legalvals(before-controversy) = [yes,no].
automaticmenu(before-controversy).

/* COMPETENT-TO-TESTIFY */

question(competent-to-testify(NAME)) = ['Were ', NAME, ' and any other persons who passed on information competent to testify?'].
legalvals(competent-to-testify(NAME)) = [yes,no].
automaticmenu(competent-to-testify(NAME)).

/* COMPILING-INTENT */

question(compiling-intent([FIRSTP|_],NAME)) = ['Was ', NAME, ' simply compiling the record for regular business purposes or was ', FIRSTP, ' actually conducting an investigation into historical facts?'].
legalvals(compiling-intent(SEX,NAME)) = ['business purposes',investigation].
automaticmenu(compiling-intent(SEX,NAME)).
enumeratedanswers(compiling-intent(SEX,NAME)).

/* DECLARANT-INFORMED */

question(declarant-informed(NAME)) = ['Was ', NAME, ' informed by one or more persons?'].
legalvals(declarant-informed(NAME)) = [yes,no].
automaticmenu(declarant-informed(NAME)).

/* EVIDENCE-ACT */

question(ask-jurisdiction) = 'Does the action you wish to use this evidence for fall under federal or provincial (B.C.) jurisdiction?'.
legalvals(ask-jurisdiction) = [federal,provincial].
enumeratedanswers(ask-jurisdiction).
automaticmenu(ask-jurisdiction).

rule-juris001: if ask-jurisdiction = JURISDICTION and
    JURISDICTION == federal
    then evidence-act = 'Canada-30'.

rule-juris002: if ask-jurisdiction = JURISDICTION and
    JURISDICTION == provincial
    then evidence-act = 'BC-48'.

/* FUNCTION-TO-INFORM */

question(function-to-inform) = 'Was it part of those persons normal function to pass on the information?'.
legalvals(function-to-inform) = [yes,no].
automaticmenu(function-to-inform).

/* FUNCTION-TO-RECORD */

question(function-to-record(NAME)) = ['Was it part of ', NAME, "'s function in the organization to record the information in the statement?'].
legalvals(function-to-record(NAME)) = [yes,no].
automaticmenu(function-to-record(NAME)).

/* ANSWER-PERSONAL-KNOWLEDGE */

rule-apk001: if (sex = SEX and
        name = NAME and
        personal-knowledge(SEX,NAME)) or
    (NEED PK == no and
        name = NAME and
        declarant-informed(NAME) and
        function-to-inform)
    then answer-personal-knowledge-NEED PK.

/* RELIED-UPON */

question(relied-upon) = ['Was the statement relied upon in the day to day affairs of this (or another) business?'].
legalvals(relied-upon) = [yes,no].
automaticmenu(relied-upon).

/* NORMAL-FUNCTION */

question(normal-function([FIRSTP|_],NAME)) = ['Was the statement made as a normal part of the function of ', NAME, ' in the business organization with which ', FIRSTP, ' is connected?'].
legalvals(normal-function(SEX,NAME)) = [yes,no].
automaticmenu(normal-function(SEX,NAME)).

/* **** */

/* The following questions all pertain solely to the dying declarations exception. */

/* CONCERNS-CAUSE-OF-DEATH */

presupposition(concerns-cause-of-death(NAME)) = (declarant-dying = yes).
question(concerns-cause-of-death(NAME)) = ['Does the statement identify the person who caused ', NAME, "'s death?'].
legalvals(concerns-cause-of-death(NAME)) = [yes,no].
automaticmenu(concerns-cause-of-death(NAME)).

/* DECLARANT-DYING */

question(declarant-dying) = 'Was the statement made after the injuries occurred?'.
legalvals(declarant-dying) = [yes,no].
automaticmenu(declarant-dying).

/* EVIDENCE-TO-SHOW-PERCEIVED */

presupposition(evidence-to-show-perceived(NAME)) = event-dying-declaration.
question(evidence-to-show-perceived(NAME)) = ['Do the circumstances indicate that ', NAME, ' actually saw what happened?'].
legalvals(evidence-to-show-perceived(NAME)) = [yes,no].
automaticmenu(evidence-to-show-perceived(NAME)).

/* HOMICIDE */

question(homicide(NAME)) = ['Was the death of ', NAME, ' the subject of the charge?'].
legalvals(homicide(NAME)) = [yes,no].
automaticmenu(homicide(NAME)).

/* SETTLED-HOPELESS-EXPECTATION */

question(serious-injuries) = ['Into which category would you place the injuries:
    1. fatal wound(s) with heavy external bleeding
    2. fatal wound(s) with little external bleeding
    3. all other injuries - eg. drowning, poisoning, wounds from a fist fight, complications from an illegal operation'].
legalvals(serious-injuries) = number.

question(declarant-belief([FIRSTP|_],NAME)) = ['Did ', NAME, ' believe ', FIRSTP, ' was dying?'].
legalvals(declarant-belief(SEX,NAME)) = [yes, no].
automaticmenu(declarant-belief(SEX,NAME)).

question(cancelled([FIRSTP|_],NAME)) = ['Did ', NAME, ' qualify this belief in any way so as to indicate that it was not an absolute belief? For example, ', FIRSTP, ' said, "I think I am dying."'].
legalvals(cancelled(SEX,NAME)) = [yes, no].
automaticmenu(cancelled(SEX,NAME)).

question(declarant-told-dying([FIRSTP,THIRDP|_])) = ['Did a doctor tell ', THIRDP, ' that ', FIRSTP, ' was dying?'].
legalvals(declarant-told-dying(SEX)) = [yes,no].
automaticmenu(declarant-told-dying(SEX)).

presupposition(acknowledge(SEX)) = (declarant-told-dying(SEX) = yes).
question(acknowledge([FIRSTP|THIRDP])) = ['Did ', FIRSTP, ' acknowledge this (a mere nod would be sufficient)?'].
legalvals(acknowledge(SEX)) = [yes,no].
automaticmenu(acknowledge(SEX)).

question(bystanders-belief([FIRSTP,THIRDP|_])) = ['Did people around ', THIRDP, ' believe ', FIRSTP, ' was dying?'].
legalvals(bystanders-belief(SEX)) = [yes,no].
automaticmenu(bystanders-belief(SEX)).

rule-she004: if sex = SEX and
    name = NAME and
    serious-injuries is known and
    (declarant-belief(SEX,NAME) = yes and cancelled(SEX,NAME) = no) and
    (declarant-told-dying(SEX) = yes and acknowledge(SEX) is known) and
    bystanders-belief(SEX) = yes
    then very-strong-case.

rule-she005: if sex = SEX and
    name = NAME and
    serious-injuries is known and
    (declarant-belief(SEX,NAME) = yes and cancelled(SEX,NAME) = no) and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) = yes
    then strong-case.

rule-she006: if sex = SEX and
    name = NAME and
    serious-injuries is known and
    declarant-belief(SEX,NAME) = no and
    declarant-told-dying(SEX) = yes and
    acknowledge(SEX) is known and
    bystanders-belief(SEX) is known
    then strong-case.

rule-she007: if sex = SEX and
    name = NAME and
    serious-injuries = 1 and
    declarant-belief(SEX,NAME) = no and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) is known
    then strong-case.
rule-she008: if sex = SEX and
    name = NAME and
    (serious-injuries = 2 or serious-injuries = 3) and
    declarant-belief(SEX,NAME) = no and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) is known
    then very-weak-case.

rule-she009: if sex = SEX and
    name = NAME and
    (serious-injuries = 2 or serious-injuries = 3) and
    declarant-belief(SEX,NAME) = yes and
    cancelled(SEX,NAME) = yes and
    declarant-told-dying(SEX) = yes and
    acknowledge(SEX) is known and
    bystanders-belief(SEX) is known
    then weak-case.

rule-she010: if sex = SEX and
    name = NAME and
    (serious-injuries = 2 or serious-injuries = 3) and
    (declarant-belief(SEX,NAME) = no or
     (declarant-belief(SEX,NAME) = yes and cancelled(SEX,NAME) = yes)) and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) is known
    then no-case.

/*rule-she004: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = yes and
    (declarant-belief(SEX,NAME) = 1 or declarant-belief(SEX,NAME) = 2 or
     declarant-belief(SEX,NAME) = 3 or declarant-belief(SEX,NAME) = 4) and
    (declarant-told-dying(SEX) = no or
     (declarant-told-dying(SEX) = yes and acknowledge(SEX) is known)) and
    bystanders-belief(SEX) is known
    then very-strong-case.

rule-she005: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = no and
    (declarant-belief(SEX,NAME) = 1 or declarant-belief(SEX,NAME) = 4) and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) is known
    then weak-case.

rule-she006: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = yes and
    declarant-belief(SEX,NAME) is known and
    declarant-told-dying(SEX) = yes and
    acknowledge(SEX) = yes and
    bystanders-belief(SEX) is known
    then very-strong-case.

rule-she007: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = no and
    declarant-belief(SEX,NAME) = 1 and
    declarant-told-dying(SEX) = yes and
    acknowledge(SEX) = yes and
    bystanders-belief(SEX) is known
    then strong-case.
rule-she008: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = no and
    declarant-belief(SEX,NAME) = 5 and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) = yes
    then very-weak-case.

rule-she009: if sex = SEX and
    name = NAME and
    serious-injuries(NAME) = no and
    declarant-belief(SEX,NAME) = 5 and
    declarant-told-dying(SEX) = no and
    bystanders-belief(SEX) = no
    then no-case.*/

rule-she011: if (no-case or very-weak-case) and
    display([nl, 'These last questions have helped me determine that the declarant did not have a settled, hopeless expectation of death when the statement was made. Case law requires that this be established in order for the dying declaration exception to apply.', nl,nl])
    then settled-hopeless-expectation = no.

rule-she012: if weak-case and
    display([nl, 'These last questions have helped me decide that the declarant may have had a settled, hopeless expectation of death when the statement was made. Case law requires that this be established in order for the dying declaration exception to apply. You should regard this as a weak link in your case and try to bolster it.', nl,nl])
    then settled-hopeless-expectation = maybe.

rule-she013: if (strong-case or very-strong-case) and
    display([nl, 'These last questions have helped me decide that the declarant had a settled, hopeless expectation of death when the statement was made. Case law requires that this be shown in order for the dying declaration exception to apply.', nl,nl])
    then settled-hopeless-expectation = yes.

/* **** */

/* The following questions apply only to the declarations in course of duty exception. */

/* EXPERIENCE */

presupposition(past-experience(SEX,NAME)) = (regular-part-of-job = yes).
question(past-experience([FIRSTP|_],NAME)) = ['Was this the first time ', NAME, ' did this or did ', FIRSTP, ' have experience of this sort?'].
legalvals(past-experience(SEX,NAME)) = [first-time, experienced].
automaticmenu(past-experience(SEX,NAME)).
enumeratedanswers(past-experience(SEX,NAME)).
rule-past001: if sex = SEX and
    name = NAME and
    past-experience(SEX,NAME) = first-time
    then experience = no.

rule-past002: if sex = SEX and
    name = NAME and
    past-experience(SEX,NAME) = experienced
    then experience = yes.

/* IN-COURSE-OF-DUTY */

question(in-course-of-duty) = 'Was the statement made as part of the employment of the declarant (include independent contractors)?'.
legalvals(in-course-of-duty) = [yes,no].
automaticmenu(in-course-of-duty).

/* MOTIVE-TO-MISREPRESENT */

presupposition(motive-to-misrepresent(NAME)) = believe-duty.
question(motive-to-misrepresent(NAME)) = ['Is there any indication that ', NAME, ' could have had a motive to misrepresent the things reported?'].
legalvals(motive-to-misrepresent(NAME)) = [yes,no].
automaticmenu(motive-to-misrepresent(NAME)).

/* TAMPERING */

presupposition(tampering) = believe-duty.
presupposition(tampering) = (evidence = written).
question(tampering) = 'Is there any indication that the record was tampered with?'.
legalvals(tampering) = [yes,no].
automaticmenu(tampering).

/* REGULAR-PART-OF-JOB */

presupposition(duty-to-perform(SEX,NAME)) = (personal-knowledge(SEX,NAME) = yes).
question(duty-to-perform(SEX,NAME)) = ['Did the statement report activities that were a regular part of ', NAME, "'s job?'].
legalvals(duty-to-perform(SEX,NAME)) = [yes,no].
automaticmenu(duty-to-perform(SEX,NAME)).

question(duty-to-observe(NAME)) = ['Did ', NAME, ' have a duty to observe the things reported in the statement?'].
legalvals(duty-to-observe(NAME)) = [yes,no].
automaticmenu(duty-to-observe(NAME)).

question(regular-part-of-job-statement(NAME)) = ['Was the making of the statement a regular part of ', NAME, "'s job?'].
legalvals(regular-part-of-job-statement(NAME)) = [yes,no].
automaticmenu(regular-part-of-job-statement(NAME)).

rule-job001: if sex = SEX and
    name = NAME and
    (duty-to-perform(SEX,NAME) or duty-to-observe(NAME)) and
    regular-part-of-job-statement(NAME)
    then regular-part-of-job.
rule-job002: if sex = SEX and
    name = NAME and
    ((duty-to-perform(SEX,NAME) = no and duty-to-observe(NAME) = no) or
     regular-part-of-job-statement(NAME) = no)
    then regular-part-of-job = no.

/* OTHERS-RELIED */

presupposition(others-relied) = experience is known.
question(others-relied) = 'Did other people rely on the statement and take action?'.
legalvals(others-relied) = [yes,no].
automaticmenu(others-relied).

/* **** */

/* The following questions all pertain to more than one exception. */

/* COMPETENT */

question(competent([FIRSTP|_],NAME)) = ['If ', NAME, ' were called in court to testify would ', FIRSTP, ' not be competent to testify for any of the following reasons?
    1. incompetent due to mental deficiencies
    2. may be incompetent because child under 14 years of age
    3. not incompetent'].
legalvals(competent(SEX,NAME)) = number.

rule-comp001: if sex = SEX and
    name = NAME and
    competent(SEX,NAME) = 1 or
    (competent(SEX,NAME) = 2 and appreciates-oath(NAME) = no)
    then competent = no.

rule-comp002: if sex = SEX and
    name = NAME and
    competent(SEX,NAME) = 3 or
    (competent(SEX,NAME) = 2 and appreciates-oath(NAME) = yes)
    then competent.

question(appreciates-oath(NAME)) = ['Would ', NAME, ' have been able to appreciate the nature and quality of an oath?'].
legalvals(appreciates-oath(NAME)) = [yes, no].
automaticmenu(appreciates-oath(NAME)).

/* CONTEMPORANEOUS */

question(contemporaneous) = 'Was the report made contemporaneously with the things reported?'.
legalvals(contemporaneous) = [yes,no].
automaticmenu(contemporaneous).

/* DECLARANT-NOT-AVAILABLE */

presupposition(declarant-not-available(SEX,NAME)) = (proposition = yes).
presupposition(declarant-not-available(SEX,NAME)) = (declarant-testify = no).
question(declarant-not-available([FIRSTP|_],NAME)) = ['Earlier in this consultation, you told me that ', NAME, ' is not available to testify in court. Why is ', FIRSTP, ' unavailable?
    1. dead
    2. other reason'].
/*2. unfit to testify 3. refusing to testify in spite of court order 4.
absent from the jurisdiction 5. unknown'].*/
legalvals(declarant-not-available(SEX,NAME)) = number.

presupposition(testimony-unavailable(SEX)) = (proposition = no).
presupposition(testimony-unavailable(SEX)) = (declarant-testify = no).
question(testimony-unavailable([FIRSTP|_])) = ['Why is ', FIRSTP, ' unavailable?
    1. dead
    2. other'].
/*2. unfit to testify 3. refusing to testify in spite of court order 4. absent from the jurisdiction 5. unknown'].*/
legalvals(testimony-unavailable(SEX)) = number.

rule-dai006: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 1 or testimony-unavailable(SEX) = 1)
    then declarant-dead.

rule-dai007: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 2 or testimony-unavailable(SEX) = 2)
    then declarant-not-dead.

/*rule-dai007: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 2 or testimony-unavailable(SEX) = 2)
    then declarant-unfit-to-testify.

rule-dai008: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 3 or testimony-unavailable(SEX) = 3)
    then declarant-refuses-to-testify.

rule-dai009: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 4 or testimony-unavailable(SEX) = 4)
    then declarant-absent-from-jurisdiction.

rule-dai010: if sex = SEX and
    name = NAME and
    (declarant-not-available(SEX,NAME) = 5 or testimony-unavailable(SEX) = 5)
    then declarant-status-not-known.*/

/* FACTS */

question(facts) = 'Does the statement reflect facts (as opposed to being opinion)?'.
legalvals(facts) = [yes,no].
automaticmenu(facts).

/* TYPE-OF-TRIAL */

question(type-of-trial) = 'What type of proceedings are you involved in?'.
legalvals(type-of-trial) = [civil,criminal].
automaticmenu(type-of-trial).
enumeratedanswers(type-of-trial).

/* IDENTIFICATION */

/*question(identification) = 'Do you wish to use the statement because it identifies someone?'.
legalvals(identification) = [yes,no].
automaticmenu(identification).
*/

/* PERSONAL-KNOWLEDGE */

question(personal-knowledge-dydec([FIRSTP|_],NAME)) = ['Is there anything to suggest that ', NAME, ' might not actually have seen what happened (eg. ', FIRSTP, ' was shot in the back)?'].
legalvals(personal-knowledge-dydec(SEX,NAME)) = [yes,no].
automaticmenu(personal-knowledge-dydec(SEX,NAME)).

question(personal-knowledge([_|[THIRDP|_]],NAME)) = ['Did ', NAME, ' ', THIRDP, 'self actually perform or observe the actions reported in the statement?'].
legalvals(personal-knowledge(SEX,NAME)) = [yes,no].
automaticmenu(personal-knowledge(SEX,NAME)).

/* END OF KNOWLEDGE BASE */
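The exception control rules of Part 3 all apply one recurring test drawn from the four-element theory: an exception is available only when the event, perception, belief and intention elements are each established (compare rule-exc001 through rule-exc007). As a minimal illustrative sketch of that gating logic only — written here in modern Python rather than the expert-system shell language of the thesis, with wholly hypothetical facts:

```python
# Sketch of the four-element gate used by the exception control rules:
# an exception applies only if every element of the theory is satisfied.
ELEMENTS = ("event", "perceive", "believe", "intend")

def exception_applies(facts):
    """Return True only when all four elements hold for the exception."""
    return all(facts.get(element, False) for element in ELEMENTS)

# Hypothetical dying-declaration facts: the declarant lacked a settled,
# hopeless expectation of death, so the intention element fails and the
# evidence is not admissible under the exception.
facts = {"event": True, "perceive": True, "believe": True, "intend": False}
print(exception_applies(facts))  # False
```

This mirrors how the shell's backward chaining behaves: a single failed element blocks the conclusion `exception = ...`, and control falls through to rule-exc008, which reports that no exception applies.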
