E LAW | Murdoch University Electronic Journal of Law

LEGAL EXPERT SYSTEMS - A HUMANISTIC CRITIQUE OF MECHANICAL LEGAL INFERENCE

Author: Andrew Greinke, University of Queensland
Subjects: Artificial intelligence; Expert systems; Computer science
Issue: Volume 1, Number 4 (December 1994)
Category: Refereed Articles

"Suppose I am in a closed room and that people are passing in to me a series of cards written in Chinese,a language of which I have no knowledge; but I do possess rules for correlating one set of squiggleswith another set of squiggles so that when I pass the appropriate card back out of the room it will look toa Chinese observer as if I am a genuine user of the Chinese language. But I am not; I simply do notunderstand Chinese; those squiggles remain just squiggles to me." [*]

1: INTRODUCTION

Computerisation of the legal office is an ongoing process. The range of non-legal applications now in common use includes word processing, accounting, time costing, communication and administration systems. [1] More recently it has been demonstrated that computers can be used as research tools, particularly in the retrieval of primary legal materials. Prominent examples include the LEXIS, SCALE and INFO1 databases, now familiar to many practitioners. [2] Some moves have also been made towards "conceptual" text retrieval systems. [3]

Flushed with successes in projects such as DENDRAL, [4] PROSPECTOR, [5] and MYCIN, [6] computer scientists have now turned to law in order that they might "widen their range of conquests". [7] The interaction between computers and the law has now spawned a large and disparate discipline, boasting eight centres for Law and Informatics in Europe, as well as growing numbers of similar centres in North America, Japan and Australia. The "resource" of expert legal knowledge, "often transitory, even volatile in nature", is seen worthy of nurture and preservation. The use of legal expert systems is seen capable of preserving indefinitely and placing at the disposal of others the wealth of legal knowledge and expertise. [8] The idea is not new, being anticipated by writers such as Loevinger [9] and Mehl [10] as early as 1949.

Yet lawyers have generally greeted "legal expert systems" - seen by some as the natural progression in the use of computers - with apathy, ignorance or resistance. [11] This article argues that such opposition is justified when proper regard is had to the implications arising from the computational foundation for such systems.

It is necessary for the following analysis to clearly distinguish two fundamentally distinct classes of computer applications to law: decision support systems, and expert systems. [12] "Decision support systems" are powerful research tools or "intelligent assistants" designed to support decisions taken and advice given by human experts. "Legal expert systems" are designed to make decisions and provide advice as would a human expert. Richard Susskind, a British researcher whose work [13] constitutes the major theoretical grounding of legal expert systems, states:

"Expert systems are computer programs that have been constructed (with the assistance of human experts) in such a way that they are capable of functioning at the standard of (and sometimes even at a higher standard than) human experts in given fields . . . that embody a depth and richness of knowledge that permit them to perform at the level of an expert." [14]

Legal expert systems are a type of knowledge based technology. With the explosion of applications, "expert system" is quickly becoming an imprecise term. [15] The definition used by Feigenbaum will be acceptable for the type of systems examined in this article:

"An intelligent computer program that uses knowledge and inference procedures to solve problems that are difficult enough to require significant human expertise for their solution. Knowledge necessary to perform at such a level, plus the inference procedures used, can be thought of as a model of the expertise of the best practitioners of the field." [16]

In terms of programming technology, the knowledge based approach has been described as an "evolutionary change with revolutionary consequences", [17] replacing the tradition of

data + algorithm = program

with a new architecture centred around a "knowledge base" and an "inference engine", so that:

knowledge + inference = expert system

In fact, there are four essential components to a fully functional expert system: 1. the knowledge acquisition module; 2. the knowledge base; 3. the inference engine; and 4. the user interface.

Knowledge acquisition is the process of extracting knowledge from experts. Given the difficulty involved in having experts articulate their "intuition" in terms of a systematic process of reasoning, this aspect is regarded as the main "bottleneck" [18] in expert systems development. The knowledge base stores information about the subject domain. However, this goes further than a passive collection of records in a database. Rather it contains symbolic representations of experts' knowledge, including definitions of domain terms, interconnections of component entities, and cause-effect relationships between these components. In legal expert systems this usually consists of formalised legal rules obtained from primary and secondary sources of law. Another layer of rules may also be obtained from less formal knowledge not found in published literature, [19] such as "practitioner's hand books and internal memoranda within legal practices". [20] These heuristics [21] add "experiential" to "academic" knowledge. [22]

An inference engine consists of search and reasoning procedures to enable the system to find solutions and, if necessary, provide justifications for its answers. The nature of this inference process is described in detail in Section 2. The user interface is critical to the commercial success of expert systems, particularly in the legal field, to enable lawyers with little or no expertise in programming to gain access to the encoded knowledge. Typically this is in the form of prompting for information, and asking questions with "yes", "no" and "unknown" responses.
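By way of illustration only, this four-part architecture might be sketched in a few lines of Python as follows. The rule content, prompts and function names are invented for the example and are not drawn from any system discussed in this article; a production system would, of course, elaborate each component considerably.

```python
# Minimal sketch of the four components of an expert system.
# All rule content here is invented for illustration only.

# 1. Knowledge base: IF-THEN rules plus facts supplied by the user.
RULES = [
    {"if": ["person_is_under_18"], "then": "person_is_a_minor"},
    {"if": ["person_is_a_minor", "contract_not_for_necessaries"],
     "then": "contract_may_be_voidable"},
]

def acquire_knowledge(extra_rules):
    """2. Knowledge acquisition: in practice the 'bottleneck' of
    eliciting rules from experts; here reduced to appending rules."""
    RULES.extend(extra_rules)

def infer(facts):
    """3. Inference engine: forward chaining by pattern matching
    rule conditions against the known facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in RULES:
            if all(c in derived for c in rule["if"]) and rule["then"] not in derived:
                derived.add(rule["then"])   # the rule "fires"
                changed = True
    return derived

def consult():
    """4. User interface: yes/no/unknown prompts."""
    facts = set()
    questions = [("Is the person under 18?", "person_is_under_18"),
                 ("Is the contract for non-necessaries?", "contract_not_for_necessaries")]
    for question, fact in questions:
        if input(question + " (yes/no/unknown) ").strip().lower() == "yes":
            facts.add(fact)
    print("Conclusions:", infer(facts))

if __name__ == "__main__":
    consult()
```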
Artificial intelligence, the foundation for legal expert systems, has run up against both practical and theoretical difficulties. While computers can beat the average human at "clever" tasks such as playing chess, they are "impossibly stupid" over tasks taken for granted such as speaking a language or walking across a room. [23] This casts doubt on whether many human activities do, as some artificial intelligence researchers suggest, consist of suppressed computational algorithms. There has been severe criticism by those who claim that knowledge, by its very nature, is not amenable to representation on a computer, [24] or that such systems achieve no more than simple competency. [25] Early misconceptions about the ease with which powerful and knowledgeable systems could be built for use by relative novices have given way to concern about real problems of knowledge elicitation and knowledge modelling. [26] Some opponents are convinced that the claims of artificial intelligence are exaggerated and their objectives unreachable. [27]

Section 2 examines the nature of the inference engine, and suggests that its deductive procedures rest in pattern matching routines. It also explores issues of "fuzzy" and "deontic" logic. Section 3 explores the implications for knowledge representation, and questions whether devices such as "semantic networks", "frames" and "case based reasoning" are anything more than elaborate pattern matching constructs. Section 4 demonstrates that the need to be amenable to a deductive inference engine involves unacceptable distortion of law at both a practical and theoretical level. Section 5 argues that legal reasoning necessarily involves resort to social context and purpose, which is not tractable within current technology. Section 6 suggests that researchers ought to abandon legal expert systems, and instead concentrate on computerising more mechanical tasks such as legal retrieval and litigation support. A summary and conclusion is contained in Section 7.

2: PATTERN MATCHING AS THE CORE OF AN EXPERT SYSTEM

The core of any expert legal system is its inference engine. This Section investigates the nature of computer inference at a basic level, and argues that it is little more than a pattern matching exercise. It is also argued that more sophisticated approaches, such as fuzzy logic and deontic logic, are no more than extensions of the same principles.

2.1 The Nature of Computer Inference

Computer inference is undertaken by a simple strategy known as modus ponens. This means that the following syllogism is assumed to be correct:

A is true (fact)
If A is true then B is true (rule)
Therefore B is true (conclusion)

Computer deduction is obtained by conditioning the consecutive execution of instructions on matching, or failing to match, values in storage registers. The identical syllogism is obtained by a computer using a routine in the following terms:

1. check the value of register X1
2. compare the value of X1 to a value in register A
3. if X1 = A then:
4. change the value of register X2 to the value in register B

The computer is conditioned by the value placed in register X1, either by the user or by satisfaction of some prior rule. A is taken to be true if the value of X1 is equal to a particular value in register A, representing some real world condition. The rule "if A then B" is contained implicitly within the structure of the routine by conditioning step 4 on the satisfaction of the condition X1 = A (i.e. A is true). The modus ponens is completed by step 4, which alters another register to equal a value B, thereby asserting that B is true.

A programmer in BASIC or PASCAL, for instance, has some relationship in mind between the data supplied to the program and the output to be produced by computation. The input data are stored in the machine's memory, and the programmer's task is to devise a sequence of instructions to manipulate the data in accordance with the relationships she envisages. In such a case the inference engine constitutes the implicit algorithm contained within the sequence of instructions. Examples of such algorithmic knowledge based systems include Hellawell's CORPTAX, [28] CHOOSE, [29] and SEARCH, [30] all implemented in BASIC.
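The register routine above can be restated in a few lines of Python; the restatement is a paraphrase for illustration, not code from any of the systems cited, but it makes plain that the "inference" consists of nothing more than a comparison and an assignment.

```python
# The four-step register routine restated: the rule "if A then B"
# exists only implicitly in the comparison and the assignment.
registers = {"X1": None, "X2": None, "A": 1, "B": 2}

def apply_rule():
    # steps 1-2: read X1 and compare it with the value stored in A
    if registers["X1"] == registers["A"]:   # step 3: the pattern match ("A is true")
        registers["X2"] = registers["B"]    # step 4: assert the conclusion ("B is true")

registers["X1"] = 1    # the user (or a prior rule) supplies the fact
apply_rule()
print(registers["X2"])  # prints 2: "B" has been "inferred" by pattern matching
```

Nothing in the routine "knows" that A or B stand for legal propositions; the rule exists only as a comparison wired into the control flow.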

2.2 Logic Programming

Normal programming can at best maintain logical relationships implicitly within the program's structure. A number of researchers are now involved in logic programming, [31] which has been seen as the real technical breakthrough in this field. Some have extended logic programming to the point of writing expert systems by means of another logic based application, such as DARWIN. [32] Logic programming allows the programmer to specify logical relationships, not in terms of sequential instructions, but in terms of some symbolic language. [33] It is then up to the machine to compile the set of sequential instructions to maintain the desired relationship. A logic programming system can be regarded as a kind of rule-based system where the inference engine becomes a "mechanical theorem prover", [34] a machine for answering questions of the form:

Do axioms (A0 . . . An) logically imply B?

The claimed advantages of rule-based logic systems over conventional programs are perspicuity and modularity. [35] Perspicuity is obtained by separating the rules (the knowledge base) from the logical operators (the inference engine). This has important implications for system maintenance and debugging. Modularity exists since the knowledge is split into small and independent rules.

Legal rules, written in symbolic language, are manipulated through a process of "forward" and "backward chaining". A set of IF-THEN rules, constituting a "search space", [36] are compared against a set of facts to reach a logical conclusion. [37] In an expert system, forward chaining simply involves matching the IF conditions to the facts, according to a predetermined order, which, under the rules, dictate a conclusion. [38] Susskind describes this as a "control strategy" which "triggers" and "fires" the rules. [39]

Backward chaining starts with the legal conclusion and searches for justifying antecedents in the knowledge base. In terms of programming this technique is more difficult, since the search of the knowledge base is not along a single "path" but involves identification of all possible rules leading to the required conclusion. [40] In essence, it matches THEN variables with their IF antecedents and compiles a list of the paths thus generated.

A "goal driven" expert system predetermines a conclusion and identifies the legal arguments and reasoning that can be used in support of that conclusion. [41] In logic programming terms this is no more than backward chaining across the search space. These processes of forward and backward chaining form the core of an expert system's inference procedure.

One notable feature of logic programming is the Horn clause, seen as a suitable extension to the "simple" predicate logic already outlined. It is of the form:

A if B0 and . . . Bn, where n >= 0

which consists of a single conclusion, A, and any number of conditions (B0, B1, . . . Bn). For example:

X is the father of Y if X is a parent of Y and X is male

is a Horn clause with one conclusion and two conditions. Factual premises, such as "X is male", can be expressed as a Horn clause with one conclusion and no conditions. The significance of such a clause is that symbolic logic is not limited to IF-THEN statements, but may be extended to IF-AND-NOT-THEN statements. In terms of actual programming, however, the Horn clause is implemented as a bundle of IF-THEN assertions: each condition is checked separately for a pattern match, and the routine halts on failure to match. The pattern matching approach of logic programming is not relaxed but in fact tightened by the use of "integrity constraints", such as IF-THEN-ELSE structures, to close off the potential for negation by failure [42] and counterfactual conditional [43] difficulties.
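To make the chaining mechanics concrete, the following Python sketch forward and backward chains over propositional, Horn-clause-style rules. It is an illustration of the general technique only: the rule content loosely echoes the father/parent example above, and the unification over variables that a language such as PROLOG provides is omitted.

```python
# Propositional sketch of forward and backward chaining over
# Horn-clause-style rules of the form (conclusion, [conditions]).
RULES = [
    ("is_parent", ["is_father"]),
    ("is_father", ["is_male", "has_child"]),
]
FACTS = {"is_male", "has_child"}   # clauses with no conditions

def forward_chain(facts):
    """Match IF conditions against known facts until no rule fires."""
    facts = set(facts)
    fired = True
    while fired:
        fired = False
        for head, body in RULES:
            if head not in facts and all(c in facts for c in body):
                facts.add(head)
                fired = True
    return facts

def backward_chain(goal, facts):
    """Work from the conclusion back to its justifying antecedents."""
    if goal in facts:
        return True
    return any(all(backward_chain(c, facts) for c in body)
               for head, body in RULES if head == goal)

print(forward_chain(FACTS))                # derives is_father, then is_parent
print(backward_chain("is_parent", FACTS))  # True
```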
Constructed on the basis of Horn clauses, PROLOG and its variations have been the platform for most logic programming projects, both in logic and procedure. APES, [44] implemented in PROLOG, is one widely used augmentation. [45] Using PROLOG as a symbolic logic structure involves rendering the domain knowledge in terms of Horn clauses, rewriting them in PROLOG syntax, and then executing the result as a program. PROLOG may itself provide a procedural basis for expert system platforms. Horn clauses may be backward chained as a procedure, working from conclusions to conditions and, as a sub-task, pattern matching each against its knowledge base, or user input. The program statements can thereby mix conditions which express legal rules with procedures to prompt the user for additional information. An example is Schlobohm's system to determine "constructive ownership of stock" under United States revenue laws. [46] The LEX [47] project is a more sophisticated application of the same principles.

2.3 Fuzzy Logic

Fuzzy logic is an attempt to escape the perceived inadequacy of binary logic. [48] Zadeh introduced the concept of the fuzzy set [49] to provide a formal way of speaking about imprecise concepts, such as "large" and "small". Rather than requiring precise values to be attached to particular characteristics, a spectrum of values, broken into categories, is used to match concepts, analogous to concepts in cognitive psychology. [50] The object of fuzzy logic is to convert continuous measurements into approximate discrete values. For example, a rule of the form:

A PERSON IS A MINOR IF UNDER 18 YEARS OF AGE

can be rendered by the following simple routine:

1. check register AGE
2. if AGE < 18 then:
3. set register PERSON to value 123

where 123 represents "minor". The spectrum of values "less than 18" is the fuzzy category. On a more complex level, matching can take place not only between ranges of values, but between fuzzy sets. In binary logic, two concepts will be identical if and only if their membership functions, that is, their defining characteristics, exactly coincide. For instance, if F is a class of subsets of X, a set of characteristics defining legal concepts, then for Y and Z:

Y is identical to Z iff fY(x) = fZ(x) for all x

Rather than matching "identical" sets, fuzzy logic matches "closely identical" or "sufficiently close" sets. [51] To ascertain "closeness", a probabilistic metric is constructed. For example:

D(Y,Z) = Integral of {fY(x) - fZ(x)}^2 . p(x) . dx

where p(x) is some probability distribution on X. D(Y,Z) is therefore a metric that depends on the choice of p(x). Using these definitions, one can test "closely identical" by inferring that:

Y is identical to Z iff (1 - D(Y,Z)) >= d

where d is some arbitrary threshold which can itself be used to trigger the operation of a rule.

"Fuzzy logic" is therefore one way of rendering continuous or approximate concepts into terms amenable to computer deduction. It is, however, no more than an extension to logic programming techniques. Critics suggest that fuzzy logic is no more than oversophistication of arbitrary approximation; that its appearance of precision is spurious, and that its philosophical basis is uncertain when applied to legal concepts. [52]
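A rough numerical illustration of the "closeness" test may be given as follows. The membership functions, the uniform choice of p(x) and the threshold are invented for the example and carry no legal significance.

```python
# Rough numerical illustration of comparing two fuzzy sets by a
# distance metric and an arbitrary threshold.  The membership
# functions and the threshold are invented for the example.
AGES = range(0, 101)                      # the domain X, discretised

def is_minor(age):
    """Crisp (binary) membership: 1 inside the category, 0 outside."""
    return 1.0 if age < 18 else 0.0

def is_young(age):
    """Fuzzy membership: full membership up to 18, tailing off to 0 by 30."""
    if age < 18:
        return 1.0
    if age > 30:
        return 0.0
    return (30 - age) / 12.0

def distance(f, g, domain=AGES):
    """D(Y,Z): mean squared difference of the membership functions,
    i.e. the integral above taken with a uniform p(x)."""
    return sum((f(x) - g(x)) ** 2 for x in domain) / len(domain)

d_threshold = 0.95                        # arbitrary "sufficiently close" threshold
D = distance(is_minor, is_young)
print(f"D = {D:.3f}, closely identical: {(1 - D) >= d_threshold}")
```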

2.4 Deontic Logic

In legal expert systems the nature of law as a normative system [53] has given rise to a perceived necessity for the incorporation of deontic logic. [54] Whereas traditional and classical logics provide formal canons for reasoning with empirical statements that have truth value, deontic logics provide standards for reasoning with statements which lack truth value [55] in the sense that they describe norms or imperatives. They cannot be characterised as true or false, or logically related to each other or to statements of fact. [56] McCarty has consistently argued for "intuitionistic" rather than classical logic as the basis for representing legal concepts. He sets out some theoretical suggestions, as yet unimplemented, for the semantics of normative concepts in legal expert systems. [57] General foundations were laid by von Wright, [58] who developed a system of logic based on possibility and necessity. According to McCarty, "permission" exists in the union of all states and substates in which an action is necessarily true given the conditions of these states. This forms the "Grand Permitted Set", or a boundary condition for legality. [59] "Obligation" exists in the intersection of these sets. [60] In programming terms, "permission" entails backward chaining from the proposed action to all the states of the world. "Obligation" is then a matter of moving forward from all states of the world to find a common action.

McCarty designed a language called LLD which allegedly possessed distinct advantages for legal applications in its use of action terms and deontic language. [61] However, his unimplemented proposal is problematic. LLD attempted to represent law in count terms, mass terms, states, events, actions, permissions and obligations. However, even McCarty admits that LLD failed to represent purpose, intention, knowledge and belief. [62] Jones demonstrates that a striking feature of McCarty's theorem is that an obligation to act does not logically imply its permission. In particular he demonstrates that, under McCarty's analysis, one could logically derive "permission to poison the King from an obligation to serve him". [63] The only difference from logic programming in the suggested implementation of LLD lies in the use of fuzzy categories. That it does not stray too far from traditional logic programming is obvious, since LLD is constructed almost entirely from Horn clauses. [64] McCarty therefore fails to tackle the more difficult and fundamentally philosophical problems in deontic logic. [65]

Stamper's LEGOL [66] project proposed a number of extensions to enable the system to handle concepts such as purpose, right, duty, judgment, privilege, and liability; [67] yet these were never implemented. [68] His latest project, NORMA, [69] has the object of relating all formalised symbols directly to the notions of agent, intention, and behaviour. However, it is doubtful whether this goal can be achieved, given that his languages are based in typical control structures such as sequencing of rules, if-then branches, and iteration, [70] and hence are easily rewritten as a logic program. [71] Sergot suggests that both Stamper's work and McCarty's LLD have simply taken the standard semantics of logical formalism and presented their own variant. [72]

Deontic logic may in any case be non-computational. Since the limit to current technology ultimately lies in the mechanistic linking of discrete relationships, the modelling of any "normative" aspect of law will not by its very nature be amenable to computer processing:

"For there is not much sense in asking how . . . by having 255 in register 1234567 licences coming to have the number 128 in register 450254925." [73]

Perhaps in light of this limit to technology, both the Oxford Project [74] and the Imperial College Group [75] have avoided deontic logic. Susskind reduced deontic logic to predicate logic by treating the normative aspect of law as merely linguistic. [76] Deontic labels were attached to different varieties of mechanical cause [77] and effect. [78] Normative statements were simply rewritten into declarative symbolic language. [79]

The important implication from this work on deontic logic is recognition of the error in equating "logic" as understood by a computer with "logic" as understood in wider contexts. [80] In particular, reference can be had to MacCormick's distinction between "formal" and "everyday" logic, with the latter being based in common sense. [81] Researchers such as Stamper appear to be aware of such difficulties, but find themselves constrained by the existing technology. Whether legal reasoning can be computational is addressed in Section 5.

3: KNOWLEDGE REPRESENTATION AND THE PROBLEM OF CLASSIFICATION

The previous Section demonstrated that the process of computer inference is limited to an elaborate process of pattern matching. This Section investigates the implications for knowledge representation; in particular, that it is necessary for knowledge to be represented in terms of IF-THEN rules. It is also argued that more "sophisticated" representation techniques, such as "semantic networks", are no more than elaborations of this basic structure.

3.1 Pattern Matching and Open Texture

To be implementable, the knowledge base must be structured so as to be amenable to deductive inference procedures. In theory, it is possible to use any form of symbolic logic as the representational formalism as long as it is appropriate to a deductive inference engine. This condition requires that the knowledge base must be in the form of pattern matching rules. In logic programming, computation is deduction, and the task of the programmer is therefore to pose a problem suitable for a deductive process. [82]

The major difficulty encountered is what broadly may be termed "semantic indeterminacy". [83] Not all legal rules are appropriate for application in all situations. [84] Legal expert systems have been acknowledged to be only capable of solving problems referred to as "clear cases of the expert domain". [85] Yet what is a clear case? The Oxford Project defined a clear or "easy" case as one easily solved by an expert, yet hopelessly difficult for non-experts. [86] Gardner [87] drew the distinction between hard and easy cases by describing the latter as situations whose verdict would not be disputed by knowledgeable and rational lawyers, whereas they may rationally disagree as to the former.

The real answer for legal expert systems lies in the nature of the computation process. When presented with the facts of a case, the expert system must decide whether or not a rule applies. Since "fact situations do not await us neatly labelled, creased, and folded", [88] the difficulty lies in subsuming particular instances under a general rule. [89] A "hard case" is therefore one where the system fails to match the appropriate pattern, thereby preventing a rule from firing. As computer logic relies on pattern matching, knowledge representation necessarily must encounter problems of classification. [90] What is "ultimately beyond the grasp of a computer," states Detmold, "is not complexity, but particulars". [91]

This difficulty is often termed "open texture". The notion of open texture is obtained from Hart's analysis. In the now infamous regulation:

NO VEHICLES ARE PERMITTED IN THE PARK

the open-textured term here is "vehicle". The difficulty in terms of legal expert systems is how the program can classify an object as being a "vehicle" falling under the rule. Hart suggests that general words like "vehicle" must have a set of standard instances in which no doubts are felt about their application. There must be a "core of settled meaning", but there will be, as well, a "penumbra of debatable cases". [92]

In terms of computation, a case is within the core of settled meaning, and is classified as "easy", where there is a matching pattern in the knowledge base. Cases in the penumbra of doubt are hard, since they cannot be classified by the system. The difficulty for legal expert systems, then, is to build a system for classification, so that the pattern matching process can take place.
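The computational point can be illustrated with a deliberately crude sketch. The instance list below is invented, and the point is structural rather than substantive: whatever is not already among the stored patterns is simply unclassifiable.

```python
# Pattern-matching classification under Hart's "no vehicles in the
# park" rule.  The instance set is invented; anything outside the
# stored patterns cannot be classified at all.
CORE_VEHICLES = {"car", "truck", "motorcycle", "bus"}   # "core of settled meaning"

def classify(thing):
    if thing in CORE_VEHICLES:
        return "easy case: rule fires, entry prohibited"
    # No pattern matches: the system cannot say whether the rule applies.
    return "hard case: penumbra of doubt, no rule fires"

for thing in ["car", "bicycle", "war memorial truck", "roller skates"]:
    print(f"{thing:20} -> {classify(thing)}")
```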
Skalak [93] suggests there are three theoretical models of classification:

- the classical model
- the probabilistic model
- the exemplar model

All three models have been extensively used in expert systems technology. The following analysis demonstrates that the first two models have little to distinguish them in practical effect, and together constitute an inadequate response to the problem of open texture. The exemplar model has been used as justification for case based reasoning approaches, but the term has been misused, and such cases are argued instead to fall into the "probabilistic" model. The exemplar model is returned to in Section 6, where it is used as the basis for suggested development of computer applications to law.

3.2 The Classical Model

In the classical model, a concept is defined by necessary and sufficient conditions. Hafner [94] suggests that these conditions can be formally represented by knowledge structures involving decision rule hierarchies, taxonomic hierarchies, or role structures. Decision rule hierarchies specify the conditions under which a concept is true or false. "Vehicle" may be defined by a set of characteristics such as "four wheels", "engine" and so on. In programming terms, this means that the IF antecedents are themselves THEN consequents based on sets of prior conditions which constitute the "definition" of a term. This quickly builds into a "decision tree" structure. [95] Statute law, particularly statutory definitions, is seen as especially suitable for rendering into what amounts to typical Horn clauses. [96] A prominent example of this technique is the modelling of the British Nationality Act 1981. [97] Others include the United Kingdom supplementary benefits legislation, [98] and STATUTE, now used by some Australian government departments. [99]

Taxonomic hierarchies define sub-types of concepts, placing different objects into groups and sub-groups. For instance, the class "vehicle" may have among its sub-classes "car", which in turn may be further sub-classed into "Toyota", "sedan" and specific instances based on model types, years, and so on. [100] Any taxonomic hierarchy can, however, be represented in terms of a chain of decision rule hierarchies, or IF-THEN rules.

Role structures are modelled in "frames" and "semantic networks". A semantic network [101] is a collection of objects called "nodes". These are connected by "links". [102] Typical links include "is a" links to represent class-instance relationships and "has a" links to represent part-subpart relationships. Interconnected, these may quickly build into a complex web of relationships. A frame [103] is a subset of a semantic network, being a representation of a single object with a set of "slots" for the value of each property associated with the object. All links in both semantic nets and frames are, however, functionally equivalent to taxonomic hierarchies. Hayes demonstrates that both semantic networks and frames are no more than elaborate logic programs, and concludes that they hold no new insights. [104] The semantic network may be forward and backward chained as a set of logical rules, just as would a rendering of a set of Horn clauses in PROLOG. [105]
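The equivalence asserted here can be illustrated with a toy network. The content of the links is invented, and the sketch is not drawn from TAXMAN or any other cited system; its only purpose is to show "is a" and "has a" links behaving as chained IF-THEN rules.

```python
# A toy semantic network rendered as IF-THEN rules, illustrating the
# claim that "is a" / "has a" links reduce to chained pattern matching.
LINKS = [
    ("toyota_sedan", "is_a", "car"),
    ("car", "is_a", "vehicle"),
    ("car", "has_a", "engine"),
]

def holds(subject, relation, obj):
    """Backward chain over the links: X is_a Z if X is_a Y and Y is_a Z."""
    if (subject, relation, obj) in LINKS:
        return True
    if relation == "is_a":
        return any(holds(mid, "is_a", obj)
                   for s, r, mid in LINKS if s == subject and r == "is_a")
    if relation == "has_a":
        # properties are inherited up the is_a chain
        return any(holds(parent, "has_a", obj)
                   for s, r, parent in LINKS if s == subject and r == "is_a")
    return False

print(holds("toyota_sedan", "is_a", "vehicle"))  # True, via two chained rules
print(holds("toyota_sedan", "has_a", "engine"))  # True, inherited from "car"
```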
For instance, in McCarty's TAXMAN project [106] the domain was modelled in terms of objects such as corporations, individuals, stocks, shares, transactions &c. Each object is described by a "template", being a collection of the object's properties, such as name, address, size, and value. These properties are then linked and indexed. Each "bundle of assertions" constitutes the object's "frame". [107] These structures are aimed at answering questions such as:

"Does the taxpayer and her family have a controlling interest in the stock of a company which is a partner in a partnership which owns an interest in XYZ Ltd?" [108]

In TAXMAN 2 McCarty proposed a more elaborate semantic net based on a "prototype-plus-deformation" model. Essentially this sets one frame as being the default for each class of object, with incremental modifications to slots based upon fuzzy categories. Unfortunately the concept was never implemented. As a result, McCarty offers no solution to algorithmic issues such as how to choose, index, and search the space of prototypes, and their relationships to actual cases. [109]

3.3 The Probabilistic Model

Some argue that legal concepts cannot be adequately represented by definitions that state necessary and sufficient conditions. Instead, legal concepts are incurably open-textured. [110] Typically an expert system associates some kind of "certainty factor" with every rule in its knowledge base, obtained from probabilities, or fuzzy logic, or some combination of the two. [111] Firstly, probabilities are used alongside facts and rules, as a "slot" in the knowledge base. To each fact and rule is attached a certainty factor between zero and one. Concepts are mechanically linked, but the final output includes a composite probability. For example:

A is true (0.8 chance)
If A is true then B is true (0.75 chance)
Therefore B is true (0.6 chance = 0.8 x 0.75)

Secondly, fuzzy logic is called into play when classification of facts involves weighting particular features. In the former case the rule "fires", but a certainty level is attached to each fact and rule. In the latter, the rule will only fire at defined threshold certainties. If the weighted average of a set of characteristics adds to a threshold amount, the facts are classified accordingly.

3.4 Case Based Reasoning

Using precedents by induction and analogy is seen as advantageous in overcoming apparently intractable problems of classification. [112] However, both analogy and induction are inherently non-computational. [113] Case based reasoning is one attempt to imitate these techniques, allegedly based on an exemplar model. In the exemplar model, the user is presented with prototypical instances, or "mental images", on which to base her classification. This approach differs significantly from the two previous models in that it is primarily designed to leave the task of classification to the user. Most case based legal expert systems instead use a database of examples linked to the decision given in particular cases. When presented with a new case for decision the system will attempt to match the case under consideration against the examples, either stereotypical [114] or actual, in its database to extract those which appear to be most similar. On that basis it will attempt to predict the outcome of the new case.

For instance, Popple's SHYSTER [115] applies rules until the meaning of some open-textured concept is required. At this point a case based reasoning mechanism attempts to resolve this uncertainty. [116] A matching algorithm is used to measure the similarity of cases in terms of "case features". Each case is modelled as a frame with significant features contained in "slots". [117] These features are then weighted by some statistical method. [118] The object of constructing these "similarity metrics" [119] is to retrieve the most "on-point" cases. [120] Using weighted characteristics is described as a "dimensional" [121] approach, such as Betzer's "3-D" system which uses relative weights in a "procedure sweep" to fill in "gaps" in the knowledge base. [122]

In addition to the facts, the cases themselves are often weighted in a manner "meaningful to lawyers", usually to reflect some sense of stare decisis. For instance, the weights might be determined by the level of the tribunal. [123]

Although "case based" reasoning allegedly possesses an advantage over rule based systems by the elimination of complex semantic networks, [124] it suffers from intractable theoretical obstacles. Apart from the question of choice of a matching algorithm, without some further theory it cannot be predicted what features of a case will turn out to be relevant. Too often, "legally significant parameters" [125] are facts deemed important by the programmers, [126] with no grounding in any articulated theory, even though the utility of such systems depends critically on the set of attributes selected. [127] Both the selection of attributes and the choice of associated weights are therefore highly arbitrary. [128]

On this analysis, case based reasoning constitutes an extension of the probabilistic model rather than a true exemplar model, in which the task of classification is left to the user. The potential of this latter model for building applications is examined in Section 6.
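A weighted similarity metric of the kind described might, in caricature, look like the following. The features, weights and precedents are invented; this is not the algorithm of SHYSTER or any other cited system.

```python
# Invented illustration of a weighted "similarity metric" over case
# features; not the matching algorithm of any cited system.
WEIGHTS = {"written_contract": 0.2, "payment_made": 0.3,
           "goods_delivered": 0.4, "consumer_buyer": 0.1}

PRECEDENTS = [
    {"name": "Case A (plaintiff won)",
     "facts": {"written_contract": 1, "payment_made": 1,
               "goods_delivered": 0, "consumer_buyer": 1}},
    {"name": "Case B (defendant won)",
     "facts": {"written_contract": 0, "payment_made": 0,
               "goods_delivered": 1, "consumer_buyer": 0}},
]

def similarity(new_facts, precedent_facts):
    """Weighted proportion of features on which the two cases agree."""
    return sum(w for f, w in WEIGHTS.items()
               if new_facts.get(f) == precedent_facts.get(f))

new_case = {"written_contract": 1, "payment_made": 1,
            "goods_delivered": 1, "consumer_buyer": 1}

ranked = sorted(PRECEDENTS,
                key=lambda p: similarity(new_case, p["facts"]), reverse=True)
for p in ranked:
    print(f"{p['name']}: similarity {similarity(new_case, p['facts']):.2f}")
# Which precedent is "most on-point" depends entirely on the chosen
# features and weights -- the arbitrariness criticised in the text.
```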
4: PHILOSOPHICAL IMPLICATIONS OF THE DEDUCTIVE INFERENCE ENGINE

The previous Sections have demonstrated that the task of knowledge representation was to provide a symbolic representation of knowledge in a form amenable to the deductive inference engine. The primary reference point was logic programming, that is, formalisation of the law into a set of Horn clauses.

It was also argued that techniques such as "fuzzy logic", "semantic networks" and "case based reasoning" are no more than elaborations of logic programming, [129] and not, as some would argue, "second generation" systems going beyond deductive inference. [130]

This Section carries this analysis beyond the practical and into the philosophical. It has been thought inescapable that a legal expert system which attempts to emulate the reasoning processes of a lawyer must embody theories of law that must in turn rest on more basic philosophical assumptions. [131] Building a legal expert system is thus described as not being just an exercise in computer programming, but as requiring "solid and articulated" jurisprudential foundations. [132] Researchers in this field appear, however, to have discounted or ignored the value of close analysis of the field's theoretical assumptions. This Section demonstrates how, to avoid theoretical obstacles, the nature of law, its epistemological basis, and the task of jurisprudence have all been subjected to unacceptable distortion.

4.1 Isomorphic Representation or Distortion?

The activity of legal knowledge representation is said to involve the operation of interpretative processes whereby the formal sources of part of a legal system are scrutinised and analysed, so as to be both faithful in meaning to the original source materials, and in a form which is computer encodeable. This principle is termed "isomorphism".

Doubts have been raised as to whether it is possible to meet both objectives. It is said that the knowledge engineer must desist from imposing her own interpretations, lest she be universally condemned for misrepresenting the law. [133] Yet it is difficult to reconcile Susskind's claim of isomorphism with his admission that the process is in fact one of complete "reformulation" or "rational reconstruction". [134]

Although isomorphism requires the formalisation of rules to be sufficiently expressive to capture their original meaning, [135] Levesque and Brachman have demonstrated that there is a significant trade-off between the expressiveness of a system of knowledge representation and its computational tractability. [136] Susskind admits that it is not possible, without "extensive modification and inconvenience", to accommodate legal knowledge within the restrictive frameworks offered by currently available computer programming environments. [137]

Moles provides a typical example of the modelling of British coal insurance claims. [138] This involved taking statutes and cases and then "translating" them into six different structures using three separate applications into the target representation language. After being "translated, cut up into bits, precised, further analysed into [frames], which are then stored in another structure", he suggests it would be a "miracle" if they were "isomorphic" to the original texts. [139] It would appear that terms such as "isomorphism" may be no more than "syntactic sugar" [140] used to "sweeten" the acceptability of what must be a distorting process.

4.2 Law as a System of Easily Interpreted Rules

Statutory interpretation has been predominantly characterised as involving a literal interpretation, [141] particularly in tax law, [142] allegedly idiosyncratic in being construed both literally and strictly. [143] In case law, legal sources cannot be as easily "formalised" or "normalised", [144] but must to some degree be interpreted. However, Susskind takes a dangerous step in suggesting that the task of the knowledge engineer is to "sift" the authoritative ratio decidendi from the text, eliminating obiter dicta and other "extraneous" [145] material. Moreover, he argues that this can be easily extracted, not by a thorough examination of the case, but by reading the headnote alone. [146] Representations of cases in knowledge bases are typically compressed into a "headnote" style, including citation, court, date, facts, and holdings. [147] Cost may also be a factor behind this characterisation of law. Susskind suggests that knowledge engineers need only avail themselves of the services of the legal expert to "tune" the knowledge base. [148]

This view, however, that the law is a formal rule-governed process, ignores a great deal of learning stretching back over a century - and more recently in the form of critical legal studies - arguing that the law is far from determinate. [149] The law is at least an "elastic" phenomenon in which students have traditionally been taught and encouraged to "flip" legal argument. [150] The conception of legal decision-making as a formal rule-governed process has been eroded by a judicial move towards "realist" scepticism of rigid rule structures. For example, members of the Australian High Court have indicated a rejection of formalism and adoption of a more active assessment of legal principles with respect to justice, fairness, and practical efficacy. [151]

Advocates of case based reasoning attempt to accommodate realist criticism by suggesting that fact patterns can explain legal decisions independently of any "surface discourse" of law. [152] The critical assumption is that judges decide even hard cases in a rule based manner. Levi [153] supports this view in arguing that legal reasoning, while not being purely a system of applying the law mechanically to facts, does embody rules obtained by analysing the similarities and differences in decided cases. Such researchers argue analogously to theorists, such as Goodhart, who suggest examination ought to be focussed on the facts treated as material and immaterial by the court. [154]

Stone, however, argues that there is a critical distinction between the ratio which explains the decision and the one which binds future courts. More often than not, the critical facts are those treated as material by the later court, and even if they are identifiable, they can be stated at multiple levels of generality. [155]

Other legal systems, particularly in some parts of Europe, may be more suited to this characterisation of law. For example, in the Scandinavian legal system, one overall guiding principle is the prohibition of decisions which are non liquet, [156] which is considered a serious fault. [157]

4.3 Change

A severe impediment to the routine use of knowledge based technology for practical legal applications lies in the unresolved problems associated with the "maintenance" of such systems, that is, how to continually update the system with primary sources. [158] One group describe how, after they had exhaustively studied over 1,000 cases under the Canadian Unemployment Insurance Act, the Act was amended in 1990, rendering their work irrelevant. [159] Most approaches are inadequate, either for expressly assuming a constant state of the law, or for avoiding primary sources and instead modelling directly the heuristics of the expert. [160]

The logic programming approach of the Imperial College group, whereby the expert system is formalised to correspond to individual sections of a statute, is argued to be easily modifiable. However, a fully functioning expert system requires a layer of pertinent heuristic knowledge to avoid a "layman's reading" of an Act. [161] Once the formalisation is structured, explained and augmented in this way, modifying the system is no longer straightforward. Schlobohm suggests that, as a result, human experts would have to modify the heuristic rules whenever the law changes, and the entire system containing the new rules would then have to be debugged. [162] Similarly, use of modular approaches such as the Chomexpert system has proved inadequate. [163] It is difficult to encode statutory rules at even the most basic level without making inappropriate commitments as to how they will be interpreted in future. [164]

4.4 Epistemology of Law

Susskind suggests that law is not an abstract system of concepts and entities distinct from the "marks on paper" that are the material symbols of it. [165] The difference between legal expert systems and scientific systems such as PROSPECTOR and MYCIN lies, according to Susskind, in that scientific laws are to be "discovered" in the empirical world in general, while legal rules can be extracted, as an acontextual linguistic exercise, from scrutiny of formal legal sources.

Under this analysis, knowledge engineers need go no further than the written text; hence Susskind argues that the "bottleneck" of knowledge acquisition is effectively dissolved. This is a dangerously narrow epistemology to adopt, [166] since researchers in this area do not sufficiently distinguish between the writing and the meaning of the writing. [167] In Section 5 it is argued that meaning can only be found in a social context.

4.5 The Nature of Jurisprudence

Although the foregoing suggests that many theories in jurisprudence conflict significantly with important assumptions of expert systems technology, many of these fundamental theoretical difficulties have been downplayed or eliminated. When faced with theories which imply, for instance, that there is no future for expert systems, some researchers have expressly rejected the usefulness of jurisprudence. [168] Critical legal theory is therefore characterised as "unacceptable". [169] Even if jurisprudence is wholly ignored by knowledge engineers, they suggest that the only risk is that the systems they design might be of some "inferior quality". [170]

Others "rationally reconstruct" jurisprudence into an acceptable form. For instance, Susskind asserts that the activity of any "legal science" is to impose order over unstructured and complex law by recasting it into a body of structured, interconnected, coherent, and simple rules. [171] Smith and Deedman go further and argue that the task is to transform apparent indeterminacy into a completely rule-governed structure. [172]

The same can be said for the portrayal of the epistemology of jurisprudence. Just as "complex" law is recast into "simple" rules, Susskind's task was to take "confused and perplexed" jurisprudence and obtain "consensus" over relevant issues. What Susskind does to find "consensus" in legal theory is, allegedly, to sample the literature statistically. [173] However, the "sample" was limited firstly to works of analytical jurisprudence, and secondly to British writings from the mid-1950s. [174] Perhaps unsurprisingly, the influence of H.L.A. Hart's concept of law as a system of rules was overwhelming. As an adjunct, Susskind further asserted that to be "jurisprudentially impartial", that is, to embody no "contentious" theory of law, an expert system must reason only with rules. Any facility for reasoning with non-rule standards [175] was rejected out of hand. A significant internal inconsistency emerges when it is appreciated that Susskind believed it sufficient justification for the use of rules that this "consensus" identifies legal rules as necessary but insufficient for legal reasoning. [176]

Perhaps the clue to why these works were chosen lies in the fact that they constituted "the source materials with greatest potential given the overall purpose of the project". Susskind notes that his work was intended to "eliminate much of the future need for extensive scrutiny of non-computationally oriented contemporary legal theory". [177] Here the inference engine is most clearly "driving" jurisprudence.

4.6 Jurisprudence Turned on its Head

Niblett claimed that "a successful expert system is likely to contribute more to jurisprudence than the other way around". [178] If the suggestions of researchers such as Susskind are taken seriously, they turn jurisprudence on its head. Theory is not used as a basis for practice; instead, implementability in technology is used as the touchstone for accepting the truth or falsity of the theory. A particular feature of artificial intelligence literature is that its rigour lies not in experimental corroboration, or any theoretical soundness, but in implementability. [179] Hofstader [180] suggests that so long as the artificial intelligence researcher takes care to construct theories which can be written down as a sequence of algorithmic or computational steps, these theories can be implemented, thereby "confirming" the theories underlying the process. Implementability per se leads to a self-perpetuating methodology: since an artificial intelligence researcher constructs theories using the concepts of computational theory, those theories are necessarily implementable.

Legal expert systems researchers fall into this model by rejecting "unacceptable" legal theories and reformulating the remainder in computational terms, to eliminate potential obstacles to the prosperity of their research programme. This abandonment of serious inquiry into jurisprudence by researchers into legal expert systems may give credence to Kowalski's fears that the field may have cut itself off as a specialist discipline and established its parameters prematurely. [181] Brown notes that at a 1991 conference, few if any papers questioned the basic assumptions of the field. [182]

Niblett claimed that "a successful expert system is likely to contribute more to jurisprudence than the other way around". [183] The foregoing demonstrates that these words ring true. Law and jurisprudence, to form an acceptable basis for expert systems research, have been reformulated in computational terms, to eliminate philosophical "technicalities". [184]

4.7 Failure to Recognise Limitations

Leith has argued for a rejection of legal expert systems on the basis that they simplify the law to such an unacceptable extent that they have little or no value in legal analysis. [185] Yet while some engineers of legal expert systems may be fully aware of the limitations already discussed, it is not necessarily the case that other researchers, and more importantly, the users of these programs, will also be so mindful. This article agrees with Leith's implied suggestion that many accounts of work in this area refuse to acknowledge that there are significant limitations. For example, McCarty felt able to say:

"[Law] seems to be an ideal candidate for an artificial intelligence approach: the "facts" would be represented in a lower level semantic network, perhaps; the "law" would be represented in a higher level semantic description; and the process of legal analysis would be represented in a pattern-matching routine." [186]

Susskind has, however, admitted that expert systems might not be amenable to corporate, commercial and tax law, but would be apposite only to limited instances such as, for example, the Scottish law relating to liability for damage caused by animals. [187] Such limitations, often given little attention, should be made clear, and "plausibility tricks" avoided. [188] There is a very real danger that users will significantly overestimate the value of the analysis they obtain from such a program, particularly in light of the wealth of optimistic literature and when it is described as "expert".

5: LEGAL REASONING AS AN INTRACTABLY COMPLEX SYSTEM

The previous Section demonstrated how law and jurisprudence have been unacceptably distorted to be amenable to expert systems technology. Moles suggested that researchers have deliberately ignored fundamental problems since they were committed to the use of a "particular computing tool", and not to the understanding of law. [189] This article identifies this tool as the inference engine itself. The following Section addresses the non-computational nature of legal inference.

5.1 Search for a Deep Model

There has been a growing trend in legal expert systems to speak of "deep knowledge" or "conceptual knowledge" as something distinct from and preferable to "shallow" knowledge. [190] McCarty calls for the development in law of "deep" systems akin to CASNET, [191] in which the disease is represented as a dynamic process. [192] The depth of a system has been described as the extent to which programs contain not only rules for mapping conclusions onto input scenarios, but also a representation of the underlying causes. [193] The Imperial College Group suggest that deep structure in legislation is the isomorphism to that legislation, on the basis that each Horn clause represents some clause in the legislation. [194] In addition, case based reasoning has been described as employing a "deep structure". [195] The advantage stemming from both of these descriptions is that they cast deep structure into computational terms. [196] This is another example of technology driving the underlying theory.

E Law: LEGAL EXPERT SYSTEMS - A HUMANISTIC CRITIQUE ... http://www.murdoch.edu.au/elaw/issues/v1n4/greinke14.html

13 of 37 19.12.08 11:59

Page 14: e Law_ Legal Expert Systems_no Way..

On the other hand, McCarty argues that resolution of the difficulties of open texture are related to asense of "conceptual coherence". [197] In addition, while theoretical approaches are emerging to copewith problems of legal change, [198] a unifying theme is a striving for an undefined "normativeenrichment". [199] This Section argues that deep structure is to be found in social context and purpose,which are non-computational. 5.2 Interpretation in a Social Context of Shared Understanding Law is not, as legal expert systems would portray it, self-contained and autonomous, [200] but in fact isembedded in social and political context. That legal concepts draw upon ordinary human experience is precisely what makes them so difficult for an artificial intelligence system. [201] Whenever humanbehaviour is analysed in terms of rules, it must always contain a ceteris paribus condition; in essencereferring to the background of shared social practices, interests and feelings. Even if we acceptSusskind's characterisation of law's ontology as going no further than the "marks on paper", semanticproblems will still arise since these marks are not created in a vacuum, but are the result of purposive social interaction, and must be so interpreted. [202] Using one recognised example, the injunction: DOGS MUST BE CARRIED ON THE ESCALATOR can only be interpreted based on the understandings, for instance, that a dog's small feet may becometrapped in slots and moving parts; that humans generally feel some concern for dogs, and therefore do not wish to see them "mangled". Thus an adequate interpretation of any rule requires that we locate it ina complex body of assumptions. [203] Minsky noted that intelligent behaviour presupposes a background of cultural practices and institutionswhich must be modelled if computer representations are to have any meaning. [204] Wittgenstein'sarguments that the meaning of language must be based in social use and a community of users are worthrereading in the light of Searle's Chinese room analogy. [205] How can the computer have this sort ofdirect access to language? [206] Kowalski and Sergot admit that a computer must operate by "blind" and"mechanical" application of its internal rules. [207] 5.3 Open Texture as an Intractable Problem If legal reasoning was really some "pointing" [208] or "cataloguing" [209] procedure, Hart's suggestionthat the task of legal institutions is to approach greater refinement in definition by adjudicating [210] onparticular cases, may be attractive. [211] Open texture may then be marginalised by a progressiverefinement of categories; in computational terms, weaving a more elaborate semantic net. To modelsocial context in a knowledge base, however, may be an impossible task. Popper demonstrates that context entirely depends on point of view. [212] Harris suggests that any viewof the law must be a phenomenological one which takes account of shifting foci of interest. [213] Thedifficulty is that a great deal of social context will not be "conscious" and expressible, but will constitute a hidden set of assumptions on which human decisions will be based. It is impossible to focus attentiononto elements of that context without creating a new subconscious context. Polyani describes this as thedifference between focal and subsidiary awareness. 
As Berry [215] demonstrates, if people learn to perform tasks so that important aspects of their knowledge are implicit in nature, then knowledge engineers will be unable to extract this knowledge and represent it in a meaningful way in an expert system. [216] Husserl, for instance, discovered that construction of even simple "frames" involved coping with an ever expanding "outer horizon" of knowledge. He sadly concluded at the age of 75 that he was a "perpetual beginner" engaged in an "infinite task". [217] This is a fundamental difficulty with artificial intelligence in all its applications. [218]


"infinite task". [217] This is a fundamental difficulty with artificial intelligence in all its applications.[218] 5.4 Purposive Interpretation and Intention Hempel [219] argued that ad hoc modifications to a theory were limited by the increased complexity ofthe theory and that, after a certain threshold level of complexity was exceeded, scientists would naturallyand logically pursue simpler alternative theories. [220] Here we may learn from science. Certainphysical and chemical systems have been discovered that display uncanny qualities of co-operation, ororganise themselves spontaneously and unpredictably into complex forms. These systems are still subjectto physical laws, but laws that permit a more flexible and innovative type of behaviour than the oldmechanistic view of nature ever suggested. The lesson from chaos theory is that seemingly complexsystems can be defined in terms of simple but not mathematically tractable models. [221] Legal reasoning is not mechanical. [222] Social context and shared understandings can be dealt with interms of the simple, elegant, but non-computational model of purposive interpretation. Searle's Chineseroom analogy identifies intentionality as the benchmark of the mental, and refutes claims that intentionalmental predicates, such as meaning, understanding, planning, and inferring, can be attributed to amathematical computational system. [223] Susskind prefers to avoid purposive theories, [224] since such theories imply that law is not simply aquestion of linguistic pattern matching but instead involves examination of social practices and humanintentionality. [225] Similarly, case based reasoning is seen as a way around having to tackle "full blown"statutory interpretation involving legislative intent. [226] Law is a practical enterprise, concerned to guide, influence or control the actions of citizens. Since anyaction is purposive, any philosophy of action must be a philosophy of purposes. [227] When a courtapplies, say, the statutory term of our example, "vehicle", to a particular contraption, the meaning of"vehicle" is found in an analysis not only of the purpose of the law, but of the purpose for which thevehicle was to be used. [228] For example, Fuller responded to Hart in these terms: "What would Professor Hart say if some local patriots wanted to mount on a pedestal in the park a truckused in World War 2, while other citizens, regarding the proposed memorial as an eye-sore, support theirstand by the "no vehicle" rule? Does this truck, in perfect working order, fall within the core or thepenumbra?" [229] One could make similar arguments when a "NO DOGS ALLOWED" sign confronts a seeing eye dog, orone that is stuffed or anaesthetised. [230] It is difficult to reconcile Hart's acontextual approach to legalinterpretation with his own view of actors within the legal system holding an internal normative view ofthe rules. [231] Following a rule equals "obeying the law" only where a purposive personal commitmenthas been made to a rule structure. [232] Applying modern literary and linguistic theory to the law, [233] some suggest that no text has meaningwithout the active participation of the reader, [234] and an "interpretive community" of which the reader is a part. [235] The use of figurative language, imagery and metaphor is integral to legal discourse. [236] Ideological symbolism is inescapable. [237] What counts as the relevant facts depends entirely oncontext, [238] and cannot be determined by programmers ex ante. 
Language is the very condition of intention, and intention is the vehicle of meaning. [240]

5.5 Humanistic Implications

The implication of the foregoing is that the law cannot be amenable to a legal expert system, as this involves denying social context, purpose, and essentially humanity. A humanistic critique would argue that if expert systems have any degree of success in modelling "the law", the result would be "profoundly humiliating". [241] Weizenbaum stated that if artificial intelligence fulfils its promises then this implies that man is merely a machine. [242] In similar vein, the success of legal expert systems might imply that the law itself is a machine, and that lawyers, perhaps even judges, can be replaced by computers.


"profoundly humiliating". [241] Weizenbaum stated that if artificial intelligence fulfils its promises thenthis implies that man is merely a machine. [242] In similar vein, the success of legal expert systems mightimply that the law itself is a machine, and that lawyers, perhaps even judges, can be replaced bycomputers. 6: THE WAY FORWARD The preceding sections have demonstrated that the use of a deductive inference mechanism, and theconsequent need for knowledge represented to be amenable to such an engine, will lead to unacceptabledistortion of both the law, its philosophical underpinnings, and its humanity. How are lawyers theneffectively to utilise the information technology resource? This article adopts the basic message inTapper's insightful 1963 piece. [243] The range of activities to which computers ought be used must belimited to activities which can be reproduced by the machines. Tapper tentatively describes thedistinction as one between "mechanical" and "creative" tasks. [244] If the argument of this article is accepted, the way forward involves relocation of the inference engine from the computer to the humanuser. This section explores possibilities for "decision support systems", which presents material to theuser on which she alone performs the specifically legal reasoning. [245] 6.1 Decision Support Systems Recall that in the exemplar model, the user is presented with prototypical instances, or "mental images". This approach differs from the other models of classification in that it is primarily designed to leave thetask of classification to the user of the system. The reason why the user, rather than the machine, ought perform the legal inference is that legal reasoning is non-computational, as Section 5 has demonstrated. Despite growing recognition that research perhaps ought to be oriented towards "decision supportsystems", such systems have been designed to first reason with the legal data and then present such reasoning to the user to support her conclusion. [246] This approach is hazardous since it maypredetermine the human conclusion to a large degree. [247] To dispute the computer inference the userwould require knowledge of the area of law to a degree where the computer would not need to be havebeen consulted in the first instance. [248] Decision support systems, then, differ significantly from expert systems in that the heart of the problem -the inference engine - is relocated in the user of the system. Computers should then be utilised formechanical and time-consuming tasks for which they are best suited. In particular, this Section suggeststhree significant uses: - structured legal information retrieval; - calculation based on strategies; and -litigation support and "legal econometric" systems. 6.2 Legal Information Retrieval Firstly, searching for primary and secondary legal sources is a costly and to a significant degree amechanical exercise. Efficient retrieval of legal information is vitally important. Tapper suggested in1963 that lack of resources to those operating outside provincial centres, and concentration of materialswithin large organisations was productive of injustice in favour of powerful sections of the community.[249] Modern statute and case databases have gone some way to addressing this problem. Generally, systems such as LEXIS, SCALE, and INFO1 use Boolean keyword search routines, [250] butthese have obvious disadvantages. 
Some limited advances have been made with, for instance, "Hypertext" cross-referencing, [252] and "probabilistic" elaboration of keyword searches [253] (the latter is illustrated in the sketch at the end of this passage). It has long been assumed that retrieval based on the meaning and content of documents, and indexed in terms of legal concepts, [254] would be far more appropriate. [255] A variety of techniques have emerged for indexing, including use of discrimination trees, [256] and explanation-based generalisation. [257] Research is progressing towards a "hybrid" approach of linking case databases with statutory material and legal texts. [258] Hafner [259] has constructed a database on United States negotiable instruments law designed to retrieve cases based on typical problems which arise in legal disputes.
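In the simplest terms, the "probabilistic" elaboration referred to at note [253] replaces the all-or-nothing Boolean test with a ranking by weighted term overlap. The following sketch, with an inverse document frequency weighting and invented texts, is an assumption made for illustration rather than a description of the cited systems:

# A sketch of "probabilistic" elaboration of keyword search: documents are
# ranked by a weighted term-overlap score rather than strictly matched.
# The weighting and the texts are illustrative assumptions only.
import math

DOCS = {
    "case_a": "the defendant drove a motor vehicle through the park",
    "case_b": "an automobile described as a vehicle was driven across parkland",
    "case_c": "a dispute about a lease of park land",
}

def idf(term, docs):
    """Rarer terms carry more weight (inverse document frequency)."""
    containing = sum(1 for text in docs.values() if term in text.split())
    return math.log((1 + len(docs)) / (1 + containing))

def rank(query_terms, docs):
    """Score each document by the summed weight of the query terms it contains."""
    scores = {name: sum(idf(t, docs) for t in query_terms if t in text.split())
              for name, text in docs.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank(["vehicle", "park"], DOCS))
# Documents are ranked rather than merely included or excluded, so partial
# matches are still retrieved - though the system still has no grasp of meaning.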


McCarty has suggested that it would be more fruitful to look at legal argument than to develop a theory of correct legal decisions. [260] Similarly, Bench-Capon and Sergot suggest that open texture should be handled by giving the user arguments for and against in borderline cases. If so, a computer system will be concerned, not with the production of a conclusion, but rather with presenting the arguments on which the user may base her own conclusions. [261]

On this basis, Ashley and Rissland have designed a system called HYPO, [262] which emerged from Rissland's earlier work on reasoning by examples. [263] It does not use an inference engine for legal analysis but instead aims for conceptual retrieval based on the structure of legal argument. [264] The system's inference engine is used for certain statistical processes that decide which primary materials to retrieve. The cases relevant to the issues identified by the user are retrieved and arranged in terms of argument for and against a decision in a new case. [265] The system is further supplemented by a set of "hypothetical" cases. [266] The actual legal inference on the basis of the material retrieved is left to the user of the system, which distinguishes HYPO and similar text retrieval systems from many of the case based reasoning systems discussed earlier.

6.3 Calculations and Planning

A second application would be to utilise the mathematical functions of computer systems. The computer performs inference, but essentially calculates outcomes based on strategies already formulated by an expert who has himself interpreted the legal materials. Michaelson's TAXADVISOR [267] is one example. It calculates tax planning strategies for large estates, based on strategies obtained from lawyers experienced in tax advice. There is little more legal inference in this than in calculating a share portfolio to maximise return based on a broker's personal model. Similarly, systems have been suggested which will assist in financial planning, for instance by forecasting retirement pensions. [268]

6.4 Litigation Support and Jurimetrics

Finally, a decision support system may more clearly focus on litigation strategies. These may be developed with the assistance of expertise, or by techniques such as hypothesis and experiment. [269] Such systems do in fact make inferences, but these are not inferences of law, but inferences based on strategies already defined by expertise or essentially what amounts to empirical research. In that sense, the inference procedures are extensions to the calculation and planning examples.

One example is the LDS [270] system, implemented in ROSIE [271] by Waterman and Peterson. It advises on whether to settle product liability cases, and an advisable amount, based on factors such as the abilities of the lawyers, characteristics of the parties, timing of the claim, type of loss suffered, and the probability of establishing liability. The primary goal of LDS was not to model the law per se but rather the actual decision making processes of lawyers and claims adjusters in product liability litigation. Another example is SAL, [272] intended to advise on an appropriate sum to settle asbestos injury claims. In such systems, the computer is modelling non-legal factors which may influence the outcome of a case, in order to assist the lawyer in deciding her ultimate strategy.
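For illustration only, a calculation of this kind might look as follows. The factors, weights and formula are invented for the sketch and are not the actual ROSIE rule base of LDS or SAL:

# An illustrative settlement-advice calculation in the spirit of the systems
# described above. The factors, weights and formula are invented assumptions.
def advised_settlement(base_damages, liability_probability,
                       plaintiff_counsel_skill, defence_counsel_skill):
    """Scale expected damages by liability risk and a crude adjustment for
    the relative strength of counsel (skill inputs between 0 and 1)."""
    skill_adjustment = 1.0 + 0.2 * (plaintiff_counsel_skill - defence_counsel_skill)
    return base_damages * liability_probability * skill_adjustment

# Example: $100,000 expected damages, 60% chance of establishing liability,
# evenly matched counsel.
print(advised_settlement(100_000, 0.6, 0.5, 0.5))   # 60000.0

The inference performed here is arithmetic over a strategy supplied by human experts; no specifically legal inference is delegated to the machine.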
In Australia the Government Insurance Office has developed COLOSSUS, a sophisticated system to detect possibly fraudulent personal injury claims and tag them for investigation by its officers [273] (a toy sketch of this kind of flagging appears below). Similar systems have also been suggested as aids not only in litigation, but also in dispute resolution strategies. [274] The information contained in such systems may, as an adjunct, constitute an important resource for sociological study, such as Bain's modelling of subjective decisions of judges on particular varieties of crime in the United States. [275] In this case, the expert system constitutes jurimetrics, a legal version of econometrics.
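A minimal sketch of such claim-flagging follows; the indicators, weights and threshold are invented for illustration, and the actual COLOSSUS criteria are not described in the source cited:

# A toy claim-flagging routine of the general kind described above.
# The indicators and threshold are invented assumptions.
SUSPICION_INDICATORS = {
    "claim_lodged_soon_after_policy_taken_out": 2,
    "no_independent_witnesses": 1,
    "prior_claims_history": 2,
    "injury_inconsistent_with_incident_report": 3,
}

def flag_for_investigation(claim_features, threshold=4):
    """Sum the weights of indicators present in the claim and flag it
    for a human officer if the total reaches the threshold."""
    score = sum(weight for feature, weight in SUSPICION_INDICATORS.items()
                if feature in claim_features)
    return score >= threshold

print(flag_for_investigation({"no_independent_witnesses",
                              "injury_inconsistent_with_incident_report"}))  # True

The decision whether to pursue the claim remains with the investigating officer, which is what keeps such a system within the decision support category argued for here.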


7: SUMMARY AND CONCLUSIONS

Computerisation of the legal office will continue, but the message from this article is that researchers must be acutely aware of the philosophical underpinnings of their work. In particular, the usefulness of legal expert systems is severely questioned. Use of such systems has involved an unacceptable level of distortion both of the nature of law and of jurisprudence. This is not a case of "carbon", [276] "biological", [277] or even "neural" [278] chauvinism, but a demonstration that expert systems technology has made a poor choice of domain in law. Blame has been laid for such distortion on the core of the expert system: the pattern matching inference engine. Legal inference, on the other hand, relies on purpose and social context, implying that computational models of sufficient richness are not tractable.

This article suggests that, given current limitations of computer technology, the quest for an artificially intelligent legal adviser is misguided. In the future, however, these limitations may be overcome. For example, work being undertaken in parallel distributed processing is producing significant results with respect to low level "intelligent" processes, including perception, language, and motor control. This is based on the assumption that intelligence emerges from interactions of large numbers of simple processing units, and represents a significant break away from increasingly complex rule-based structures. [279] While this article cannot address such possibilities within future technology, it is suggested that the basic pattern matching/rule-governed principles will limit computers for some time. It is therefore suggested that researchers instead investigate decision support systems as a more useful alternative. Relocation of the inference engine will mean that knowledge representation will no longer need to be amenable to computational inference, but to human inference. The computer's inference engine should instead be used for searching procedures and computation. Some possibilities have been noted.

Ardent advocates such as Tyree suggest that despite their difficulties, legal expert systems are a cost-effective second-best solution. The choice portrayed is not between human advice and machine advice but, in an era of high costs of justice, between machine advice and no advice at all. [280] While economic factors are important, [281] humanistic factors must not be forgotten. Law plays an important role in modern civilisation. It must maintain a close relationship with the social and political forces shaping society, and not merely regress into a "technology", a tool to be used by competing social forces. [282]

ENDNOTES

* J Searle, "Minds, Brains and Programs" (1980) 3 Behavioural and Brain Sciences 417.

1 For examples see NJ Bellord, Computers for Lawyers (Sinclair Browne: London, 1983); and T Ruoff, The Solicitor and the Automated Office (Sweet & Maxwell: London, 1984).

2 q.v. J Bing (ed.), Handbook of Legal Information Retrieval (North-Holland: Amsterdam, 1984).

3 See Section VI, infra.

4 Inferring molecular structure from mass spectroscopy data; q.v. RK Lindsay, BG Buchanan, & J Lederberg, Applications of Artificial Intelligence for Chemical Inference: The DENDRAL Project (McGraw-Hill: New York, 1980).

5 Advising on the location of ore deposits given geological data; q.v. RO Duda & R Reboh, "AI and Decision Making: The PROSPECTOR Experience" in W Reitman, Artificial Intelligence Applications for Business (Ablex Publishing: Norwood, 1984).
6 Providing consultative advice on diagnosis and antibiotic therapy for infectious diseases; q.v. BG

Buchanan & EH Shortcliffe, Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project (Addison-Wesley: Reading, 1984). 7 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.11. 8 Ibidem p.15. 9 L Loevinger, "Jurimetrics: The Next Step Forward" (1949) Minnesota Law Review 33. 10 L Mehl, "Automation in the Legal World: From the Machine Processing of Legal Information to the'Law Machine'" in Mechanisation of Thought Processes (HMSO: London, 1958) p.755. 11 RW Morrison, "Market Realities of Rule-Based Software for Lawyers: Where the Rubber Meets theRoad" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 33 at p.35. 12 c.f. RA Clarke, Knowledge-Based Expert Systems (Working paper: Department of Commerce,Australian National University, 1988) p.6. 13 Primarily RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford,1987). 14 Ibidem p.44; emphasis added. 15 MJ Sergot, "The Representation of Law in Computer Programs", Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press: London, 1991) at p.4. 16 P Harmon & D King, Expert Systems: Artificial Intelligence in Business (John Wiley & Sons: NewYork, 1985) at p.5. 17 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M Yazdani (ed.), ArtificialIntelligence: Principles and Applications (Chapman & Hall: London, 1986) pp.186-187. 18 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M Yazdani, Artificial Intelligence:Principles and Applications (Chapman and Hall: London, 1986) p.194. 19 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.46. 20 Ibidem p.47. 21 F Hayes-Roth, DA Waterman & DB Lenat Building Expert Systems (Addison-Wesley: London,1983) p.4. 22 e.g. The Latent Damage Adviser; q.v. PN Capper & RE Susskind, Latent Damage Law - The ExpertSystem (Butterworths: London, 1988). 23 J Vaux, "AI and Philosophy: Recreating Naive Epistemology" Chapter Seven in KS Gill (ed.),Artificial Intelligence for Society (John Wiley & Sons: London, 1986) p.76. 24 T Winograd & F Flores, Understanding Computers and Cognition: A New Foundation for Design

(Ablex: Norwood, 1986). 25 HL Dreyfus & SE Dreyfus, Mind over Machine (Basil Blackwell: Oxford, 1986). 26 A Hart & DC Berry, "Expert Systems in Perspective" in DC Berry & A Hart (eds) Expert Systems:Human Issues (MIT: Cambridge, 1990) p.11. 27 e.g. J Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (WHFreeman & Co: San Francisco, 1976). 28 R Hellawell, "A Computer Program for Legal Planning and Analysis: Taxation of StockRedemptions" (1980) 80 Columbia Law Review 1363. See also NJ Bellord, "Tax Planning by Computer"in B Niblett (ed.), Computer Science and Law (Cambridge University Press: New York, 1980) p.173. 29 R Hellawell, "CHOOSE: A Computer Program for Legal Planning and Analysis" (1981) 19Columbia Journal of Transnational Law 339. 30 R Hellawell, "SEARCH: A Computer Program for Legal Problem Solving" (1982) 15 Akron LawReview 635. 31 P Jackson, H Reichgelt & Fv Harmelen, Logic-Based Knowledge Representation (MIT: Cambridge,1989). 32 Implemented in QUINTUS PROLOG; q.v. NH Minsky & D Rozenshtein, "System = Program +Users + Law" (1987) Proceedings First International Conference on Artificial Intelligence and Law 170. 33 Symbolic logic has had a profound influence in the artificial intelligence field; for a description see ICopi, Symbolic Logic (Macmillan: New York, 1973). 34 MJ Sergot, "A Brief Introduction to Logic Programming and Its Applications in Law" Chapter Fivein C Walter (ed.) , Computer Power and Legal Language (Quorum: London, 1988) at pp.25-27. 35 C Mellish, "Logic Programming and Expert Systems" Chapter Nineteen in KS Gill (ed.), ArtificialIntelligence for Society (John Wiley & Sons: London, 1986) at p.211. 36 F Hayes-Roth, DA Waterman & DB Lenat, Building Expert Systems (Addison-Wesley: London,1983) at p.66. 37 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.208. 38 R Forsyth, "The Anatomy of Expert Systems" Chapter Eight in M Yazdani, Artificial Intelligence:Principles and Applications (Chapman and Hall: 1986) p.191. 39 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)pp.209-210. 40 RI Levine, DE Drang & B Edelson, Artificial Intelligence and Expert Systems (McGraw-Hill: 1990)Chapter Six, particularly at pp.62-65. 41 e.g. AW Koers & D Kracht, "A Goal Driven Knowledge Based System for a Domain of PrivateInternational Law" (1991) Proceedings Third International Conference on Artificial Intelligence and Law

81. 42 q.v. RA Kowalski, "The Treatment of Negation in Logic Programs for Representing Legislation"(1989) Proceedings Second International Conference on Artificial Intelligence and Law 11; P Asirelli, MDe Santis & M Martelli, "Integrity Constraints in Logic Databases" (1985) 2 Journal of LogicProgramming 221; K Eshghi & RA Kowalski, "Abduction Compared with Negation by Failure" (1989)Proceedings of the Sixth International Logic Programming Conference; and JW Lloyd, EA Sonenbergand RW Topot, "Integrity Constraint Checking in Stratified Databases" (1986) 4 Journal of LogicProgramming 331. 43 TJM Bench-Capon, "Representating Counterfactual Conditionals" (1989) Proceedings ArtificialIntelligence and the Simulation of Behvaiour 51. 44 "Augmented Prolog Expert System"; q.v. MJ Sergot, "A Brief Introduction to Logic Programmingand Its Applications in Law" Chapter Five in C Walter (ed.), Computer Power and Legal Language (Quorum: London, 1988) at pp.34-35. 45 P Hammond & MJ Sergot, "A PROLOG Shell for Logic Based Expert Systems" (1983) 3Proceedings British Computer Society Expert Systems Conference. 46 DA Schlobohm, "A PROLOG Program Which Analyses Income Tax Issues under Section 318(a) ofthe Internal Revenue Code" in C Walter (ed.), Computing Power and Legal Reasoning (West Publishing:St Paul, 1985) p.765. 47 q.v. F Haft, RP Jones & T Wetter, "A Natural Language Based Legal Expert System forConsultation and Tutoring - The LEX Project" (1987) Proceedings First International Conference onArtificial Intelligence and the Law 75. 48 C Walter, "Elements of Legal Language" Chapter Three in C Walter (ed.), Computer Power andLegal Language (Quorum: London, 1988). 49 LA Zadeh, "Fuzzy Sets" (1965) 8 Information and Control 338. 50 E Rosch & C Mervis, "Family Resemblances: Studies in the Internal Structure of Categories" (1975)7 Cognitive Psychology 573. 51 M Novakowska, "Fuzzy Concepts: Their Strcuture and Problems of Measurement" in MM Gupta,RK Ragade & RR Yager (eds), Advances in Fuzzy Set Theory and Applications (North-Holland:Amsterdam, 1979) at p.361. 52 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation of Open Texture in Law"Chapter Six in C Walter (ed.), Computer Power and Legal Language (Quorum: London, 1988) at p.49. 53 D Berman & C Walter (ed.), "Toward a Model of Legal Argumentation" Chapter Four in C Walter(ed.), Computer Power and Legal Language (Quorum: London, 1988) at p.22. 54 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.225. 55 CE Alchourrsn & AA Martino, "A Sketch of Logic Without Truth" (1989) Proceedings SecondInternational Conference on Artificial Intelligence and Law 165 at p.166. 56 HLA Hart, "Problems of the Philosophy of the Law" in HLA Hart, Essays in Jurisprudence andPhilosophy (Clarendon Press: Oxford,

1983) p.100; and H Kelsen, "Law and Logic" in H Kelsen, Essays in Legal and Moral Philosophy (Reidel: Dordrecht, 1973) at p.229. 57 LT McCarty, "Permissions and Obligations - an Informal Introduction" (1983) ProceedingsInternational Joint Conference on Artificial Intelligence-83; LT McCarty, "Permissions and Obligations - An Informal Introduction" in AA Martino & NF Socci (eds) Automated Analysis of Legal Texts (North-Holland: Amsterdam, 1986). Fora more developed system on the same principles, see H-N Castaqeda,"The Basic Logic for the Interpretation of Legal Texts" in C Walter (ed.), Computer Power and LegalLanguage (Quorum: London, 1988) at p.167. 58 GHv Wright, "Deontic Logic" (1951) 60 Mind 1. 59 McCarty suggests that it is helpful to think of the set as an "oracle" to be consulted whencontemplating a course of action; see LT McCarty, "Permissions and Obligations - an Informal Introduction" in AA Martino & NF Socci (eds) Automated Analysis of Legal Texts (North-Holland:Amsterdam, 1986). 60 Note LT McCarty, "Permissions and Obligations - A Informal Introduction" in AA Martino & NFSocci (eds) Automated Analysis of Legal Texts (North-Holland: Amsterdam, 1986)Definitions 5-7. 61 LT McCarty, "Clausal Intuitionistic Logic I: Fixed-Point Semantics" (1988) 5 Journal of LogicProgramming 1; LT McCarty, "Clausal Intuitionistic Logic II: Tableau Proof Procedures" (1988) 5 Journal of Logic Programming 93. 62 LT McCarty, "On the Role of Prototypes in Appellate Legal Argument" (1991) Proceedings ThirdInternational Conference on Artificial Intelligence and Law 185 at p.187. 63 AJI Jones, "On the Relationship Between Permission and Obligation" (1987) Proceedings FirstInternational Conference on Artificial Intelligence and Law 164 at pp.166-168. 64 LT McCarty & Rvd Meyden, "Indefinite Reasoning with Definite Rules" (1991) Proceedings of theTwelfth International Joint Conference on Artificial Intelligence. 65 Particularly those of consequential closure; AJI Jones & I Pren, "Ideality, Sub-Ideality and DeonticLogic" (1985) 2 Synthise 65. 66 R Stamper, "The LEGOL-1 Prototype System and Language" (1977) 20 The Computer Journal 102. 67 R Stamper, C Tagg, P Mason, S Cook & J Marks, "Developing the LEGOL Semantic Grammar" inC Ciampi (ed.) Artificial Intelligence and Legal Information Systems (North-Holland: Amsterdam, 1982)p.357. 68 R Stamper, "LEGOL: Modelling Legal Rules by Computer" in B Niblett (ed.), Computer Scienceand Law (Cambridge University Press: New York, 1980) p.45. 69 R Stamper, "A Non-Classical Logic for Law Based on the Structures of Behaviour" in AA Martino& F Socci (eds), Automated Analysis of Legal Texts (North-Holland: Amsterdam, 1986) p.57. 70 S Jones, "Control Structures in Legislation" in B Niblett (ed.), Computer Science and Law

(Cambridge University Press: New York, 1980) p.157. 71 MJ Sergot, Programming Law: LEGOL as a Logic Programming Language (Imperial College:London, 1980). 72 MJ Sergot, "The Representation of Law in Computer Programs", Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press: London, 1991) at p.35. 73 IE Pratt, Epistemology and Artificial Intelligence (PhD dissertation: Princeton, 1987) p.18;emphasis in original. 74 Susskind and Gold. 75 Including Bench-Capon, Cordingley, Forder, Frohlich, Gilbert, Luff, Protman, Sergot, Storrs andTaylor; q.v. RN Moles, "Logic Programming - An Assessment of Its Potential for Artificial IntelligenceApplications in Law" (1991) 2 Journal of Law and Information Science 137 at pp.146-147. 76 RE Susskind, "The Latent Damage System" (1989) Proceedings Second International Conference onArtificial Intelligence and Law 23 at p.29. 77 On causality, note CG de'Bessonet & CR Cross, "Representation of Some Aspects of Causality" in CWalter (ed.) Computing Power and Legal Reasoning (West: St Paul, 1985) pp.205-214. 78 e.g. SR Goldman, MG Dyer & M Flowers, "Precedent-based Legal Reasoning and KnowledgeAcquisition in Contract Law: a Process Model" (1987) Proceedings First International Conference onArtificial Intelligence and Law 210 at pp.214-215 using Hohfeldian analysis of rights; q.v. WN Hohfeld, "Some Fundamental LegalConceptions As Applied in Judicial Reasoning" (1917) 23 Yale Law Journal 16. 79 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.227. 80 P Leith, "Logic, Formal Models and Legal Reasoning" (1984) Jurimetrics Journal 334 atpp.335-336. 81 N MacCormick, Legal Reasoning and Legal Theory (Oxford University Press: Oxford, 1978) atp.37. 82 MJ Sergot, "A Brief Introduction to Logic Programming and Its Applications in Law" Chapter Fivein C Walter (ed.), Computer Power and Legal Language (Quorum: London, 1988) at p.26. 83 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) pp.181-198, particularly at p.188. 84 LE Allen & CS Saxon, "Some Problems in Designing Expert Systems to Aid Legal Reasoning"(1987) Proceedings First International Conference on Artificial Intelligence and Law 94 at p.94. 85 RE Susskind, "The Latent Damage System" (1989) Proceedings Second International Conference onArtificial Intelligence and Law 23 at p.28. 86 Ibidem p.30.

87 AvdL Gardner, "Overview of an AI Approach to Legal Reasoning" in C Walter (ed.),ComputingPower and Legal Reasoning (West: St Paul, 1985) p.247. 88 HLA Hart, "Positivism and the Separation of Law and Morals" (1958) 79 Harvard Law Review 593at p.599. 89 G Gottlieb, The Logic of Choice: An Investigation of the Concepts of Rule and Rationality (Allen &Unwin: London, 1968) p.17. 90 OC Jensen, The Nature of Legal Argument (Basil Blackwell: Oxford, 1957) p.16; A Wilson, "The Nature of Legal Reasoning: A Commentary with Special Reference toProfessor MacCormick's Theory" (1982) 2 Legal Studies 269 at pp.278-280. 91 MJ Detmold, The Unity of Law and Morality: A Refutation of Legal Positivism (Routledge & KeganPaul: London, 1984) p.15; c.f. RE Susskind, "Detmold's Refutation of Positivism and the ComputerJudge" (1986) 49 Modern Law Review 125. 92 HLA Hart, "Positivism and the Separation of Law and Morals" (1958) 79 Harvard Law Review 593at p.607. 93 DB Skalak, "Taking Advantage of Models for Legal Classification" (1989) Proceedings SecondInternational Conference on Artificial Intelligence and Law 234. 94 CD Hafner, "Conceptual Organisation of Case Law Knowledge Bases" (1987) Proceedings FirstInternational Conference on Artificial Intelligence and Law 35 at pp.36-37. 95 Note the diagram attached to RE Susskind, "The Latent Damage System" (1989) ProceedingsSecond International Conference on Artificial Intelligence and Law 23. 96 MJ Sergot, "Representing Legislation as Logic Programs" (1985) 11 Machine Intelligence 209. 97 MJ Sergot, F Sadri, RA Kowalski, F Kriwaczek, P Hammond, & HT Cory, "The British NationalityAct as a Logic Program" (1986) 29 Communications of the ACM 370; and MJ Sergot, HT Cory, PHammond, RA Kowalski, F Kriwaczek, & F Sadri, "Formalisation of the British Nationality Act" in CArnold (ed.), Yearbook of Law, Computers and Technology (Butterworths: London, 1986). 98 TJM Bench-Capon, GO Robinson, TW Routen & MJ Sergot, "Logic Programming for Large ScaleApplications in Law: A Formalisation of Supplementary Benefit Legislation" (1987) Proceedings First International Conference on Artificial Intelligence and Law 190. 99 q.v. P Johnson & D Mead, "Legislative Knowledge Base Systems for Public Administration: SomePractical Issues" (1991) Proceedings Third International Conference on Artificial Intelligence and Law 74. 100 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.100.

101 The concept can be attributed to Quillian; q.v. MR Quillian, "Word Concepts: A Theory andSimulation of Some Basic Semantic Capabilities" (1967) 12 Behvioural Science 410. 102 WA Woods, "What's in a Link: Foundations for Semantic Networks" in DG Bobrow & AM Collins(eds), Representation and Understanding: Studies in Cognitive Science (Academic Press: New York,1975) p.32. 103 M Minsky, "A Framework for Representing Knowledge" in J Haugeland (ed.), Mind Design (MITPress: Cambridge, 1981) p.95. 104 PJ Hayes, "The Logic of Frames" in D Metzing (ed.), Frame Conceptions and Text Understanding(Walter de Gruyter: Berlin, 1979) p.46. 105 MJ Sergot, "The Representation of Law in Computer Programs", Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press: London, 1991) at p.48. 106 LT McCarty, "Reflections on TAXMAN: An Experiment in Artificial Intelligence and LegalReasoning" (1977) 90 Harvard Law Review 837; and LT McCarty, "The TAXMAN Project: Towards aCognitive Theory of Legal Argument" in B Niblett (ed.), Computer Science and Law (CambridgeUniversity Press: New York, 1980). 107 PJ Hayes, "The Logic of Frames" in D Metzing (ed.), Frame Conceptions and Text Understanding(de Gruyter: New York, 1979). 108 Example adapted from MJ Sergot, "The Representation of Law in Computer Programs", ChapterOne in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press: London,1991) at p.46. 109 KE Sanders, "Representing and Reasoning About Open-Textured Predicates" (1991) ProceedingsThird International Conference on Artificial Intelligence and Law 137 at p.138. 110 LT McCarty & NS Sridharan, "The Representation of an Evolving System of Legal Concepts II:Prototypes and Deformations" (1987) Proceedings of the Seventh International Joint Conference on Artificial Intelligence 246. 111 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation of Open Texture in Law"Chapter Six in C Walter (ed.), Computer Power and Legal Language (Quorum: London, 1988) at p.47. 112 SS Weiner, "Reasoning About "Hard" Cases in Talmudic Law" (1987) Proceedings FirstInternational Conference on Artificial Intelligence and Law 222 at p.223. 113 P Leith, "Logic, Formal Models and Legal Reasoning" (1984) Jurimetrics Journal p.334 at p.356. 114 KA Lambert & MH Grunewald, "LESTER: Using Paradigm Cases in a Quasi-Prcedential LegalDomain" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 87. 115 J Popple, "Legal Expert Systems: The Inadequacy of a Rule-based Approach" (1991) 23 AustralianComputer Journal 11 at p.15. 116 Also note the GREBE system; q.v. LK Branting, "Representing and Reusing Explanations of LegalPrecedents" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 103.

117 RA Kowalski, "Case-based Reasoning and the Deep Structure Approach to KnowledgeRepresentation" (1991) Proceedings Third International Conference on Artificial Intelligence and Law 21at p.23. 118 KD Ashley & EL Rissland, "Waiting on Weighting: a Symbolic Least Commitment Approach"(1988) Proceedings American Association for Artificial Intelligence. 119 MJ Sergot, "The Representation of Law in Computer Programs", Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press: London, 1991) at p.65. 120 J Zeleznikow, Building Intelligent Legal Tools - The IKBALS Project (1991) 2 Journal of Law andInformation Science 165 at p.173. 121 EL Rissland & KD Ashley, "A Case-Based System for Trade Secrets Law" (1987) ProceedingsFirst International Conference on Artificial Intelligence and Law 60. 122 M Betzer, "Legal Resoning in 3-D" (1987) Proceedings First International Conference on ArtificialIntelligence and Law 155 at p.155. 123 RA Kowalski, "Case-based Reasoning and the Deep Structure Approach to KnowledgeRepresentation" (1991) Proceedings Third International Conference on Artificial Intelligence and Law21. 124 RA Kowalski, "Case-based Reasoning and the Deep Structure Approach to KnowledgeRepresentation" (1991) Proceedings Third International Conference on Artificial Intelligence and Law 21at p.26. 125 G Greenleaf, A Mowbray & AL Tyree, "Expert Systems in Law: The DATALEX Project" (1987)Proceedings First International Conference on Artificial Intelligence and Law 9 at p.12. 126 e.g. SR Goldman, MG Dyer & M Flowers, "Precedent-based Legal Reasoning and KnowledgeAcquisition in Contract Law: a Process Model" (1987) Proceedings First International Conference onArtificial Intelligence and Law 210; and MT MacCrimmon, "Expert Systems in Case-Based Law: The Hearsay Rule Adviser" (1989)Proceedings Second International Conference on Artificial Intelligence and Law 68. 127 G Vossoss, J Zeleznikow & T Dillon, "Combining Analogical and Deductive Reasoning in LegalKnowledge Base Systems - IKBALS II" in Cv Noortwijk, AHJ Schmidt & RGF Winkels (eds), LegalKnowledge Based Systems: Aims for Research and Development (Koninklijke: Lelystad, 1991) p.97 at p.100. 128 The weighting scheme used by Kowalski was: Highest level court = 70; appeal level court = 50;trial level court = 30. Add 10 points for trial or appeals local to the jurisdiction. Deduct 15 points forforeign jurisdictions, except England, then 10 points. Add 1 to 5 points if case is recent: 1986 = 1 to 1990= 5. See RA Kowalski, Case-based Reasoning and the Deep Structure Approach to KnowledgeRepresentation (1991) Proceedings Third International Conference on Artificial Intelligence and Law 21. 129 G Vossos, T Dillon & J Zeleznikow, "The Use of Object Oriented Principles to Develop Intelligent

Legal Reasoning Systems" (1991) 23 Australian Computer Journal 2. 130 J Zeleznikow & D Hunter, "Rationales for the Continued Development of Legal Expert Systems"(1992) 3 Journal of Law and Information Science 94 at pp.102-103. 131 RE Susskind, "Expert Systems in Law: A Jurisprudential approach to Artificial Intelligence andLegal Reasoning" (1986) 49 Modern Law Review 168 at p.171; see also RE Susskind, Expert Systems inLaw: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) p.20. 132 RA Kowalski, "Case-Based Reasoning and the Deep Structure Approach to KnowledgeRepresentation" (1991) Proceedings Third International Conference on Artificial Intelligence and Law 21at p.21. 133 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) pp.81-82. 134 Ibidem pp.21-23. 135 TJM Bench-Capon & J Forder, "Knowledge Representation for Legal Applications" ChapterTwelve in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press:London, 1991) at p.249. 136 HJ Levesque & RJ Brachman, "A Fundamental Tradeoff in Knowledge Representation andReasoning" Chapter Four in RJ Brachman & HJ Levesque (eds), Readings in Knowledge Representation(Morgan Kaufmann: Los Altos, 1985) at pp.66-67. 137 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.49. 138 q.v. TJM Bench-Capon & F Coenen, "Exploiting Isomorphism: Development of a KBS to SupportBritish Coal Insurance Claims" (1991) Proceedings Third International Conference of ArtificialIntelligence and Law 62. 139 RN Moles, "Logic Programming - An Assessment of Its Potential for Artificial IntelligenceApplications in Law" (1991) 2 Journal of Law and Information Science 137 at p.144. 140 Bench-Capon's own words; q.v. TJM Bench-Capon & J Forder, "Knowledge Representation forLegal Applications" Chapter Twelve in TJM Bench-Capon, Knowledge-Based Systems and LegalApplications (Academic Press: London, 1991) at p.259. 141 e.g. C Biagioli, P Mariani & D Tiscornia, "ESPLEX: a Rule and Conceptual Based Model forRepresenting Statutes" (1987) Proceedings First International Conference on Artificial Intelligence andLaw 240, and the examples previously cited. 142 e.g. DM Sherman, "A Prolog Model of the Income Tax Act of Canada" (1987) Proceedings FirstInternational Conference on Artificial Intelligence and Law 127; also note TAXMAN and like projects cited. 143 B Niblett, "Computer Science and Law: An Introductory Discussion" in B Niblett (ed.), ComputerScience and Law (Cambridge University Press: Cambridge, 1980) at pp.16-17.

144 For instance into ANF (Atomically Normalised Form) used with the CCLIPS system (Civil CodeLegal Information Processing System); q.v. G Cross, CGde Bossonet, T Bradshaw, G Durham, R Gupta& M Nasiruddin, "The Implementation of CCLIPS" Chapter Nine in C Walter (ed.), Computer Powerand Legal Language (Quorum: London, 1988) p.90. 145 JP Dick, "Conceptual Retrieval and Case Law" (1987) Proceedings First International Conferenceon Artificial Intelligence and Law 106 at p.109; although such material may classify as a source of heuristics. Susskind does not addressthis, however. 146 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) pp.84-85; citing HLA Hart, The Concept of Law (Clarendon Press: Oxford, 1961) p.131. 147 KE Sanders, "Representing and Reasoning About Open-Textured Predicates" (1991) ProceedingsThird International Conference on Artificial Intelligence and Law 137 at p.142. 148 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.61; contra G Greenleaf, A Mowbray & AL Tyree, "Expert Systems in Law: The DATALEX Project"(1987) Proceedings First International Conference on Artificial Intelligence and Law 9. 149 c.f. LB Solum, "On the Indeterminacy Crisis: Critiquing Critical Dogma" (1987) 54 University ofChicago Law Review 462. 150 J Boyle, "Anatomy of a Torts Class" (1985) 34 American University Law Review 131; see also MKelman, "Trashing" (1984) 36 Stanford Law Review 293. 151 A Mason, "Future Directions in Australian Law" (1987) 13 Monash Law Review 149, particularlyat pp.154-155 and p.158; FG Brennan, "Judicial Method and Public Law" (1979) 6 Monash Law Review12; and M McHugh, "The Law-making Function of the Judicial Process" (1988) 62 Australian LawJournal 15. 152 e.g. JC Smith and C Deedman, "The Application of Expert Systems Technology to Case-BasedReasoning" (1987) Proceedings First International Conference on Artificial Intelligence and Law 84. 153 E Levi, An Introduction to Legal Reasoning (University of Chicago Press: Chicago, 1949) pp.3-5. 154 AL Goodhart, "The Ratio Decidendi of a Case" (1930) 40 Yale Law Journal 161. 155 J Stone, "The Ratio of the Ratio Decidendi" in Lord Lloyd & MDA Freeman, Lloyd's Introductionto Jurisprudence (5th Ed.) (Stevens: London, 1985) p.1164. 156 "It is unclear". 157 S Strvmholm, Rdtt. rottskollor och rottssystem (3rd Ed.) (Norstedts: Stockholm, 1987) cited by PWahlgren, "Legal Reasoning - A Jurisprudence Description" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 147 at p.148. 158 TJM Bench-Capon & F Coenen, "Practical Application of KBS to Law: The Crucial Role ofMaintenance" in Cv Noortwijk, AHJ Schmidt & RGF Winkels (eds), Legal Knowledge Based Systems:Aims for Research and Development (Koninklijke: Lelystad, 1991) p.5.

159 P Bratley, J Frimont, E Mackaay & D Poulin, "Coping with Change" (1991) Proceedings ThirdInternational Conference on Artificial Intelligence and Law 69. 160 e.g. P Hammond, "Representation of DHSS Regulations as a Logic Program" (1983) Proceedingsof the 3rd British Computer Society Expert Systems Conference 225; and the Estate Planning System;q.v. DA Schlobohm & DA Waterman, "Explanation for an Expert System that Performs EstatePlanning" (1987) Proceedings First International Conference on Artificial Intelligence and Law 18. 161 RA Kowalski & MJ Sergot, "The Use of Logical Models in Legal Problem Solving" (1990) 3 RatioJuris 201 at p.207. 162 DA Schlobohm & LT McCarty, "EPS II: Estate Planning With Prototypes" (1989) ProceedingsSecond International Conference on Artificial Intelligence and Law 1. 163 P Bratley, J Frimont, E Mackaay & D Poulin, "Coping with Change" (1991) Proceedings ThirdInternational Conference on Artificial Intelligence and Law 69. 164 AvdL Gardner, "Representing Developiong Legal Doctrine" (1989) Proceedings SecondInternational Conference on Artificial Intelligence and Law 16 at p.21. 165 Ibidem p.19. 166 A narrow definition of "information" is a common criticism of modern expert systems; q.v. HLDreyfus & SE Dreyfus, Mind over Machine (Basil Blackwell: Oxford, 1986); T Roszak, "The Cult of Information" (Pantheon: London, 1986); DR Hofstadter, Metamagical Themas (Penguin Press: London,1985); and T Winograd & F Flores, Understanding Computers and Cognition: A New Foundation forDesign (Ablex: Norwood, 1986). 167 RN Moles, "Logic Programming - An Assessment of Its Potential for Artificial IntelligenceApplications in Law" (1991) 2 Journal of Law and Information Science 137 at p.144. 168 C Biagioli, P Mariani & D Tiscornia, "ESPLEX: a Rule and Conceptual Based Model forRepresenting Statutes" (1987) Proceedings First International Conference on Artificial Intelligence andLaw 240 at p.241. 169 C Smith & C Deedman, "The Application of Expert Systems Technology to Case-BasedReasoning" (1987) Proceedings First International Conference on Artificial Intelligence and Law 84 at p.87. 170 RE Susskind, "Expert Systems in Law - Out of the Research Laboratory and into the Marketplace"(1987) Proceedings First International Conference on Artificial Intelligence and Law 1 at p.2. 171 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.78. 172 JC Smith & C Deedman, "The Application of Expert Systems Technology to Case-BasedReasoning" (1987) Proceedings First International Conference on Artificial Intelligence and Law 84 at p.85.

173 Approximately 50 texts and 100 articles; q.v. RE Susskind, "Expert Systems in Law - Out of theResearch Laboratory and into the Marketplace" (1987) Proceedings First International Conference on Artificial Intelligence and Law 1 at p.2; note the similarity to the issue of epistemology of law. 174 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.27. 175 For instance, Dworkin's "principles"; q.v. RM Dworkin, Taking Rights Seriously (Duckworth:London, 1977), RM Dworkin, A Matter of Principle (Harvard University Press: London, 1985), and RMDworkin, Law's Empire (Fontana: London, 1986). 176 RE Susskind, "Expert Systems in Law - Out of the Research Laboratory and into the Marketplace"(1987) Proceedings First International Conference on Artificial Intelligence and Law 1 at p.3. 177 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.254. 178 B Niblett, "Expert Systems for Lawyers" (1981) 29 Computers and Law 2 at p.3. 179 PJ Hayes, "On the Differences Between Psychology and Artificial Intelligence" in M Yazdani & ANarayanan, Artificial Intelligence: Human Effects (Ellis Horwood: London, 1984) p.158. 180 DR Hofstader, Gvdel, Escher, Bach: An Eternal Golden Braid (Harvester Press: New York, 1979)p.578. 181 RA Kowalski, "Leading Law Students to Uncharted Waters and Making them Think: TeachingArtificial Intelligence and Law" (1991) 2 Journal of Law and Information Science 185 at p.187 nt.5. 182 D Brown, "The Third International Conference on Artificial Intelligence and Law: Report andComments" (1991) 2 Journal of Law and Information Science 233 at p.238. 183 B Niblett, "Expert Systems for Lawyers" (1981) 29 Computers and Law 2 at p.3. 184 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.7. 185 P Leith, "Clear Rules and Legal Expert Systems" in AA Martino & F Socci (eds), AutomatedAnalysis of Legal Texts (North-Holland: Amsterdam, 1986) p.661; and P Leith, "Fundamental Errors inLegal Logic Programming" (1986) 3 The Computer Journal 29. 186 LT McCarty, "Some Requirements for a Computer-based Legal Consultant" (Research Report:Rutgers University, 1980) at pp.2-3, cited in RN Moles, Definition and Rule in Legal Theory: A Reassessment of HLA Hart and the Positivist Tradition (Basil Blackwell: Oxford, 1987) p.269; emphasisadded. 187 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987) p.53. 188 MA Boden, Artificial Intelligence and Natural Man (Basic Books: New York, 1977), cited in DPartridge, "Social Implications of Artificial Intelligence" Chapter Thirteen in M Yazdani (ed.), ArtificialIntelligence: Principles and Applications (Chapman & Hall: London, 1986) at p.326. See also D Partridge, Artificial Intelligence: Applications in the Future of Software

Engineering (Ellis Horwood, Chichester, 1986). 189 RN Moles, "Logic Programming: An Assessment of its Potential for Artificial IntelligenceApplications in Law" (1991) 2 Journal of Law and Information Science 137 at p.161. 190 TJM Bench-Capon, "Deep Models, Normative Reasoning and Legal Expert Systems" (1989)Proceedings Second International Conference on Artificial Intelligence and Law 37 at p.37. 191 Glauoma diagnosis system. 192 LT McCarty, "Intelligent Legal Information Systems: Problems and Prospects" in CM Campbell(ed.), Data Processing and the Law (Sweet & Maxwell: London, 1984) p.126. 193 SS Weiner, "Reasoning About "Hard" Cases in Talmudic Law" (1987) Proceedings FirstInternational Conference on Artificial Intelligence and Law 222 at p.223. 194 MJ Sergot, HT Cory, P Hammond, RA Kowalski, F Kriwacek & F Sadri, "Formalisation of theBritish Nationality Act" (1986) 2 Yearbook of Law Computers and Technology; and TJM Bench-Capon,GO Robinson, TW Routen & MJ Sergot, "Logic Programming for Large Scale Applications in Law"(1987) Proceedings First International Conference on Artificial Intelligence and Law 190. 195 JC Smith & C Deedman, "The Application of Expert Systems Technology to Case-BasedReasoning" (1987) Proceedings First International Conference on Artificial Intelligence and Law 84 at p.85. 196 RA Kowalski, Case-based Reasoning and the Deep Structure Approach to KnowledgeRepresentation (1991) Proceedings Third International Conference on Artificial Intelligence and Law 21at p.22. 197 LT McCarty & NS Sridharan, "The Representation of an Evolving System of Legal Concepts II:Prototypes and Deformations" (1987) Proceedings of the Seventh International Joint Conference on Artificial Intelligence 246 at p.250. 198 D Makinson, "How to Give it Up: A Survey of Some Formal Aspects of the Logic of TheoryChange" (1985) 62 Synthise 347. 199 P Bratley, J Frimont, E Mackaay & D Poulin, "Coping with Change" (1991) Proceedings ThirdInternational Conference on Artificial Intelligence and Law 69 at p.74. 200 See RM Dworkin, Law's Empire (Fontana: London, 1986) pp.250-254 on the difficulty in"compartmentalization" of the law. 201 LT McCarty, "Some Requirements for a Computer-based Legal Consultant" (Research Report:Rutgers University, 1980) cited in MJ Sergot, "The Representation of Law in Computer Programs",Chapter One in TJM Bench-Capon, Knowledge-Based Systems and Legal Applications (Academic Press:London, 1991) at pp.46-47. 202 P Leith, "Logic, Formal Models and Legal Reasoning" (1984) Jurimetrics Journal p.334 at p.356.

203 NE Simmonds, "Between Positivism and Idealism" (1991) 50 Cambridge Law Journal 308 atpp.312-313. 204 M Minsky, "A Framework for Representing Knowledge" in J Haugeland (ed.), Mind Design (MITPress: Cambridge, 1981) p.95 at p.100. 205 It is also ironic in light of Hart's alleged reliance on Wittgenstein's linguistic philosophy; q.v.Cotterrell, The Politics of Jurisprudence: A Critical Introduction to Legal Philosophy (Butterworths:London, 1989) pp.89-90. 206 J Vaux, "AI and Philosophy: Recreating Naive Epistemology" Chapter Seven in KS Gill (ed.),Artificial Intelligence for Society (John Wiley & Sons: London, 1986) p.76.; q.v. L Wittgenstein, Philosophical Investigations (Basil Blackwell: London, 1953). 207 RA Kowalski & MJ Sergot, "The Uses of Logical Models in Legal Problem Solving" (1990) 3 RatioJuris 201 at p.205. 208 See HLA Hart, "Definition and Theory in Jurisprudence" (1954) 70 Law Quarterly Review 37. 209 L Fuller, "Positivism and Fidelity to Law - A Reply to Professor Hart" (1958) 71 Harvard LawReview 630 at p.666. 210 Raz's approach to adjudication shares similar characteristics; q.v. J Raz, "The Problem about theNature of Law" (1983) 31 University of Western Ontatio Law Review 202 at pp.213-216. 211 HLA Hart, The Concept of Law (Clarendon Press: Oxford, 1961) p.126. 212 KR Popper, Conjectures and Refutations (4th Ed.) (Routledge and Kegan Paul: London, 1972)p.46. 213 JW Harris, Law and Legal Science (Clarendon Press: Oxford, 1979) p.166. 214 M Polyani, Personal Knowledge - Towards a Post-Critical Philosophy (Routledge and Kegan Paul:London, 1958). 215 DC Berry, "The Problem of Implicit Knowledge" (1987) 4 Expert Systems 144. 216 DC Berry & A Hart, "The Way Forward" in DC Berry & A Hart (eds) Expert Systems: HumanIssues (MIT: Cambridge, 1990) p.256. 217 E Husserl, Cartesian Meditations (Martinus Nijhoff: The Hague, 1960) pp.54-55. 218 B MacLennan, "Logic for the New AI" in JH Fetzer (ed.), Aspects of Artificial Intelligence(Kluwer: Dordrecht, 1988) at p.163. 219 C Hempel, Philosophy of Natural Science (Prentice Hall: London, 1966). 220 A Narayanan, "Why AI Cannot be Wrong" Chapter Five in KS Gill (ed.), Artificial Intelligence forSociety (John Wiley & Sons: London, 1986) at p.48.

221 P Davies, "Living in a Non-Maaterial World - the New Scientific Consciousness" (1991) TheAustralian (9th October) pp.18-19 at p.19. 222 RS Pound, "Mechanical Jurisprudence" (1908) 8 Columbia Law Review 605. 223 J Searle, "Minds, Brains and Programs" (1980) 3 Behavioural and Brain Sciences 417. 224 RE Susskind, Expert Systems in Law: A Jurisprudential Inquiry (Clarendon Press: Oxford, 1987)p.241. 225 RHS Tur, "Positivism, Principles, and Rules" in E Atwool (ed.), Perspectives in Jurisprudence(University of Glascow Press: Glascow, 1997) at p.51. 226 EL Rissland & DB Skalak, "Interpreting Statutory Predicates" (1989) Proceedings SecondInternational Conference on Artificial Intelligence and Law 46 at p.46. 227 MJ Detmold, "Law as Practical Reason" (1989) 48 Cambridge Law Journal 436 at p.460. 228 Ibidem p.439. 229 L Fuller, "Positivism and Fidelity to Law - A Reply to Professor Hart" (1958) 71 Harvard LawReview 630 at p.663; see also RS Summers, "Professor Fuller on Morality and Law" in RS Summers(ed.), More Essays on Legal Philosophy: General Assessment of Legal Philosophies (Basil Blackwell:Oxford, 1971) at pp.117-119. 230 F Schauer, Playing by the Rules: A Philosophical Examination of Rule-based Decision-Making inLaw and in Life (Clarendon Press: Oxford, 1991) at pp.59-60. 231 HLA Hart, The Concept of Law (Clarendon Press: Oxford, 1961) p.56. 232 H Williamson, "Some Implications of Acceptance of Law as Rule Structure" (1967) 3 AdelaideLaw Review 18 at pp.42-43. 233 c.f. A Glass, "Interpretive Practices in Law and Literary Criticism" (1991) 7 Australian Journal ofLaw & Society 16. 234 DN Herman, "Phenomonology, Structuralism, Hermeneutics, and Legal Study: Applications ofContemporary Continental Though to Legal Phenomena" (1982) 36 University of Miami Law Review379. 235 P Linzer, "Precise meaning and Open Texture in Legal Writing and Reading" Chapter Two in CWalter (ed.), Computer Power and Legal Language (Quorum: London, 1988). 236 M Weait, "Swans Reflecting Elephants: Imagery and the Law" (1992) 3 Law and Critique 59 atp.66. 237 P Gabel & P Harris, "Building Power and breaking Images: Critical Legal Theory and the Practiceof Law" (1982-83) 11 Review of Law & Social Change 369 at p.370.


238 HL Dreyfus, "From Micro-Worlds to Knowledge Representation: AI at an Impasse" in J Haugeland (ed.), Mind Design (MIT Press: Cambridge, 1981) p.161 at p.170.
239 DG Bobrow & T Winograd, "An Overview of KRL, A Knowledge Representation Language" (1977) 1 Cognitive Science 3 at p.32.
240 C Fried, "Sonnet LXV and the 'Black Ink' of the Framers' Intention" (1987) 100 Harvard Law Review 751 at pp.757-758.
241 J Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (WH Freeman & Co: San Francisco, 1976) cited in D Partridge, "Social Implications of Artificial Intelligence" Chapter Thirteen in M Yazdani (ed.), Artificial Intelligence: Principles and Applications (Chapman & Hall: London, 1986) at pp.330-331.
242 Contra MA Boden, "AI and Human Freedom" in M Yazdani & A Narayanan (eds), Artificial Intelligence: Human Effects (Ellis Horwood: Chichester, 1984).
243 C Tapper, "Lawyers and Machines" (1963) 26 Modern Law Review 121.
244 Ibidem p.128.
245 c.f. TJM Bench-Capon, "Deep Models, Normative Reasoning and Legal Expert Systems" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 37 at p.42.
246 J Zeleznikow, "Building Intelligent Legal Tools - The IKBALS Project" (1991) 2 Journal of Law and Information Science 165.
247 e.g. DE Wolstenholme, "Amalgamating Regulation and Case-based Advice Systems through Suggested Answers" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 63.
248 c.f. R Wright, "The Cybernauts have Landed" (1991) Law Institute Journal 490 at p.491.
249 C Tapper, "Lawyers and Machines" (1963) 26 Modern Law Review 121 at p.126.
250 q.v. PJ Ward, "Computerisation of Legal Material in Australia" (1982) 1 Journal of Law and Information Science 162.
251 J Bing, "The Text Retrieval System as a Conversation Partner" in C Arnold (ed.), Yearbook of Law, Computers and Technology (Butterworths: London, 1986) p.25.
252 G Greenleaf, "Australian Approaches to Computerising Law - Innovation and Integration" (1991) 65 Australian Law Journal 677.
253 SJ Latham, "Beyond Boolean Logic: Probabilistic Approaches to Text Retrieval" (1991) 22 The Law Librarian 157.
254 J Bing, "Legal Text Retrieval Systems: The Unsatisfactory State of the Art" (1986) 2 Journal of Law and Information Science 1 at pp.16-17.


255 RM Tong, CA Reid, GJ Crowe & PR Douglas, "Conceptual Legal Document Retrieval Using the RUBRIC System" (1987) Proceedings First International Conference on Artificial Intelligence and Law 28; and J Bing, "Designing Text Retrieval Systems for 'Conceptual Searching'" (1987) Proceedings First International Conference on Artificial Intelligence and Law 43.
256 J Kolodner, "Maintaining Organisation in a Dynamic Long-Term Memory" (1983) 7 Cognitive Science; CD Hafner, "Conceptual Organisation of Case Law Knowledge Bases" (1987) Proceedings First International Conference on Artificial Intelligence and Law 35.
257 T Mitchell, "Learning and Problem Solving" (1983) Proceedings of International Joint Conference on Artificial Intelligence.
258 DE Rose & RK Belew, "Legal Information Retrieval: A Hybrid Approach" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 138.
259 CD Hafner, "Conceptual Organisation of Case Law Knowledge Bases" (1987) Proceedings First International Conference on Artificial Intelligence and Law 35.
260 LT McCarty, "On the Role of Prototypes in Appellate Legal Argument" (1991) Proceedings Third International Conference on Artificial Intelligence and Law 185 at p.186.
261 TJM Bench-Capon & MJ Sergot, "Toward a Rule-Based Representation of Open Texture in Law" Chapter Six in C Walter (ed.), Computer Power and Legal Language (Quorum: London, 1988) at p.58.
262 KD Ashley, "Toward a Computational Theory of Arguing with Precedents: Accommodating Multiple Interpretations of Cases" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 99; and KD Ashley & EL Rissland, "But See, Accord: Generating 'Blue Book' Citations in HYPO" (1987) Proceedings First International Conference on Artificial Intelligence and Law 67.
263 EL Rissland, "Examples in Legal Reasoning: Legal Hypotheticals" (1983) Proceedings Eighth International Joint Conference on Artificial Intelligence 90; EL Rissland & EM Soloway, "Overview of an Example Generation System" (1980) Proceedings First Annual National Conference on Artificial Intelligence; and EL Rissland, EM Valcarce & KD Ashley, "Explaining and Arguing with Examples" (1984) Proceedings National Conference on Artificial Intelligence.
264 CC Marshall, "Representing the Structure of a Legal Argument" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 121. On the structure of legal argument see S Toulmin, The Uses of Argument (Cambridge University Press: Cambridge, 1958); S Toulmin, RD Rieke & A Janik, An Introduction to Reasoning (MacMillan Press: New York, 1979); and C Perelman, The Idea of Justice and the Problem of Argument (Routledge & Kegan Paul: London, 1963).
265 KD Ashley & EL Rissland, "Toward Modelling Legal Argument" in AA Martino & F Socci (eds), Automated Analysis of Legal Texts (North-Holland: Amsterdam, 1986) at p.19; also KD Ashley, "Toward a Computational Theory of Arguing with Precedents" (1989) Proceedings Second International Conference on Artificial Intelligence and Law 93.


266 EL Rissland, "Learning How to Argue: Using Hypotheticals" (1984) Proceedings First Annual Conference on Theoretical Issues in Conceptual Information Processing; EL Rissland, "Argument Moves and Hypotheticals" in C Walter (ed.), Computing Power and Legal Reasoning (West Publishing: St Paul, 1985).
267 RH Michaelson, "An Expert System for Federal Tax Planning" (1984) 1 Expert Systems 2.
268 e.g. the Retirement Pension Forecast and Advice System (relying on the Aion Development System shell); q.v. S Springel-Sinclair & G Trevena, "The DHSS Retirement Pension Forecast and Advice System" in P Duffin (ed.), Knowledge Based Systems: Applications in Administrative Government (Ellis Horwood: Chichester, 1988).
269 G De Jong, "Towards a Model of Conceptual Knowledge Acquisition Through Directed Experimentation" (1983) Proceedings of International Joint Conference on Artificial Intelligence.
270 Legal Decision-making System; q.v. DA Waterman & MA Peterson, "Rule-based Models of Legal Expertise" (1980) Proceedings First Annual National Conference on Artificial Intelligence 272; and DA Waterman & MA Peterson, "Evaluating Civil Claims: An Expert Systems Approach" (1984) Expert Systems 1.
271 q.v. DA Waterman, RH Anderson, F Hayes-Roth, P Klahr, G Martins & SJ Rosenschein, Design of a Rule-Oriented System for Implementing Expertise (Rand Corporation: Santa Monica, 1979).
272 System for Asbestos Litigation; q.v. DA Waterman, J Paul & MA Peterson, "Expert Systems for Legal Decision Making" (1986) 4 Expert Systems 212.
273 G Greenleaf, "Australian Approaches to Computerising Law - Innovation and Integration" (1991) 65 Australian Law Journal 677 at p.679.
274 SS Nagel & R Barczyk, "Can Computers Aid the Dispute Resolution Process?" (1988) 71 Judicature 253.
275 q.v. WM Bain, Toward a Model of Subjective Interpretation (Department of Commerce Research Report: Yale University, 1984) cited in MJ Sergot, "The Representation of Law in Computer Programs" Chapter One in TJM Bench-Capon (ed.), Knowledge-Based Systems and Legal Applications (Academic Press: London, 1991) p.16.
276 S Torrance, "Breaking out of the Chinese Room" in M Yazdani (ed.), Artificial Intelligence: Principles and Applications (Chapman & Hall: London, 1986) at p.301.
277 TW Bynum, "Artificial Intelligence, Biology, and Intentional States" (1985) 16 Metaphilosophy 355.
278 T Cuda, "Against Neural Chauvinism" (1985) 48 Philosophical Studies 111.
279 DE Rumelhart, JL McClelland and the PDP Research Group, Parallel Distributed Processing: Explorations in the Microstructure of Cognition (MIT Press: Cambridge, 1986).


280 AL Tyree, "The Logic Programming Debate" (1992) 3 Journal of Law and Information Science 111 at p.115.
281 q.v. RA Clarke, Knowledge-Based Expert Systems: Risk Factors and Potentially Profitable Application Areas (Working Paper: Department of Commerce, Australian National University, 1988).
282 M Aultman, "Technology and the End of Law" (1972) 17 American Journal of Jurisprudence 46 at pp.49-52.

Document author: Andrew Greinke
Document creation: December 1994
HTML last modified: December 1994
Authorised by: Archie Zariski, Managing Editor, E Law
Disclaimer & Copyright Notice © 2001 Murdoch University
URL: http://www.murdoch.edu.au/elaw/issues/v1n4/greinke14.html
