Applied Legal Philosophy series, Ashgate Publishing, Farnham, 2009.
Ashgate's announcement of the book.
This paper discusses a notorious criminal case from The Netherlands. Nurse Lucia de B. was convicted of seven murders and three attempted murders and sentenced to life imprisonment. Yet there was little evidence apart from the fact that at the hospital where it all started, quite a number of reanimations happened during the shifts of the nurse. Then, after one more reanimation, it was felt that all this could not be just a coincidence. Very early in the process a quantification was given to the uneasy feeling that proportionally there were too many reanimations during the nurse’s shifts. The probability that such a coincidence could have happened by mere accident was calculated by one expert as 1 in 342,000,000. With that number out in the open, the general notion was that what had happened could not be just an accident. The nurse definitely had to be a serial killer, even in the absence of any further evidence. It is argued that this idea was in fact the driving force in the whole judicial process. It coloured the perception of everybody, including the prosecution and the courts. And it generated a whole series of seemingly incriminating facts which inexorably led to the in fact completely unwarranted conviction of Lucia de B. Important general lessons may be learned from this. Throughout the process three dangerous reasoning instincts may be seen at work: (1) the Small Chance Instinct, (2) the Smoke and Fire Instinct, and (3) the human inclination to neglect base rates in non-everyday situations.
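The base-rate point can be made concrete with a back-of-the-envelope sketch. The figures below are illustrative assumptions, not data from the case: even an event that is extremely improbable for one particular nurse becomes far less surprising once one asks how likely it is that some nurse, somewhere, shows such a cluster of incidents by chance.

```python
# Illustrative sketch of the base-rate / multiple-comparisons point.
# Both numbers below are assumptions chosen for illustration only.
p_cluster = 1 / 342_000_000   # per-nurse coincidence probability as reported by the expert
n_nurses = 300_000            # assumed number of nurses who could have shown such a cluster

# Probability that at least one nurse shows such a cluster purely by chance:
p_at_least_one = 1 - (1 - p_cluster) ** n_nurses

# For small p this is approximately n * p: the expected number of
# "suspicious" nurses one would see even if no one ever harmed a patient.
print(p_at_least_one)
```

The point is not the particular numbers but the structure of the argument: a small per-individual probability says little until it is combined with the number of opportunities for the coincidence to occur.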
This chapter discusses rational and probabilistic models of forensic decision making in individuation problems. In such problems a forensic scientist is confronted with a trace (for example, a fingerprint, a handwriting sample, a DNA sample, or a bullet) and has to determine its source (a specific person, a specific firearm). Traditional forensic approaches are criticised for requiring that identification and elimination be categorical and that the reasoning involved therefore be deductive. Against this it is claimed that this kind of reasoning has to be inductive, since when determining the source of a trace it is practically impossible to compare the trace with the entire population of potential sources. Therefore probabilistic assessments are inevitable. Accordingly a Bayesian model of factual judgement is proposed: forensic experts should provide conditional probabilities on the relevance of evidence to certain hypotheses, while the fact finder should estimate the prior probabilities of these hypotheses and combine them with the expert’s conditional ones to calculate the impact of new evidence on the hypotheses’ posterior probabilities. It is also argued that in order to avoid misunderstandings it is essential that the expert reports his findings to the adjudicator in a rationally correct format. Traditional conclusion formats are argued to be logically flawed and misleading. A more plausible conclusion format is suggested.
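The proposed division of labour can be stated compactly in the odds form of Bayes’ rule (a standard formulation, not notation taken from the chapter itself): the expert reports the ratio of conditional probabilities, while the fact finder supplies the prior odds and carries out the multiplication.

```latex
\frac{P(H_1 \mid E)}{P(H_2 \mid E)}
  \;=\;
  \underbrace{\frac{P(E \mid H_1)}{P(E \mid H_2)}}_{\text{expert's conditional probabilities}}
  \;\times\;
  \underbrace{\frac{P(H_1)}{P(H_2)}}_{\text{fact finder's prior odds}}
```

Here $H_1$ and $H_2$ are competing source hypotheses (for instance, “the suspect left the trace” versus “someone else did”) and $E$ is the forensic evidence.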
This paper focuses on the story-based approach to reasoning about the facts of a case. After a discussion of previous work on the role of stories in legal evidential reasoning, the notion of story schemes as a tool to assess story plausibility is developed. Stories are argued to be plausible if their facts fit story schemes. A story scheme is a description of typical event structures, such as the typical facts of a robbery or, more generally, an intentional crime. It is noted that story schemes can be more or less abstract, and it is emphasized that it is important to match a story with the most specific available story scheme. For instance, a robbery story’s plausibility is best established using a story scheme for robberies, and less so by using a story scheme for intentional crimes. This account of story schemes is also used to explicate the role of causal generalisations that connect causally related facts in a story, and it is noted how story schemes have different uses, e.g., for story analysis (in decision making), hypothesis generation (in investigation) and for building persuasive stories (in pleading).
This article discusses two broad types of evaluating evidence within criminal cases. One of them concerns a ‘holistic’ way of looking at the various items of evidence and coming to conclusions about how convincing they are in terms of their probative value. The use of stories, comparing different accounts of what happened, and looking at events as a whole, are central to this model. The other way in which evidence is evaluated is atomistic, whereby each item of evidence is weighed and scrutinized independently from the other evidence. It represents, in principle, a bottom-up way of coming to conclusions regarding what happened in a case. Advantages and disadvantages of each approach are discussed. Attention is paid to two archetypes of legal systems: the adversarial and the inquisitorial. Taking the Netherlands and Australia as examples of the two types of legal systems, this chapter examines those characteristics that can be expected to contribute to a preference for either a holistic or an atomistic way of evaluating evidence. An answer is sought to the question which type of legal system would prefer one mode of evidence evaluation over the other: some legal systems may be more inclined to a holistic approach, others to an atomistic one. Some cautious recommendations are made.
This chapter discusses the combination of story-based approaches with AI and cognitive science models of inference to the best explanation. It is argued that most arguments about facts in law are instances of ‘inference to the best explanation,’ that is, patterns of inference whereby explanatory hypotheses are formed and evaluated. A coherentist interpretation of inference to the best explanation is offered, according to which reasoning about facts in law involves first the generation of several plausible explanations of the evidence at trial and then the selection, among them, of the one that is best on a test of explanatory coherence. It is shown how the proposed explanationist model of legal proof would deal with the O.J. Simpson case. Next, there is discussion of a major criticism of a model of inference to the best explanation in law, namely the objection from the bad lot, which says that the best available explanation might still be a bad one. Against this it is noted that the investigative phase in which the explanations are constructed is governed by principles of epistemic responsibility, and that respecting these principles increases the chance that the best explanation available is rationally acceptable.
Verheij and Bex contribute one more piece of the puzzle of how story-based and argument-based approaches to legal evidentiary reasoning are related. The main result of this chapter is a set of argumentation schemes, reconstructing anchored narratives theory. Such schemes are semi-formal analogues of the rules of inference of logic. An argumentation scheme specifies conditions that can support a conclusion and possibly also exceptions that can apply and conditions of use for the scheme. The central argumentation scheme of the set makes explicit how a story can be accepted as true according to the anchored narratives theory: the story must be good and the story must be well-anchored. An important exception to the rule that good, well-anchored stories can be accepted as true is the availability of another good story, with equally good or perhaps even better anchoring. Thus the argumentation schemes developed in the chapter lead to some refinements and clarifications of the anchored narratives theory. In particular, it is explained how accepting a story about the facts of a crime can depend recursively on stories about pieces of evidence. It is proposed that the acceptance of the truth of a story about a piece of evidence should be treated in the same way as the acceptance of the truth of a story about the crime facts, but using different measures of the plausibility of such stories (in the form of different story structures).
This paper discusses the often belittled but in fact fundamental conflict between anomism (evidence and proof may be established in any way whatever, as long as they conform to the relevant facts) and proceduralism (legally relevant facts are to be established by authoritative procedure). Anomism is shown to be related to strict distinctions between contexts of discovery and contexts of justification, related in their turn to ideals of monotonic reasoning. Proceduralism, on the other hand, implies dependence of justification on procedure, with attendant relevance of defeasible reasoning. Several objections against anomism are discussed, ranging from the problem of the argumentum ad ignorantiam and doubtful objectivist ontological and epistemological presuppositions to its practical impossibility. At first sight this would seem to reinstate proceduralism and its basic idea of constructed ‘reality’. Still, such ‘pure’ proceduralism is shown to come down to giving up the idea of reasonable conflict resolution as such. What is left to be done, then, is devising rational procedures for unearthing the truth of the matter, or: proceduralism as one important (heuristic) tool in reconstructing reality.
Prakken &amp; Sartor give an argument-based analysis of various notions of burden of proof. Their starting point is the claim that logics for defeasible argumentation provide the means to logically characterise the difference between several kinds of proof burdens, but only if they are embedded in a dynamic setting that captures the various stages of a legal proceeding. It is also argued that ‘standard’ argumentation logics for AI must be adapted in order to model shifts in burdens of proof. Thus this analysis illustrates in two ways that logics cannot simply be imposed on the law but that features of legal systems must be taken into account. First there is the claim that the burden of persuasion, which legally is the burden to prove a statement to a specified degree (the standard of proof) on penalty of losing on the issue, can be verified by applying an argumentation logic to the evidence available at the final stage of a proceeding. Then a precise distinction is made between two burdens that are sometimes confused, namely the burden of production and the tactical burden. In this analysis, the tactical burden of proof is automatically induced by the defeasible nature of the reasoning. The burden of production, by contrast, concerns the legal question whether an issue can be submitted to trial or must be decided as a matter of law against the one who fails to produce any evidence. Finally the issue is raised to what extent this account can be generalised to statistical and story-based approaches.
This article aims at a tentative exploration of a legal theory of rational argumentation, that is, a theory that takes the rules of legal procedure as a starting point and aims to describe the way in which they function in a legal argument. The relationship between rational and legal-procedural principles of evidentiary reasoning is addressed, taking Kaptein’s distinction between anomism and proceduralism (see chapter 8) as a starting point. It is claimed that when these clash, it is not always the law that is at fault. Indeed it is argued that rational inquiry in the natural sciences may not be the paradigm of fact finding in the law. Typically legal rational values may be at least as important. Legal-procedural rules should not be seen as contingent extra-rational add-ons to rational models, merely meant to promote extra-rational legal values such as fairness and due process. Instead it is argued that several such constraints enhance rational truth-seeking, since they correct the epistemic asymmetry that sets the legal trial apart from scientific debate. The difference shows itself most clearly in criminal trials, where the defendant has privileged access to the truth, while the prosecution has privileged access to the means of proof. Legal procedure aims at restoring the epistemic symmetry between the parties by, for instance, rules on burden of proof and pretrial disclosure rules. Because of the epistemic asymmetry of legal procedure it is also argued that legal evidential reasoning is not about what is true but about what can be proven to be true. Against Prakken and Sartor in this volume and elsewhere it is put forward that this can be modelled as deductive rather than defeasible inference, since the question of what has been proven is not addressed until the end of a proceeding, at which point no new evidence can be introduced, which excludes any further defeat.