Philosophical Foundations of Evidence Law
One of the most characteristic developments in philosophy in recent decades is the field’s increasing fragmentation. By that I mean the tendency to move away from generic questions (e.g.: Does justified true belief amount to knowledge? Under which circumstances are we virtuous?) towards a more focussed, context-specific investigation of various issues: philosophy of psychology, philosophy of biology, philosophy of neuroscience etc. The results have hitherto been promising, with seminal treatises clearing the noise in convoluted areas, identifying methodological or conceptual issues, or clarifying meaningful (and acceptable) uses of terms which scientists may (or may not) use to make factual claims. It is, therefore, hardly surprising that interest in, and discussion about, the law of evidence – with particular emphasis on criminal evidence – is livelier now than it has ever been. From the general public’s fascination with criminal law to the abundance of empirical domains (psychology, sociology, criminology, neuroscience, forensic science etc.) which aspire to apply their solutions to the criminal process, criminal evidence occupies centre stage both in folklore and in academic literature.
It is not entirely clear whether the multi-faceted terms ‘theory’ and ‘philosophy’ are regarded as co-extensive by the editors of the ‘Philosophical Foundations of Evidence Law’ (hereafter: PFEL); the fact remains that its editors explain that the common interest of every contributor is ‘evidence theory as related to law’ (p. 3; all page references without further details refer to PFEL). The reader may thus wonder whether PFEL intends to investigate matters from a rather neutral philosophical (by that I mean: analytic) perspective, and to illuminate concepts by clarifying the ramifying web of connections between basic doctrinal tools in the law of evidence as it is, or whether PFEL offers instead a top-down normative approach to the law of evidence as it should be – in the PFEL authors’ opinion. That remains to be seen in the content of PFEL, which comprises 26 chapters divided into 7 separate categories: 1) Evidence, Truth, and Knowledge, 2) Law and Factfinding, 3) Evidence, Language, and Argumentation, 4) Evidence and Explanation, 5) Evidence and Probability, 6) Proof Paradoxes, 7) Biases and Epistemic Injustice. This book review will only highlight a small number of conceptual and methodological issues that not only permeate PFEL but also characterise discussions in academic disciplines at the periphery of evidence law.
The primary diagnosis provided by PFEL’s editors is worthy of attention: ‘Evidence theory as related to law’, they contend (p. 3),
‘stayed dormant until the advent of the “New Evidence Scholarship” in the mid-eighties of the twentieth century. Before that time, a small number of scattered, yet remarkable, works by John Kaplan, Per Olof Ekelöf […] and others have identified and rationalized the alleged alignments and misalignments between mathematical probability and adjudicative factfinding’ (emphasis added).
Although no one would doubt that the law of evidence is commonly reduced to the rules of evidence, the picture of a hibernating evidence scholarship awaiting the wake-up call from proponents of mathematical models of proof (i.e. the New Evidence Scholarship) cannot hold water, for a simple reason. One of the most prominent and central procedural devices in the law of evidence, i.e. the standard of ‘proof beyond reasonable doubt’, is the direct result of a long evolutionary process of epistemological inquiry, procedural adaptation, and doctrinal borrowing. By the 17th century at the latest there was a concentrated attempt to conceptualise different categories of ‘certainty’ and ‘knowledge’ and to show the practicality of an intermediate epistemological category between erga omnes (absolute) certainty and mere (anything-goes) opinion. One of the subcategories of ‘certainty’ which was regarded as distinct from the mathematical one was that of moral certainty. John Wilkins, i.e. one of many who grappled with questions of evidence and proof, wrote in 1675 that moral certainty pertains to situations ‘in which there can be no natural necessity, that things be so, and they cannot possibly be otherwise […] yet may they be so certain as not to admit of any reasonable doubt concerning them’ (John Wilkins, Of the Principles and Duties of Natural Religion, London 1675, pp. 7-8 – emphasis added). That was in other words the realisation that, in the words of one of the fathers of the Enlightenment project, Cesare Beccaria, ‘[t]he certainty which is necessary to decide that the accused is guilty, is the very same which determines every man in the most important transactions of his life’ (see Cesare Beccaria, On Crimes and Punishments, 2009, Ch. XIV). It is thus unjustifiable to describe evidence scholarship as ‘dormant’ insofar as the most basic question, i.e. 
the standard of proof, had been negotiated and settled already in the 17th century and has been in procedural operation since at least the Boston Massacre trials of 1770 (see Anthony A. Morano, A Reexamination of the Development of the Reasonable Doubt Rule. In: 55 B.U. L. Rev. 507 (1975)). In the words of Barbara J. Shapiro, ‘[once] the epistemological problem is solved, the institutional problem is solved’ (Barbara J. Shapiro, “Beyond Reasonable Doubt” and “Probable Cause”. Berkeley 1991, p. 46). And (what we might, at the cost of oversimplification, call) the Anglo-American legal orders qua complex adaptive systems have gone to great lengths to solve the most crucial institutional problem of all, albeit without reverting to formalism and legal rules of proof.
The PFEL editors’ diagnosis about the New Evidence Scholarship, including the latter’s attempt to map rationality (and reasonableness) onto logicality, misses the most important lesson from the history of the law of evidence. For example, Dahlman and Kolflaath (pp. 287-300) take for granted that the criminal standard of ‘proof beyond reasonable doubt’ sets a ‘probabilistic threshold’, or that the presumption of innocence – which in fact is a complex doctrinal device granting the accused the default normative status of presumed innocence – is connected with ‘the probability of the prosecutor’s hypothesis at the start of the trial’ (p. 289; see also Spottswood’s approach, pp. 107-122). The (criminal) standard of proof is, as I showed above, the result of a collective effort to take distance from formal methods of proof yielding absolute (mathematical) knowledge and purportedly eliminating discretion. By failing to understand the structure and history of evidence law we are doomed to repeat grave mistakes of the past. Even in the absence of a clearly articulated conceptual framework, the system of criminal adjudication acquired the institutional know-how to resolve legal conflicts, mainly by extensively employing the doctrinal device of ‘reasonableness’ – the law is heavily reliant thereupon. Legal orders have their own established routines to validate criminal charges. Note that this is not a proto-scientific apparatus, but an advanced institutional tool designed for particular kinds of work in the social arena. The law of evidence is not a half-baked routine for folk-validation due to be replaced by the rigorous method protocols of natural scientists – see for example: 1) trial by economics, in: Ch. 10 by Fisher, pp. 137-153, 2) trial by mathematics, in: Ch. 18 by Fenton and Lagnado, pp. 267-286, 3) trial by psychology, in: Ch. 23 by Sevier, pp. 349-363, where Sevier claims that ‘empirical psychology is a natural fit for understanding the law of evidence’ (p. 
349) and that ‘there is a philosophical disconnect between evidence law and the field of empirical psychology’ (p. 350).
Normative impositions on the law of evidence need to be wary of and attentive to the procedural architecture of the respective legal order. The idea that some scientifically validated (therefore: general) proposition could ever guarantee the factual and normative rectitude of a criminal verdict commits the fallacy of making inferential steps based on assumptions that are not valid vis-à-vis the underlying scientific model. It is through this lens that PFEL will be examined.
My book review’s introduction runs the risk of being pedantic and disproportionately long. However, any therapeutic method – the authors aspire to ‘increase the understanding of the factfinding process taking place in the courts of law’ (p. 6) – is a function of a methodologically valid and accurate diagnosis.
The first part of PFEL deals with fundamental epistemological concepts, and discusses topics as diverse as the aim of the criminal process (see Ch. 1, pp. 11-24), naturalized approaches to evidence (see Ch. 2, pp. 25-39), the problem of giving reasons for decisions (see Ch. 3, pp. 40-52) or the issues around expert witness testimony (see Ch. 4, pp. 53-68).
The blurring of boundaries between the ‘Is’ and the ‘Ought’ – a dichotomy which is a presupposition for the very ability to conceptualise the law of evidence – is replicated in Part I, which according to the editors ‘presents the core ideas as to what evidence law is about: […] discovery of the truth and the resulting accuracy of verdicts’ (p. 1). PFEL does not focus on any jurisdiction in particular, but as far as the jurisdiction of England and Wales is concerned, the said proposition is false. The overriding objective of the criminal process is that ‘criminal cases be dealt with justly’ (Crim PR 2020, 1.1, England and Wales). And ‘justice’ – from a legal point of view – is far wider and deeper a concept than factual accuracy. Notwithstanding this, we find useful discussions on fundamental concepts of the criminal process. What are its goals? What are the epistemic duties of decision-makers? Do the latter need to provide (written) justification for their verdict? What is the role of the expert? What in my opinion is missing is a careful docking of evidential topics to their philosophical foundations.
It is undeniably important that Ho shows how ‘truth’ in a legal context is ‘vastly different from a scientific or historical inquiry’ (p. 16). Ho reminds readers that discussions about ‘truth’ comprise two distinct questions: a) what is the meaning of truth?, and b) what is the criterion of truth? (p. 13). Whilst both are important for philosophers, only the latter is relevant and consequential for the law of evidence. The law of evidence cannot afford a discussion on – or perhaps does not even care – whether truth means correspondence with an objective reality, coherence etc. The crucial question is under which circumstances a fact-finder can justifiably ascribe criminal liability, which brings us back to the ancient Agrippa’s trilemma and the problem of the structure of justification. Again, the law of evidence does not grapple directly with philosophical problems like the distinction between foundationalist, coherentist (see Amaya’s chapter, pp. 231-247; see also my book review of Amalia Amaya, The Tapestry of Reason, Hart Publishing 2015, in: 103 ARSP 2017, pp. 431-435) and contextualist theories of justification. On the contrary, it is prominent philosophers like Ludwig Wittgenstein who take a long hard look – ‘Don’t think, but look!’, exhorts the Austrian-British philosopher (see his Philosophical Investigations (trans. by G.E.M. Anscombe), Basil Blackwell, 1958, para 66) – at legal proceedings to give a philosophical account of ‘practical certainty’, ‘reasonable doubts’ etc. (see Ludwig Wittgenstein, On Certainty, Oxford 1969, paras 261, 416, 607, especially 335. See also K.N. Kotsoglou, Forensische Erkenntnistheorie, Berlin 2015).
The ‘epistemic’ objectives of the criminal process are themselves normatively constituted. The liberal values of modern legal orders (and inversely: the authoritarian values of other legal orders) exemplify this point. The criminal process does not aim primarily at factual accuracy (see also Ho, pp. 12-13). Although the procedural framework of the criminal process works towards the minimisation of a certain type of risk, its goals cannot be assessed against empirical reality. In other words, we cannot use the term veritistic, for the simple reason that the truth value of the (propositional content of the) criminal verdict cannot be settled by external means. A veritistic approach to evidence (see Broughton and Leiter’s analysis, Ch. 2, pp. 25-39) is therefore seriously flawed. When a hundred-metre race is finally run, the photo finish will inform us about the winner. When a population group has been vaccinated, the number of (short-term) side-effects will inform us whether the cure is better than the disease. In criminal adjudication, however, we only have and require sufficient grounds for the purposes of determining criminal liability (see Bertràn’s analysis, pp. 40-52). ‘Sufficiency’, or even the very term ‘facts’, as I said before, is normatively constructed. When Wahlberg and Dahlman write that the presumption of innocence is related to ‘the prior probability of guilt’ (p. 59), they approach a procedural device from an empirical perspective. The phrase ‘prior probability of guilt’ is not meaningful.
The reader would also expect that a chapter discussing the proper role of the expert witness (Wahlberg and Dahlman, pp. 53-66) would do more than summarise the mainstream account according to which Daubert was ‘inspired by Carl Hempel and Karl Popper’ (p. 61). This may have been the case, but as far as I can see both Hempel (with his verificationist account) and Popper (with his falsificationist account) are proponents of a now obsolete account of scientific theories (the syntactic view of theories). The latter is the view that empirical knowledge a) is a product of an empirically uninterpreted formal calculus based on formal logic, and b) can be compared to reality. However, since the last quarter of the 20th century, philosophers of science have informed us that it is rather scientific models that ‘occupy central stage’ (B. van Fraassen, The Scientific Image, Oxford 1980, p. 44). For the arguably ‘major philosophical insight’ recovered by the new paradigm in philosophy of science is that ‘statements of physical theory are not, strictly speaking, statements about the physical world’ (R.I.G. Hughes, Models and Representation. In: 64 Philosophy of Science 1996, pp. S325-S336). It would thus make more sense to explore what ‘science’ means before exploring the symbiotic relationship of ‘science and law’. This would get us beyond platitudes such as ‘the expert must not answer questions of law’. For, technically, the expert cannot answer questions about legal facts either. For example, no anatomist can inform a jury in the context of a rape case about the meaning of the term ‘vagina’ (in England and Wales, the vulva is also part of the semantics of ‘vagina’ in the context of rape, see s. 79(9) SOA 2003). Remember: the empirical (actus reus and mens rea) elements of offences or defences are themselves normatively constituted.
In Part II (‘Law and Factfinding’, pp. 69-153) of PFEL, Schauer inter alia helps the reader move away from the Benthamite image of a laissez-passer fact-finding method (which in view of ethical regimes in scientific research would not describe research conduct in natural sciences either) by drawing attention to exclusionary rules. Schauer contrasts exclusionary rules in common law countries with the ‘system of free proof’ (p. 70) in civil law systems. He claims that
most civil law countries persist in something resembling what has come to be known as the free proof tradition, the substantially rule-free approach to considering the facts that bear on the existence of criminal or civil liability (p. 69).
Whilst comparative analysis is useful for a deeper understanding of a system’s architecture, caution is needed, especially when one claims that ‘[e]xclusionary rules thus represent an approach diametrically opposed to the free proof idea’ (p. 73). Vis-à-vis continental jurisdictions (e.g. the German-speaking ones), we need to differentiate between free assessment of evidence (freie Beweiswürdigung) and free proof (Freibeweis). The former means nothing but the absence of evidentiary standards which specify the quantity and quality of proof required for ascribing guilt (e.g. a full confession or two good witnesses). The latter means the absence of procedural restrictions for adducing evidence. In continental jurisdictions it simply is not true that the ‘free proof tradition’ consists in a ‘substantially rule-free approach to considering the facts’ (p. 69). For one, the fact-finder in German-speaking countries is not equipped with unbridled discretion. The process of adducing evidence for proving elements of substantive law is subject to an – unsurprisingly – austere (Strengbeweis) and thickly regulated procedural framework (see paras 244-257 German CCrPr). For another, exclusionary rules (Beweisverbote) are widely present and can be conceptualised as a constraint on the inquisitorial default rule. Although terminology may vary, a basic distinction needs to be made between rules prohibiting the gathering of certain evidence (Beweiserhebungsverbote) – with further distinctions between prohibitions regarding particular topics (Beweisthemenverbote), prohibitions regarding certain means of evidence (Beweismittelverbote), and prohibitions regarding specific methods (Beweismethodenverbote) – and rules prohibiting the use of illegally generated evidence (Beweisverwertungsverbote). Exclusionary rules and a structured way of proving the elements of offences/defences (Strengbeweis) are part and parcel of the law of evidence in continental jurisdictions.
Comparative analysis cannot afford to neglect that.
Exclusionary rules are further conceptualised by Holroyd and Picinali, who focus on the issue of (moral) integrity (pp. 83-95). The focus then shifts to the probative value of naked statistical evidence, with Alex Stein making insightful connections between personal autonomy and the need for ‘maximal individualized scrutiny’ of the evidence, which should thus not ‘be purely statistical’ (p. 97).
In a similar through-the-vegetable-garden fashion, Part III of PFEL grapples with questions of evidence, language, and argumentation. Floris Bex (pp. 183-197) draws our attention to the inferential structure of reasoning patterns and ‘evidential arguments’, with particular emphasis on ‘defeasible reasoning’. He submits that in the context of witness testimony assessment ‘generalizations are often “default rules”, which means that if we have no reason not to believe the witness, we can draw conclusions from their testimony’ (p. 184). Although it is unclear what is meant by that (can we not draw conclusions whenever we disbelieve the witness?), defeasibility seems to capture a crucial aspect of legal reasoning. We have much to learn, he suggests, because while ‘legal reasoning […] seems different from evidential reasoning […] the reasoning mechanisms are very much related’ (p. 186). I shall come back to this point. Defeasible arguments, Bex contends, ‘allow us to explicitly link the evidence to (legal) conclusions’ (p. 193). For defeasibility warrants, he adds, ‘an argument [to] be rebut […] by new information which leads to, for example, an argument for the opposite conclusion or an exception to a generalization’ (p. 187).
A couple of remarks are in order at this juncture. Legal reasoning flows into a decision, not into a conclusion. And jurors ascribe criminal liability; they do not infer the latter from the evidence. This is not a point about semantics but pertains to the procedural architecture of the criminal process. The question is, as Bex correctly identifies, what the argumentative structure of this decisional process is. Whilst defeasibility is a very good candidate, the necessary conceptual (and doctrinal) vocabulary is missing from Bex’s chapter. What can be ‘rebutted’ is, technically, not the ‘argument’ but the presumed fact which is by default inferred from the presumption-raising fact, unless one of the specified defeaters applies. The ‘argument’ is the procedural space of reasons in which all this takes place, in the same way that, for example, the presumption of innocence authorises either a conviction (when sufficient proof has been adduced) or an acquittal (in any other case).
Then there is the issue of the rather artificial and unwarranted distinction between ‘evidential reasoning’ on the one hand and ‘legal reasoning’ on the other. These two – in Bex’s view separate – domains can share reasoning mechanisms (p. 186). This, however, misses a crucial aspect of the law. For example, the defendant will be acquitted unless there is sufficient evidence against him. No fact-finder will deal with or need to rule out every possible defence unless one of the latter becomes a live issue at trial. What is more, by making a defence a live procedural issue, the defendant against whom sufficient proof of an offence has been brought is entitled to an acquittal, unless the prosecution disproves the existence of the defence to the requisite standard of proof. According to English criminal law (ss. 74-75 Sexual Offences Act 2003), a person who was asleep or otherwise unconscious at the time of the relevant sexual act [presumption-raising fact] is taken not to have consented to the relevant act [presumed fact] unless sufficient evidence is adduced to raise an issue as to whether that person consented [defeater]. We see that defeasibility characterises not only local presumptions (e.g. consent or fatherhood) but permeates the very fabric of the criminal process (see the presumption of innocence and the default syntax of defences).
It is thus not irrelevant that the legal theorist H.L.A. Hart is widely considered the father of defeasibility (in Hart’s later retracted paper, The Ascription of Responsibility and Rights. In: 49 Proceedings of the Aristotelian Society (1948-1949), pp. 171-194, we find the first clear pronouncement of the term ‘defeasibility’). For Hart looked closely at law’s language games, excavated, and coined the concept of ‘defeasibility’, which is part and parcel of the legal system. Remember: experience void of concepts is ‘only intellectually blind, not inoperable’ (D. Moyal-Sharrock, Understanding Wittgenstein’s On Certainty. Palgrave Macmillan 2007, p. 205). Legal orders qua complex adaptive systems have thus been structured (I dare say: bottom-up) in a defeasible way. It would thus be helpful for logicians to clearly articulate these reasoning patterns, but the main point here is that legal officials, including fact-finders, already operate within defeasible reasoning structures – whether this is clear to them or not. By separating legal reasoning from its (allegedly) evidential counterpart only to reunite them through defeasible reasoning patterns, we fail to understand that (the law of) evidence, including evidential reasoning, is by definition a jurisprudential domain.
A considerable percentage of criminal trials rely heavily on expert evidence. Naturally, discussions around forensic science, with particular emphasis on the validity and use of forensic evidence, could not be absent. Briefly but incisively, Taroni, Bozza, and Biedermann openly question the empirical basis and underlying probabilistic framework of forensic science, understood as ‘a body of scientific principles and technical methods applied within well-defined proceedings in criminal or civil law’ (p. 251) – with forensic genetics being the only exception. Taroni et al. raise serious questions about mainstream areas of forensic science which examine shoe marks, tool marks, gunshot residues, handwriting etc. The authors make clear that the examination of such traces can hardly be anything more than an ‘ad hoc assessment’ (p. 252). Consequently, the authors make the case for the introduction of a rigorous probabilistic framework (p. 254). The word/name ‘Bayes’ is in the air, although the authors are careful enough to clarify that there is ‘no suggestion that the responsibility for reasoning and decision-making processes is to be delegated to an abstract mathematical theory’ (p. 254). Amen to that!
Bayes’ Theorem is derivable from a standard definition of conditional probability and the axioms of probability. There can thus be no meaningful (jurisprudential) objection to a mathematical theorem as such. This is not just a matter of words insofar as Fenton and Lagnado (pp. 267-286) criticise lawyers’ alleged ‘reluctance to accept Bayes in the law’. They contend that this ‘reluctance’ is to be viewed as a symptom of ‘a long-time historical reticence to accept any statistical analysis as valid evidence’ (p. 269). But as I mentioned above, Bayes’ theorem qua piece of mathematics is uncontroversial. What is applicable or subject to criticism (depending on one’s views) is only Bayesianism, which is not a theorem but an epistemology. Whereas Bayes’ theorem shows how different things relate synchronically, it is Bayesianism that proposes how we should revise (diachronically) our degrees of belief in the light of new evidence. What is more, we should deploy, Fenton and Lagnado seem to suggest, statistical analysis to consider a proposition such as ‘O.J. Simpson either did or did not murder his wife’ (p. 267). That would of course be doable; alas, there is a big problem. The proposition ‘A murdered B’ is not an empirical one to be formally analysed insofar as we cannot reduce the legal term ‘murder’ to the physical act of, say, stabbing someone in the chest. Any meaningful and ambitious conversation on establishing smooth communication between fact-finders and experts in empirical fields presupposes conceptual clarity on both sides of the fence. This becomes relevant insofar as Fenton and Lagnado offer the diagnosis of ‘fallacies of probabilistic reasoning […] in legal proceedings’ being ‘a sad indictment of the lack of impact made by statisticians in general (and Bayesians in particular) on legal practitioners’ (p. 282 – emphasis added).
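The derivability claim can be set out in two lines (a standard textbook derivation, not drawn from PFEL): applying the definition of conditional probability in both directions and eliminating the joint probability yields the theorem.

```latex
% Definition of conditional probability, applied in both directions:
P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad
P(B \mid A) = \frac{P(A \cap B)}{P(A)}
% Solving the second equation for P(A \cap B) and substituting into the first:
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
```

Nothing in these two lines is open to jurisprudential dispute; the dispute can only concern Bayesianism, i.e. the epistemological claim that degrees of belief ought to be revised by this rule.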
Whilst it is accurate that judges and laypeople find conditional probabilities challenging, Fenton and Lagnado’s diagnosis is partisan. Just as I was writing these lines, the forensic scientist in R v Dunster (EWCA Crim 1555 at [86]-[87]) committed the infamous prosecutor’s fallacy. The forensic scientist is reported to have had two propositions to consider:
- The DNA has originated from [MD] and two unknown individuals; or
- The DNA has originated from three unknown individuals.
The scientist concluded
that the first proposition [involving MD] is one billion times more likely than the second.
Equally ‘sad’ – to use Fenton and Lagnado’s emotive language – is therefore the fact that in Dunster, as in a myriad of other cases, the expert witness, who is supposed to be trained in statistics, not only inverted the conditional probability (which is known as the prosecutor’s fallacy) but also addressed the ultimate issue (I call that the expert witness’s fallacy).
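The structure of the fallacy can be made concrete in a short sketch (all numbers are hypothetical and chosen for illustration only; they are not the figures from Dunster, and the function name is mine): what the expert’s ‘one billion times more likely’ legitimately expresses is a likelihood ratio, which only yields posterior odds once combined with prior odds that are not the expert’s to supply.

```python
# Illustrative sketch of the prosecutor's fallacy (hypothetical numbers,
# not taken from R v Dunster). An expert's likelihood ratio compares
# P(evidence | H_prosecution) with P(evidence | H_defence); stating it
# as the odds of the propositions themselves transposes the conditional.

def posterior_odds(likelihood_ratio: float, prior_odds: float) -> float:
    """Odds form of Bayes' theorem: posterior odds = LR x prior odds."""
    return likelihood_ratio * prior_odds

lr = 1e9        # 'one billion times more likely' - a likelihood ratio
prior = 1e-6    # hypothetical prior odds, chosen purely for illustration

post = posterior_odds(lr, prior)
prob = post / (1 + post)  # convert odds to a probability

# Treating the LR itself as the posterior odds (the fallacy) would
# suggest near-certainty; combined with the prior, the result differs.
print(f"posterior odds: {post:.0f}")
print(f"posterior probability: {prob:.4f}")
```

The point is not that any particular prior is correct – the review argues precisely that a ‘prior probability of guilt’ is not a meaningful quantity – but that the expert’s figure, standing alone, is not a statement about the propositions at all.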
What is left at the end is a rather peculiar approach to the law of evidence which does not clarify or illuminate conceptual structures but projects theoretical (normative) propositions onto its very object of inquiry. This can only be described as the Procrustes approach. Consider for example Dahlman and Kolflaath’s ‘concluding remark’ that the ‘problem of the prior [probability of guilt] in criminal trials can be solved in different ways, and no solution is free from objections’ insofar as ‘[e]very solution compromises some fundamental value of criminal procedure’ (p. 299 – emphasis added). Given the unhappy ending, which necessitates a mutilated criminal procedure, the question is what kind of solution we are then talking about. The problem at this point is not the undecidability of the ‘prior probability of guilt’ but the fundamental insight that the ascription of guilt is not an empirical proposition (to be accepted in case of a sufficiently high (posterior) probability). The discussion about the allegedly correct ‘prior probability of guilt’ falls prey to a category error and neglects the ‘Is’ and ‘Ought’ distinction.
Whereas so far in PFEL we saw the superimposition of theoretical propositions from empirical disciplines or from analytic ones onto the fabric of the law of evidence, it is the Is-Ought distinction itself that is called into question in the last part of the book. In chapter 24, a ‘feminist approach’ to the law of evidence in general and to the concept of relevance in particular is advanced. Simon-Kerr (pp. 364-379) claims among other things that
the English common law […] placed white women in a private domestic sphere with few independent rights to speak in courts of law. Black women’s voices were even more attenuated in the courts (p. 364).
At the same time, she contends, evidence theory has played a role in being ‘explicitly biased against women’, aggravating the impact of ‘discriminatory laws’. A good example of this, Simon-Kerr notes, is the ‘systemic disbelief of women who are victims [sic] of sexual assault or domestic violence’ (p. 365). Of course, calling someone a ‘victim’ even before, and regardless of, the criminal verdict begs the question of ascribing guilt and pre-empts the outcome of the criminal process, which needs to be fair, just, and in accordance with the defendant’s presumption of innocence. The main question, however, is how the challenges that women have historically faced relate to the formal (in a non-mathematical sense: probabilistic) conceptual framework which admits only relevant information into the criminal trial. But Simon-Kerr disputes that. A central claim of ‘feminist accounts of evidence’, she informs us, is that ‘this system of procedural rules is neither neutral nor value-free’ (p. 365). She quotes a ‘postmodernist feminist theorist’ who claims that
traditional evidence scholarship and legal positivism more generally, with their focus on the importance of logic, have masked the “inherently political and partial nature of law and facts.” (p. 367).
Although it is tempting to hear that traditional evidence scholarship is nested within the tradition of legal positivism (this is not true, I’m afraid), and although it is, frankly, surprising to hear that ‘facts’ have an ‘inherently political and partial nature’ (isn’t this a textbook definition of a category error?), it is to the alleged political nature of the law of evidence, allegedly masked by legal positivism, that I shall now turn. Students of jurisprudence learn that it only takes knowledge of the English language and grammar to understand inter alia what ‘purity’ refers to when we are discussing the ‘Pure Theory of Law’. By linguistic convention it is the theory (which systematises currently valid law) that needs to be ‘pure’, i.e. scientific – not the law itself. No legal positivist who is worth his or her salt has ever disputed that the law embeds political and moral values. The law – in its most simplified form – is rather a set of political and moral values enshrined in normative propositions. But it is the task of legal dogmatics to systematise and concretise (morally and politically loaded) abstract legal rules and achieve (procedural) justice by treating similar cases similarly and different cases differently – whatever the said values are. As soon as a legal order comes to the realisation that similar cases are by law dealt with differently, the law is sooner or later amended. It is thus not clear why Simon-Kerr grapples with the old defence of provocation, which (e.g. in England and Wales but also elsewhere) required the loss of control to be ‘sudden’ – a requirement since abandoned by s. 54(2) of the Coroners and Justice Act 2009.
Although substantive criminal law and the defence of provocation did not (and arguably could not) take into account that, in the context of domestic abuse, the act of killing the abuser is often (especially when the killer is a woman) the result of a slow-burn reaction rather than an immediate loss of self-control, this has nothing to do with the evidential mechanism, let alone with the concept of relevance for proving the elements of the defence. Criminal law can be severely flawed from a sociological or political point of view; fact-finders can also have strange ideas about society in general. Either way, the procedural device of relevance is not to blame for any of that. Garbage in, garbage out.
Even more worrying is Simon-Kerr’s call to swap ‘rationalist approaches to knowledge’ (p. 367) in general, and the law of evidence in particular, for ‘alternative ways of knowing’ (p. 366). She urges the wholesale replacement of ‘the experiences of white men who acted as judges and who comprised the vast majority of jurors’ with ‘Black feminist epistemology’ and ‘alternative ways of validating truth’ (p. 366). It is also not clear in what relation sex and race stand to each other here (assuming, despite overwhelming evidence to the contrary, that black and white people form homogeneous groups), or why certain races are written with an uppercase initial (‘Black’) while others are not (‘white’).
The fact remains that an understanding of the law of evidence, including the latter’s philosophical foundations, is not Simon-Kerr’s objective. On the contrary, she declares that ‘[l]aw reform has always been at the core of feminist work to achieve more practical justice’ (p. 371). Again, the problem is not just that ‘justice’ will be assessed against the political demands of a certain reductive ideology. The deeper and more alarming problem, as Susan Haack (whose absence from PFEL, indeed from any edited volume on the philosophical foundations of evidence law, cannot go unnoticed) points out, is that feminist jurisprudence engineers an anything-goes ‘feminist methodology muddle’ resulting in ‘alternative ways of knowing’, although we have not heard why such alternative knowledge would be valid, functional or even relevant. Approaches similar to Simon-Kerr’s chapter result in an unwarranted and unhelpful ‘politicization of inquiry, the deliberate blurring of the line between honest investigation and disguised advocacy; which both corrupts inquiry […] and leaves advocacy without the firm factual basis it needs’ (Susan Haack, ‘The “Feminist Methodology” Muddle’, 4 Gavagai (2018), pp. 184-187 (185)). To paraphrase Solzhenitsyn slightly, an ideologically loaded (Neo-Marxist) and reductionist tool such as sex or race is as crude an instrument for explaining the law of evidence ‘as if a surgeon were to perform his delicate operations with a meat ax’ (Aleksandr Solzhenitsyn, Address delivered in Washington, D.C. on June 30, 1975).
Conceptualising ‘feminist relevance’ is the equivalent of discussing the value of Bayesian socialism. The mere juxtaposition of these words reveals a category error. At the heart of this type of project lies the effort to attack the rationalist (philosophical) foundations of evidence law, which are simultaneously the pillars of the Enlightenment project: logic, objectivity, the analytic approach, truth. Those among us who cherish these values and do not conflate analysis with personal political views are criticised by the father of postmodernism, Jacques Derrida, as ‘dangerous dogmatists’ and apologists of the ‘good old Aufklärung’ (Jacques Derrida, Limited Inc, Evanston 1988, p. 119). In the field of postmodern studies, where it has become increasingly difficult to detect what is sincere and what is satire or a hoax article, the proponents of ‘feminist relevance’ are more than free to swap the power of logic for the logic of power. As a project it remains, ironically, irrelevant to the understanding of relevance, i.e. one of the fundamental conditions for the admissibility of evidence.
The plethora of chapters can be described as a 26-course dinner menu. Although the chapters are, in my opinion, of varying quality, the main problem is structural. PFEL offers 26 main dishes dealing with complex issues, often making normative (or even false descriptive) claims about the law of evidence, alas confined within the word-count cage of the equivalent of an amuse-bouche – a combination of the worst of both worlds (e.g. Schauer is at pains to inoculate his chapter against this line of criticism by stressing that he can only offer a ‘brief sketch’ which is ‘somewhat of a caricature of the free proof idea’, p. 70).
The word philosophy ‘must mean something whose place is above or below’ the sciences; it cannot be ‘beside them’, says Wittgenstein (Tractatus Logico-Philosophicus, 4.111). In order to get at the philosophical foundations underpinning the law of evidence, and to elucidate the latter’s nomodynamics, we need to take a long hard look at the edifice of the law of evidence as it is, not as it should be (in a philosopher’s or political activist’s opinion). Philosophy can help us humdrum evidence lawyers articulate what our complex system of rules and principles does by clarifying concepts and structures. Offering an à la carte menu by superimposing theoretical objects on the fabric of evidence law is in effect to advance an imperialist style of philosophy whose taste is rather bitter – especially when the mignardise being served is political activism and reductive ideology (be it about race or gender) disguised as scholarship.
‘Don’t think, but look!’ That was the advice Wittgenstein gave to his fellow philosophers (Philosophical Investigations, § 66). Having read PFEL carefully, I think I can hazard a guess as to what type of approach Wittgenstein was referring to.
Kyriakos N. Kotsoglou, Northumbria University, Newcastle upon Tyne, U.K.