Monday, March 5, 2012

New Statistical Model Assigns Probability to Fingerprint Evidence

Statistical model removes barriers to using fingerprint evidence in court (Homeland Security NewsWire)
Potentially important fingerprint evidence is currently not being considered in legal proceedings owing to shortcomings in the way it is reported, according to a report published Wednesday in Significance, the magazine of the Royal Statistical Society and the American Statistical Association. Researchers involved in the study have devised a statistical model to enable the weight of fingerprint evidence to be quantified, paving the way for its full inclusion in the criminal identification process.

A Wiley release reports that fingerprints have been used for over a century as a way of identifying criminals. Fingerprint evidence, however, is not currently permitted to be reported in court unless examiners claim absolute certainty that a mark has been left by a particular suspect. This courtroom certainty is based purely on categorical personal opinion, formed through years of training and experience, but not on logic or scientific data. Less-than-certain fingerprint evidence is not reported at all, irrespective of the potential weight and relevance of this evidence in a case.
It may come as a surprise that fingerprint evidence in court cases depends upon expert witness testimony. It is only admitted if an expert claims absolute certainty of a match.

The shortcomings (error rates) of the current system are well described by Cognitive Consultants International (CCI) in their study of actual professional examiners [pdf]. Since the evidence is collected from the chaotic environment of a crime scene and frequently consists of partial fingerprints, a heavy burden falls upon professional examiners, and their methods open the door to errors rooted in the way humans process information. The team found statistically significant inconsistency between examiners, and even within the same examiner at different times.

Abstract:
Deciding whether two fingerprint marks originate from the same source requires examination and comparison of their features. Many cognitive factors play a major role in such information processing. In this paper we examined the consistency (both between- and within-experts) in the analysis of latent marks, and whether the presence of a ‘target’ comparison print affects this analysis. Our findings showed that the context of a comparison print affected analysis of the latent mark, possibly influencing allocation of attention, visual search, and threshold for determining a ‘signal’. We also found that even without the context of the comparison print there was still a lack of consistency in analysing latent marks. Not only was this reflected by inconsistency between different experts, but the same experts at different times were inconsistent with their own analysis. However, the characterization of these inconsistencies depends on the standard and definition of what constitutes inconsistent. Furthermore, these effects were not uniform; the lack of consistency varied across fingerprints and experts. We propose solutions to mediate variability in the analysis of friction ridge skin.
CCI has quantified the error rates of the current system and made proposals to reduce them. They propose a reassessment of how examiners are recruited and trained, and, since different types of latent print lead to different error rates, they recommend further research into the categorization of latent fingerprints.

Alternatively, in Fingerprints at the crime-scene: Statistically certain, or probable? [pdf], Cedric Neumann and Julian Champkin propose a statistical method applied to the minutiae used by examiners in order to generate a probability score for a match, arguing that "DNA experts are required to give probabilities for their evidence of matching; fingerprint experts are forbidden to. This bizarre situation ought to be ended, in the interests of justice as well as of common sense." This is how they do it:

Figure 2 (slightly edited) from Significance, "Fingerprints at the crime-scene".

Historically, in most countries, 12 minutiae that matched each other in type, orientation and position have generally been considered sufficient to identify the source of the mark. Until 2001 the UK required 16 correspondences to establish proof of identity. Both these numbers arose through experience rather than statistical analysis.

The reasoning that currently leads experts from minutiae to identification is essentially a psychological one that cannot be rationalized and rendered explicit. The method that my colleagues and I have presented also relies on those minutiae; but numbers are derived from them.

On any given finger impression, the most prominent minutiae – say six – can be selected and joined up, in a clockwise direction (see Figure 2). They will form a pattern – essentially a six-sided polygon around a centre. (The centre can be defined as the arithmetic mean of the Cartesian co-ordinates of our six points.) A polygon is a much simpler pattern than the whirling lines of a full print or mark. It is also much easier to analyse numerically. The basis of the method is to describe that polygon with a set of variables.
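The polygon construction described above can be sketched in a few lines of Python. This is only an illustrative sketch, not Neumann's actual model: the minutiae coordinates are made-up values, and the descriptors chosen here (centre, side lengths, shoelace area) are simple examples of the kind of numerical variables one might derive from the polygon.

```python
import math

# Hypothetical minutiae coordinates (x, y) -- illustrative values only.
minutiae = [(3.0, 1.0), (5.0, 4.0), (2.0, 6.0),
            (0.0, 3.0), (4.0, 0.0), (1.0, 1.0)]

def polygon_features(points):
    """Join the points clockwise around their centre and derive
    simple numerical descriptors of the resulting polygon."""
    n = len(points)
    # Centre: arithmetic mean of the Cartesian coordinates.
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Order the points clockwise by their angle around the centre.
    ordered = sorted(points,
                     key=lambda p: -math.atan2(p[1] - cy, p[0] - cx))
    # Side lengths of the six-sided polygon.
    sides = [math.dist(ordered[i], ordered[(i + 1) % n])
             for i in range(n)]
    # Enclosed area via the shoelace formula.
    area = abs(sum(ordered[i][0] * ordered[(i + 1) % n][1]
                   - ordered[(i + 1) % n][0] * ordered[i][1]
                   for i in range(n))) / 2.0
    return (cx, cy), sides, area

centre, sides, area = polygon_features(minutiae)
```

A set of variables like these, computed for both the crime-scene mark and a suspect's print, is the kind of input a statistical model can compare to produce a probability rather than a bare yes/no.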


h/t @MDKConsulting