Thursday, September 23, 2010

Behavioral Biometrics or Public Lie Detectors?

CIO.com
The linked article is confusing and heartening at the same time.

It is easily divided into two parts: a discussion of the efforts of some in the research community to bring lie detectors out of interrogation rooms and into contact with the public, and a brief summary of findings that public views on biometric identity management techniques differ from the way so-called privacy advocates frame the issues.

The reason that it is confusing is that the two parts of the article don't belong together.

The discussion of the Future Attribute Screening Technology (FAST) prototype has virtually nothing to do with public acceptance of biometric identity management techniques.

The techniques described under the label Behavioral Biometrics are akin to lie detector tests. They rely on detecting changes in bodily function in response to some outside stimulus, such as interrogation, and seek to determine intent. Moreover, FAST attempts to automate this analysis as much as possible. This is like going from "Lie to Me" to "Minority Report".

["Lie to Me" is a current* TV show chronicling the adventures of one Dr. Lightman (Tim Roth), the world's greatest human lie detector. "Minority Report" takes place in a dystopian future where criminals are caught before crimes are committed.]

There's ample evidence that the "Lie to Me" scenario is at least reasonable. It is possible to train professional interviewers who can ferret out lies and attempts to deceive with a high degree of accuracy. In fact, these professionals are actively doing the job FAST attempts to automate every day in our airports and police stations.

While I am unqualified to make assertions of fact about the feasibility of building a machine that functions with high reliability along the lines envisioned by FAST's creators, my guess is that FAST, or a similar system, is not only possible in theory but highly likely to exist in reality in the not-too-distant future, so long as current growth rates in scientific knowledge and computational power continue.

But even if we accept that a FAST-like system will be technically possible in the future, say fifteen years from now, no current researcher could possibly say anything useful about whether or not the culture fifteen years from now would find it acceptable to use such tools in public places upon ordinary people without probable cause. Predictions about social views on technology fifteen years in the future more appropriately belong to the genre of science fiction than opinion polling.

None of this is to discredit the University of Pittsburgh's Dr. Lisa Nelson, whose study of biometrics and public views about it reveals tolerance and support when it comes to government use of biometrics to protect public safety.

Although privacy advocacy groups are supposed to represent the public, Nelson said her studies based on focus groups show that "there are differences between public perception and how privacy advocates were framing the issues," with the larger public apparently far more willing than privacy-advocacy groups to accept biometrics when it's used for purposes of protecting against terrorism or identity theft.

This summary of Dr. Nelson's findings rings true. There does seem to be a significant disconnect between self-appointed privacy advocates and the public they claim to represent where issues of biometric identity management are concerned. But this has little bearing on FAST and other far-off technologies. To tie Dr. Nelson's findings to FAST does a disservice to Dr. Nelson and perhaps even misrepresents the views of FAST's creators.

*UPDATE: The show has since been canceled.