Monday, March 18, 2013

Face rec false rejects, organizational false accepts and ROI

Britain's passport and ID service seeks facial recog tech suppliers (The Register)
The Home Office plans to spend up to £16m on facial recognition technology for the Identity and Passport Service.

A tender notice in the European Union's Official Journal (OJEU) popped up this week that showed that Theresa May's department was now on the hunt for providers of a Facial Recognition Engine and a Facial Recognition Workflow for the IPS.
The article then proceeds to a brief discussion of the pros and cons of the tender. The pros follow from the benefits of searching a facial database before issuing new photo ID documents, in this case British passports. The cons presented in the article come in two flavors: price and performance.

The money issues are common to any governmental expenditure.

The performance issue in the article that I want to address is the "false reject rate." The false reject rate of a facial recognition system in the case at hand should be taken apart and put into two categories: the performance of the core face-matching technology, and the performance of the entire Home Office organization.

In the core technological sense, a "false reject" is any "match" the face recognition system makes between a submitted image and an image in the searchable database that turns out to be incorrect. In other words, candidate "matches" that aren't real matches are false rejects.

But in this case, the Home Office is ultimately judged by how many bad passports it issues (false accepts), not by the perfection of any one mechanism within the rigorous process by which the organization arrives at its go/no-go decision. After all, if my name is John Smith and I submit my passport application to the Home Office, they will probably search their databases for "John Smith." If they find several, does that constitute an automatic false reject? Does that mean I can't get a passport? Of course not. Someone will look at the list of John Smiths to see whether I'm pretending to be someone else with the same name.

Here, facial recognition is used to add an image capability to go along with the search the Home Office already does with new passport applications. It is not an automated decision-making engine. Even though facial recognition systems at very large scales or in chaotic environments are very difficult to automate, they can be extremely useful investigative tools for trained users.
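
To make that concrete, here is a rough sketch, in Python, of what "investigative tool, not decision engine" looks like in practice. Nothing below is the IPS's actual system; the function names, the cosine-similarity scoring and the gallery structure are stand-ins I've made up, and the real engine would come from whichever vendor wins the tender.

import numpy as np

def top_candidates(probe_template, gallery, k=10):
    """Score one applicant's photo (as a feature vector) against every
    enrolled template and return the k best-scoring records for review."""
    scores = []
    for record_id, template in gallery.items():
        # Cosine similarity stands in for whatever score the real engine produces.
        score = float(np.dot(probe_template, template) /
                      (np.linalg.norm(probe_template) * np.linalg.norm(template)))
        scores.append((record_id, score))
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:k]

def review_queue(application_id, shortlist):
    # The go/no-go decision stays with a person; the shortlist is a lead,
    # not an automatic rejection.
    return {"application": application_id,
            "candidates_for_examiner": shortlist,
            "decision": "pending human review"}

The output is a queue for a trained examiner, exactly like the list of John Smiths above, just built from images instead of names.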

Humans are pretty good at matching faces with small data sets. The processes people use to identify other people with high confidence levels are extremely complex and may take into account all sorts of information that facial recognition software doesn't. People, however, aren't very good at identity management among large numbers of people they don't know.

In biometrics, the software takes in a mere fraction of the information people use. It doesn't make any inferences about it, and it does its job extremely quickly by attacking the problem in precisely the way Nikola Tesla mocked in his famous critique of Thomas Edison: “If Edison had a needle to find in a haystack, he would proceed at once with the diligence of the bee to examine straw after straw until he found the object of his search.”

When dealing with people we don't know, humans are relegated to the same needle-in-the-haystack process and, unfortunately, they do it so slowly as to make it impractical with large data sets. Even if you believe that computers running facial recognition software aren't very good at recognizing people, they're far better at dealing with the problem of large populations than people are.
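
Some rough arithmetic makes the scale point. The rates below are invented for illustration, not measurements of any real examiner or any real matching engine, but the orders of magnitude are what matter.

GALLERY_SIZE = 300_000                      # records to check one applicant against
HUMAN_SECONDS_PER_COMPARISON = 3.0          # assumed pace for eyeballing one pair of photos
MACHINE_COMPARISONS_PER_SECOND = 1_000_000  # assumed template-matching throughput

human_hours = GALLERY_SIZE * HUMAN_SECONDS_PER_COMPARISON / 3600
machine_seconds = GALLERY_SIZE / MACHINE_COMPARISONS_PER_SECOND

print(f"Human, straw by straw: roughly {human_hours:.0f} hours per applicant")
print(f"Machine, straw by straw: roughly {machine_seconds:.1f} seconds per applicant")
# roughly 250 hours vs. roughly 0.3 seconds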

The assumption buried in the "false reject" critique of this face-rec application is that narrowing a list of 300,000 down to ten possible matches represents 9 failures. More accurately, because before face rec no image-based comparison was being conducted at all, it represents 299,991 successes.
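
Counted out explicitly, and assuming, as the "9 failures" framing does, that one of the ten candidates is the genuine hit:

population = 300_000        # records searched for one application
shortlist = 10              # candidates returned for human review
genuine_hits = 1            # assume the real match is among them

false_candidates = shortlist - genuine_hits              # 9: the so-called failures
correct_exclusions = population - shortlist              # 299,990 records correctly screened out
correct_decisions = correct_exclusions + genuine_hits    # 299,991 successes

print(false_candidates, correct_decisions)               # 9 299991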

When biometric software is used to sort a large population according to the probability of a match, and then to present the list of top candidates to a person trained to detect fraudulent passport applications, the result is a fraud-detecting capability that did not exist before. So, even though facial recognition software by itself may have a "false reject" rate, it does not operate in a vacuum, and it will almost certainly help the organization as a whole reduce the inappropriate issuance of passports, i.e. its "false accept" rate.

So we finally arrive where we should have been headed all along: Return on Investment (ROI). ROI can be hard to calculate in security applications. It can also be hard to calculate for government expenditures, but ROI is where the rubber meets the road. The proposition does not turn on whether facial recognition can dictate to human beings whether or not to issue a passport. It can't, and even if it could, most people would probably be uncomfortable giving up their right to appeal to a person in a decision-making capacity. Facial recognition can certainly help people make better decisions, though, and biometrics and ID are ultimately all about people.