You walk into your shower and find a spider. You are not an arachnologist. You do, however, know that one of the following options is possible:
The spider is real and harmless.
The spider is real and venomous.
Your next-door neighbor, who dislikes your noisy dog, has turned her personal surveillance spider (purchased from “Drones ‘R Us” for $49.95) loose and is monitoring it on her iPhone from her seat at a sports bar downtown. The pictures of you, undressed, are now being relayed on several screens during the break of an NFL game, to the mirth of the entire neighborhood.
Your business competitor has sent his drone assassin spider, which he purchased from a bankrupt military contractor, to take you out. Upon spotting you with its sensors, and before you have any time to weigh your options, the spider shoots an infinitesimal needle into a vein in your left leg and takes a blood sample.
As you beat a retreat out of the shower, your blood sample is being run on your competitor’s smartphone for a DNA match. The match is made against a DNA sample of you that is already on file at EVER.com (Everything about Everybody), an international DNA database (with access available for $179.99).
Once the match is confirmed (a matter of seconds), the assassin spider outruns you with incredible speed into your bedroom, pausing only long enough to dart another needle, this time containing a lethal dose of a synthetically produced, undetectable poison, into your bloodstream. Your assassin, who is on a summer vacation in Provence, then withdraws his spider under the crack of your bedroom door and out of the house, and presses its self-destruct button. No trace of the spider or the poison it carried will ever be found by law enforcement authorities...
Showing posts with label future. Show all posts
Monday, August 20, 2012
Technology & the Future of Violence
Not really biometrics related... but that's pretty much the point. (Hoover.org)
Thursday, June 28, 2012
First-hand account of a "Vacation in Utopia"
A delightful vignette of a possible biometric future written by Dr. Ben Ajayi...
You are unique and wonderfully made (Nigerian Tribune)
The billboard was not done with me as it flashed another page for me to read, “Dr. Ben, you are unique and wonderfully made. Of all persons alive today and even those who have lived before you since the beginning of time not one is like you. Your voice is special; no other fingerprints are like yours; no-one looks like you; speaks exactly like you; laughs like you; walks like you with your exact weight, height and mannerism. We use all these facts in our data base to identify you. Do please feel free to enjoy our country.”
h/t @m2sys
Tuesday, November 8, 2011
Dishonesty detectors: Flawed technology?
We've been generally skeptical of applied behavioral biometrics (and biostatistics) in security applications. In the text quoted below, the author of the linked article nails the reason we're unlikely to see these technologies deployed for a very long time. It's a variation on the return-on-investment argument for adopting a given security solution.
Who knows what evil lurks in the hearts of men? (Smart Planet)
Even if we put aside reservations about self-reported scores on trials under unspecified conditions and grant that FAST is a technology in its infancy, that track record doesn’t inspire confidence. No one should be satisfied with a screening method that lets through more than one out of every five would-be plane bombers. Far more annoying, however, is that we don’t know exactly what the rates of false positives and false negatives were. A system that missed 20 percent of the terrorists in an airport would be bad, but terrorists are rare, so disastrous mistakes would be few. But a system that snared 20 percent of innocent travelers as terrorist suspects would destroy air travel overnight.

Even while extending the author's benefit of the doubt, for airports especially the (negative) return on investment would be crippling.
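The base-rate arithmetic behind that last point is worth making explicit. The sketch below uses purely illustrative numbers (the passenger volume and attacker count are assumptions, not figures from the article) to show why the false-positive rate, not the false-negative rate, dominates the cost:

```python
# Illustrative base-rate arithmetic: why false positives, not false
# negatives, dominate the cost of behavioral screening at airports.
# All numbers below are assumptions for the sake of the example.

passengers_per_day = 2_000_000  # assumed daily screening volume
terrorists_per_day = 1          # generously high base rate, for illustration

false_negative_rate = 0.20      # misses 1 in 5 actual attackers
false_positive_rate = 0.20      # flags 1 in 5 innocent travelers

missed_attackers = terrorists_per_day * false_negative_rate
flagged_innocents = passengers_per_day * false_positive_rate

print(f"Attackers missed per day:  {missed_attackers:.1f}")
print(f"Innocents flagged per day: {flagged_innocents:,.0f}")
```

Because attackers are vanishingly rare, a 20 percent miss rate costs little in absolute terms, while the same 20 percent error rate on the innocent side pulls hundreds of thousands of travelers a day into secondary screening.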
In order to see how, let's imagine a system integrator's dream deployment and then see how that environment differs from an airport.
A system like this would detect all sorts of biostatistics and then compare them to some "normal" value, allow for a tolerance and then alert administrators if something is out of a certain range. If someone wanted to deploy a system like this and give it the best possible chance of success, it would make sense to seek out an environment where "normal" is a very narrow range, rather than a very wide range. The test designer would naturally gravitate toward a test environment where the test subjects make up a homogeneous group, a place where there is cultural uniformity, narrow age differences, low novelty, etc. If I'm the tester, I'm thinking prison first, then military base.
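A minimal sketch of the scheme just described, comparing each reading to a "normal" baseline with a tolerance band and alerting on anything outside it. The baselines and tolerances here are entirely hypothetical:

```python
# Minimal sketch of baseline-plus-tolerance screening.
# Baseline values and tolerances are hypothetical placeholders.

def out_of_range(reading, baseline, tolerance):
    """True when a reading deviates from its baseline by more than the tolerance."""
    return abs(reading - baseline) > tolerance

# Hypothetical "normal" values for a narrow, homogeneous population:
# name -> (baseline, tolerance)
baselines = {
    "heart_rate_bpm": (70.0, 15.0),
    "skin_temp_c": (33.5, 1.5),
}

def screen(subject_readings):
    """Return the names of any biostatistics that should trigger an alert."""
    return [name for name, value in subject_readings.items()
            if name in baselines and out_of_range(value, *baselines[name])]

print(screen({"heart_rate_bpm": 120.0, "skin_temp_c": 33.8}))
```

Note what the homogeneity assumption buys: the narrower the population's "normal," the tighter the tolerance bands can be, and the more discriminating each alert is.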
Now airports, by their very nature, serve people of all ages from all over the world in various mental, physical and emotional states: not smooth sailing for testing sensitive equipment or for training TSA staff to make judgments on small fluctuations of observed data. In airport use, either the error rates would have to be very small, or the biostatistic examination could serve as only a small factor in security decision-making.
If we withdraw that benefit of the doubt and hypothesize that those most likely to want to bring harm to global commerce and air travel might undertake training to control their biostatistics and subvert the security they afford, I'm guessing that airports will be one of the last places to adopt such a system. It'll be too costly in all sorts of ways for too little return.
See also:
Security: Biometrics vs. Biostatistics (Sept. 15, 2011)
Behavioral Biometrics or Public Lie Detectors? (Sept. 23, 2010)
Mal-intent may be the future of security (June 1, 2010)
Thursday, December 23, 2010
Imagine a front desk without people
For hotels, it cuts costs (Chicago Tribune)
Consider this: You go to the front desk to check in, your image is captured on computer, you're given your room number but no key card. You get to your room and a facial-recognition reader opens the door for you. Or maybe you just check in at a lobby kiosk and bypass the front desk altogether. Or you use your credit card to enter your room.

Biometrics figure to be the linchpin in a host of revolutionary and cost-saving technologies and processes.
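The check-in flow described above can be sketched in a few lines. Everything here is hypothetical: a real system would use a proper face-template extractor and matcher, not the toy feature vectors and similarity score below.

```python
# Toy sketch of keyless check-in: enroll a face "template" at the front
# desk, then match against it at the room door instead of a key card.
# Templates, threshold, and matcher are all illustrative stand-ins.

def similarity(a, b):
    """Cosine similarity between two feature vectors (stand-in matcher)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.95  # hypothetical operating point

guests = {}  # room number -> enrolled template

def check_in(room, template):
    """Capture the guest's template at the desk; no key card issued."""
    guests[room] = template

def door_unlocks(room, template):
    """The door reader matches the presented face against the enrolled one."""
    enrolled = guests.get(room)
    return enrolled is not None and similarity(template, enrolled) >= MATCH_THRESHOLD

check_in("412", [0.2, 0.9, 0.4])
print(door_unlocks("412", [0.21, 0.88, 0.41]))  # same guest, slightly different capture
print(door_unlocks("412", [0.9, 0.1, 0.2]))     # a different face
```

The choice of threshold is where the ROI argument lives: set it too loose and strangers open doors; too tight and legitimate guests are locked out and the front-desk savings evaporate.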
The return on investment (ROI) is significant.
Thursday, May 20, 2010
Google debates face recognition technology
FT.com
Mr Schmidt said: “Facial recognition is a good example . . . anything we did in that area would be highly, highly planned, discussed and reviewed. When you go through these things, you review your management procedures.”

Apart from Google's history with privacy issues, they do face a dilemma, as the article points out. There is no reason to limit search terms to text only. Some innovator will bring search into the visual arena and enable a picture to be used as the search term. Without new regulation, that means that at some indeterminate point in the future* someone could take a picture of a face with a cell phone and find out a lot about that person.
These tools are coming. They will bring huge productivity gains. They will be abused.
Those concerned about the effects of technologies like Google Goggles upon their family's privacy would be well advised to think about what information about them exists online and what they do now to manage who has access to it. Most of us have near-total control over what personal information ends up on the internet. If the only thing keeping online information about you "private" is the lack of better search engines, then it might be a good idea to reevaluate how much personal information you post/allow to be posted online.
It is possible, even likely, that the internet will become both more private and less private. More private as Google increasingly respects the interests of content owners (FT again). Less private as better search brings more of the internet to users' attention.
*The technical challenges of using a picture of a person's face as the only search term for a search of the internet for facts about that person are extremely daunting. If you take a picture of an apple, presumably the search would return lots of pictures and information relating to apples. If you take a picture of a person, presumably the search would return lots of pictures and information relating to people. Converting an object-recognition search that has yet to be deployed into a facial-recognition search is a long way off.