New Scientist has a story about in-flight video surveillance. The goal? Terrorism detection, of course:
The European Union’s Security of Aircraft in the Future European Environment (SAFEE) project uses a camera in every passenger’s seat, with six wide-angle cameras to survey the aisles. Software then analyses the footage to detect developing terrorist activity or ‘air-rage’ incidents, by tracking passengers’ facial expressions.
“It looks for running in the cabin, standing near the cockpit for long periods of time, and other predetermined indicators that suggest a developing threat,” says James Ferryman of the University of Reading, UK, one of the system’s developers.
Other behaviours could include a person nervously touching their face, or sweating excessively. No single behaviour will trigger the system to alert the crew; only certain combinations of them will.
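The combination rule described above can be sketched as a toy decision function. This is only an illustration of the reported logic, not SAFEE’s actual system; the indicator names and the threshold of two are assumptions for the example.

```python
# Toy sketch of the reported alerting logic: individual behavioural
# indicators are observed per passenger, and the crew is alerted only
# when several co-occur. Indicator names and threshold are hypothetical.

SUSPICIOUS_INDICATORS = {
    "running_in_cabin",
    "loitering_near_cockpit",
    "touching_face_nervously",
    "sweating_excessively",
}

ALERT_THRESHOLD = 2  # assumed: one indicator alone never triggers an alert


def should_alert(observed: set) -> bool:
    """Return True only when multiple known indicators co-occur."""
    matched = observed & SUSPICIOUS_INDICATORS
    return len(matched) >= ALERT_THRESHOLD


print(should_alert({"touching_face_nervously"}))  # False
print(should_alert({"touching_face_nervously", "sweating_excessively"}))  # True
```

Even in this toy form, the design problem is visible: the false-positive rate depends entirely on how often innocent passengers happen to exhibit two or more of the listed behaviours at once.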
Camera surveillance systems enhanced with facial-recognition technology have been trumpeted many times as a way to “detect intent” of individuals. But they haven’t worked. In December, I wrote an op-ed (pdf) in The Tennessean about face recognition systems being used in schools. I explained that the algorithms to automatically detect and identify individuals were complex and still error-prone. I wrote:
Face-recognition systems have failed numerous real-world tests at airports in Dallas-Fort Worth; Fresno, Calif.; and Palm Beach, Fla. One glaring example of the technology’s weakness occurred when two people swapped passports at an Australian airport as a joke, and facial recognition systems didn’t catch their deception. The city of Tampa, Fla., stopped using its face-recognition system because of its failures. “It’s just proven not to have any benefit to us,” a police spokesman told The Tampa Tribune.
From October 2006 to January 2007, Germany conducted a large-scale, scientific study (pdf) on how well face-recognition technology actually works in picking a face out of a crowd. The four-month experiment was conducted at a German train station that has about 23,000 daily passengers. During the day, the study found the error rate was about 40 percent. At night, the error rate was 80 to 90 percent.
New Scientist reports, “Ferryman admits that his system will require thousands of tests on everyday passengers before it can be declared reliable at detecting threats.”
It’s clear that face-recognition technology has a long way to go before it will be useful in detecting individual faces in crowds. As for automatically detecting behavioral patterns that are “suspicious,” I’m skeptical that such technology will be reliable in the near future.