
    EU Tests In-Flight Video Surveillance to Automatically ID Suspects

    New Scientist has a story about in-flight video surveillance. The goal? Terrorism detection, of course:

    The European Union’s Security of Aircraft in the Future European Environment (SAFEE) project uses a camera in every passenger’s seat, with six wide-angle cameras to survey the aisles. Software then analyses the footage to detect developing terrorist activity or ‘air-rage’ incidents, by tracking passengers’ facial expressions.

    […]

“It looks for running in the cabin, standing near the cockpit for long periods of time, and other predetermined indicators that suggest a developing threat,” says James Ferryman of the University of Reading, UK, one of the system’s developers.

    Other behaviours could include a person nervously touching their face, or sweating excessively. One such behaviour won’t trigger the system to alert the crew, only certain combinations of them.
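Read as a specification, the quoted behaviour-detection logic is essentially a rule-based classifier: individual cues (running in the cabin, loitering near the cockpit, face-touching, sweating) are scored, and the crew is alerted only when some combination of cues crosses a threshold. The sketch below illustrates that idea in Python; the indicator names, weights, and threshold are illustrative assumptions, not SAFEE’s actual rules or parameters.

```python
# Illustrative sketch only: indicator names, weights, and the threshold are
# assumptions made for explanation, not SAFEE's actual rules or parameters.

# Behavioural cues mentioned in the article, each given a notional weight.
INDICATOR_WEIGHTS = {
    "running_in_cabin": 3,
    "loitering_near_cockpit": 3,
    "nervously_touching_face": 1,
    "excessive_sweating": 1,
}

# Assumed threshold, chosen so that no single cue can reach it on its own.
ALERT_THRESHOLD = 4


def should_alert_crew(observed_cues):
    """Alert only when a combination of cues crosses the threshold, mirroring
    the claim that one behaviour alone never triggers an alert."""
    score = sum(INDICATOR_WEIGHTS.get(cue, 0) for cue in observed_cues)
    return score >= ALERT_THRESHOLD


# A single cue stays below the threshold...
assert not should_alert_crew({"running_in_cabin"})
# ...but certain combinations of cues raise an alert.
assert should_alert_crew({"running_in_cabin", "excessive_sweating"})
```

Even in this toy form the hard part is obvious: everything depends on how the cues, weights, and threshold are chosen, which is presumably why Ferryman says the system needs thousands of tests on everyday passengers before it can be called reliable.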

Camera surveillance systems enhanced with facial-recognition technology have been trumpeted many times as a way to “detect intent,” but they haven’t worked. In December, I wrote an op-ed (pdf) in The Tennessean about face-recognition systems being used in schools. I explained that the algorithms used to automatically detect and identify individuals are complex and still error-prone. I wrote:

Face-recognition systems have failed numerous real-world tests at airports in Dallas-Fort Worth, Fresno, and Palm Beach County, Fla. One glaring example of the technology’s weakness occurred when two people swapped passports at an Australian airport as a joke, and facial recognition systems didn’t catch their deception. The city of Tampa, Fla., stopped using its face-recognition system because of its failures. “It’s just proven not to have any benefit to us,” a police spokesman told The Tampa Tribune.

    From October 2006 to January 2007, Germany conducted a large-scale, scientific study (pdf) on how well face-recognition technology actually works in picking a face out of a crowd. The four-month experiment was conducted at a German train station that has about 23,000 daily passengers. During the day, the study found the error rate was about 40 percent. At night, the error rate was 80 to 90 percent.

    New Scientist reports, “Ferryman admits that his system will require thousands of tests on everyday passengers before it can be declared reliable at detecting threats.”

    It’s clear that face-recognition technology has a long way to go before it will be useful in detecting individual faces in crowds. As for automatically detecting behavioral patterns that are “suspicious,” I’m skeptical that such technology will be reliable in the near future.
