The idea of secret surveillance from a distance isn’t new. For centuries there have been undercover agents; later came hidden cameras and microphones. But this secret surveillance had limitations, such as the physical constraints on a human or camera positioned far from the person being watched. As surveillance technology has grown more sophisticated, however, it has become ever easier to identify, watch, listen to, and judge people from a distance.
The judgment portion rests, in part, on biometric facial-recognition technology that incorporates expression recognition. For the unseen eyes, it’s no longer just about identifying a person but also about watching that person’s emotional responses. This type of facial-recognition technology gained attention a few years ago when Microsoft filed a patent for a system that would track individuals’ emotions and target advertising and marketing based on a person’s mood.
“Degrees of emotion can vary — a user can be ‘very angry’ or ‘slightly angry’ — as well as the duration of the mood. Advertisers can target people ‘happy for one hour’ or ‘happy for 24 hours,’” the Toronto Star reported in 2012. Four years later, mood-identification technology can be bought off the shelf, as NBC News explains in a story about “a new immersive experience for moviegoers.”
“Computer vision can now recognize an object very clearly,” said Dr Hongying Meng, a professor in Brunel’s Department of Electronic and Computer Engineering who oversaw the “RIOT” project. “It can know that it’s a face. Then, there’s facial detection, knowing who it is. The next step is knowing what the feeling is, that’s facial expression.”
As always, there are attempts to protect individual privacy through technology. Recently, researchers from Carnegie Mellon University and the University of North Carolina developed “a systematic method to automatically generate” attacks on facial-recognition biometric systems, attacks that are “physically realizable and inconspicuous.”
We detail the design of eyeglass frames that, when printed and worn, permitted three subjects (specifically, the first three authors) to succeed at least 80% of the time when attempting dodging against state-of-the-art [face-recognition systems]. Other versions of eyeglass frames allowed subjects to impersonate randomly chosen targets.
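The mechanism underlying such attacks, a small targeted perturbation that pushes an input across a classifier’s decision boundary, can be sketched in a few lines of Python. The linear “matcher” below is purely hypothetical and greatly simplified; the actual research optimizes a printable texture confined to the eyeglass region of a deep face-recognition model’s input.

```python
import numpy as np

# Hypothetical linear "face matcher": a positive score means "match".
# This is a stand-in for a trained model, not the paper's system.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # matcher weights
x = 2.0 * w / np.linalg.norm(w)  # an input the matcher confidently matches

def score(v):
    return float(w @ v)

assert score(x) > 0  # the matcher recognizes x

# "Dodging": nudge x against the gradient of the score (for a linear
# model, the gradient is simply w) so the matcher no longer fires.
# The real attack restricts this perturbation to the pixels covered
# by the eyeglass frames, which is what makes it inconspicuous.
eps = 1.0
x_adv = x - eps * np.sign(w)

print(f"before: {score(x):+.2f}  after: {score(x_adv):+.2f}")
assert score(x_adv) < 0  # dodged
```

Against deep networks the same idea applies, except the gradient must be computed through the network, and the perturbation is optimized to survive printing and varying camera angles.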
This research and similar projects stand a strong chance of confounding some facial-recognition systems. But technology is constantly changing, and surveillance technology can evolve, and has evolved, to resist efforts to evade it. To protect individual privacy, we should not rely only on technological countermeasures; we should also pursue legal protections. Some states already offer them.
Texas prohibits the collection of “a biometric identifier of an individual for a commercial purpose unless the person: (1) informs the individual before capturing the biometric identifier; and (2) receives the individual’s consent to capture the biometric identifier.” The state also regulates the use, distribution, storage and destruction of such biometric data (Bus. & Com. Code Ann. § 503.001).
Illinois has its Biometric Information Privacy Act (740 ILCS 14), which includes a provision that: “No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person’s or a customer’s biometric identifier or biometric information, unless it first” gives written notice of such collection (including “specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used”) and gets written consent from the individual.
But there is no federal law that protects a person from secret facial-recognition (and other biometric) surveillance. Innocent individuals should be protected from such surveillance, human or technological, unless legal standards, such as a warrant, are met. This would let law enforcement investigate crimes and protect the public without trampling on individuals’ constitutional rights. We also need to consider protections so that marketers cannot surveil us this way.