
    Update on the Use of Facial-Recognition Technology on Consumers

I’ve discussed before the increasing use of facial-recognition technology, especially in advertising via “digital signage.” Most people associate the term with billboards or other screens equipped with cameras (and facial-recognition software) that watch people watching ads in order to improve marketing. The digital signs log data such as gender, approximate age and how long someone looks at an advertisement. As identification technology becomes cheaper and more prevalent, it could easily unmask people and track their movements. Those who were previously part of the anonymous crowd could be singled out for identification through these digital advertisements. (See this previous post for a discussion of the First Amendment right to free speech and how widespread identification technologies can affect it. Also, I worked with the Center for Democracy and Technology on a set of privacy guidelines for the digital signage industry, “Building the Digital Out-Of-Home Privacy Infrastructure” (pdf).)

    In July, the Wall Street Journal looked at the use of facial-recognition systems to estimate individuals’ age and gender as they shop. The Washington Times reported on the use of billboards with facial-recognition technology to identify individuals for a variety of purposes. Last year, BusinessWeek discussed stores using camera-security systems to track consumers’ movements for marketing purposes.

    But the use of facial-recognition technology is expanding to mannequins inside stores, as well. Last year, the New York Times reported on stores hiding tiny cameras inside mannequins to secretly surveil shoppers and gather biometric data on them. Now, Bloomberg News takes an in-depth look at the use of secret cameras in store mannequins to identify the age, gender and race of shoppers — and the privacy questions this identification can raise:

    Fashion brands are deploying mannequins equipped with technology used to identify criminals at airports to watch over shoppers in their stores. Retailers are introducing the EyeSee, sold by Italian mannequin maker Almax SpA, to glean data on customers much as online merchants are able to do.

    Five companies are using a total of “a few dozen” of the mannequins with orders for at least that many more, Almax Chief Executive Officer Max Catanese said. The 4,000-euro ($5,130) device has spurred shops to adjust window displays, store layouts and promotions to keep consumers walking in the door and spending. […]
    The EyeSee looks ordinary enough on the outside, with its slender polystyrene frame, blank face and improbable pose. Inside, it’s no dummy. A camera embedded in one eye feeds data into facial-recognition software like that used by police. It logs the age, gender, and race of passers-by. […]
    The mannequin, which went on sale last December and is now being used in three European countries and the U.S., has led one outlet to adjust its window displays after revealing that men who shopped in the first two days of a sale spent more than women, according to Almax. […]
    Nordstrom, a U.S. chain of more than 100 department stores, says facial-recognition software may go a step too far.

    “It’s a changing landscape but we’re always going to be sensitive about respecting the customer’s boundaries,” said spokesman Colin Johnson. […]

    Others say profiling customers raises legal and ethical issues. U.S. and European Union regulations permit the use of cameras for security purposes, though retailers need to put up signs in their stores warning customers they may be filmed. Watching people solely for commercial gain may break the rules and could be viewed as gathering personal data without consent, says Christopher Mesnooh, a partner at law firm Field Fisher Waterhouse in Paris.
