What is “digital signage”? Most people have heard the term in connection with billboards or other screens equipped with cameras (and facial-recognition technology) that watch people watching ads in order to improve marketing. The digital signs log data such as gender, approximate age, and how long someone looks at an advertisement. This is supposed to help build a better billboard, one tailored specifically to the individual standing in front of it. However, the data-gathering and surveillance practices raise substantial privacy questions.
One example of digital signage advertising is the Castrol project in the UK. The oil company bought the vehicle registration data of millions of motorists, then set up giant digital billboards that scanned passing motorists’ license plates, ran each plate through a database, and instantly displayed the best oil for that specific driver’s car. The scanning technology effectively let each car be read as if it were tagged with data, and the billboard’s computing systems used that data to target advertising to the individual driver. There was a public uproar, and the company quickly ended the project.
The advertising industry is aware of the significant privacy questions raised by the use of digital signage. POPAI (a digital signage industry association) released “Best Practices: Recommended Code of Conduct for Consumer Tracking Research,” but these best practices are not enough. In response, privacy and consumer groups have released two new privacy-protection frameworks for the digital signage industry.
First, the Center for Democracy and Technology has released a set of privacy guidelines, which I consulted on and contributed to, in the report “Building the Digital Out-Of-Home Privacy Infrastructure.” The guidelines reject the traditional delineation of “personally identifiable information” and “non-personally identifiable information.” “The distinction between PII and non-PII is becoming much less meaningful in light of data analytic capabilities. Researchers have demonstrated that individuals can still be identified from records stripped of traditional identifiers […] Therefore, the best approach for companies is to evaluate all the data they collect on a spectrum ranging from directly identifiable to ‘pseudonymous’ to aggregated, providing different levels of privacy protection corresponding to the sensitivity of the information involved.”
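To make the spectrum concrete, here is a minimal illustrative sketch in Python of how a signage operator might classify collected data along the identifiability spectrum the CDT guidelines describe and map each level to stronger or weaker safeguards. The levels come from the quoted passage; the specific protections listed are hypothetical examples, not requirements from the report.

```python
from enum import Enum

class Identifiability(Enum):
    """Spectrum from the CDT guidelines: directly identifiable -> pseudonymous -> aggregated."""
    DIRECTLY_IDENTIFIABLE = 3  # e.g., a face image or a scanned license plate
    PSEUDONYMOUS = 2           # e.g., a hashed device identifier
    AGGREGATED = 1             # e.g., hourly counts of viewers by age band

# Hypothetical mapping: more identifiable data gets stronger protections.
PROTECTIONS = {
    Identifiability.DIRECTLY_IDENTIFIABLE: ["opt-in consent", "access and correction", "encryption at rest"],
    Identifiability.PSEUDONYMOUS: ["opt-out choice", "limited retention"],
    Identifiability.AGGREGATED: ["standard security controls"],
}

def required_protections(level: Identifiability) -> list[str]:
    """Return the safeguards owed to data at a given identifiability level."""
    return PROTECTIONS[level]
```

The point of structuring it this way is the one the guidelines make: rather than a binary PII/non-PII switch, protections scale with where the data sits on the spectrum.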
The privacy protection framework builds on the Fair Information Practices. For example:
2) Individual Participation
The FIPs principle of “individual participation” embodies two concepts: the right to consent to the collection and use of data and the right to access data that has been collected about oneself. The robustness of the individual participation protocol required varies depending on the sensitivity and identifiability of the information collected and the use to which it is put. […]
Consumers should be able to exercise control over what information is collected, which marketing messages they receive, and which other companies and parties may see the data. The consent should be persistently honored until the consumer alters his or her choice, and it should also be revocable at any time. To the extent possible, opt-in consent protocols should be granular without being confusing to consumers. One way to strike this balance is to offer various privacy control options, but also an easy means to opt in to or out of all the choices at once.
Consumers should have the ability to view and correct any directly identifiable data collected about them for digital out-of-home (DOOH) marketing. Consumer confidence in an organization may be vastly improved if individuals have access to their own data, whereas consumers will perceive surveillance and data analysis behind closed doors as considerably more intrusive.
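The consent design the excerpt describes (granular choices, persistent until changed, revocable at any time, with an easy opt-in-all/opt-out-all) can be sketched as a simple data structure. This is a hypothetical illustration of the principle, not code from the CDT report; the particular choice categories are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ConsentPreferences:
    """Hypothetical granular consent record for a DOOH marketing program.

    Defaults are opt-out: no data use until the consumer affirmatively consents.
    """
    collect_demographics: bool = False       # e.g., camera-derived age/gender estimates
    receive_targeted_messages: bool = False  # tailored ads on the sign
    share_with_partners: bool = False        # disclosure to other companies

    def opt_in_all(self) -> None:
        """Single action to consent to every choice at once."""
        self.collect_demographics = True
        self.receive_targeted_messages = True
        self.share_with_partners = True

    def opt_out_all(self) -> None:
        """Single action to revoke all consent; revocable at any time."""
        self.collect_demographics = False
        self.receive_targeted_messages = False
        self.share_with_partners = False
```

The granular fields let a consumer consent to, say, targeted messages but not partner sharing, while the two all-at-once methods keep the interface simple, which is exactly the balance the guidelines recommend.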
For more on the CDT report, head to the group’s blog, where Staff Counsel Harley Geiger details the analysis.
The second privacy-protection framework from privacy and consumer advocacy groups was released last week at the Digital Signage Expo in Las Vegas, Nevada. World Privacy Forum Executive Director Pam Dixon spoke about the principles to a large group of digital signage industry professionals. Privacy Lives joins several consumer and privacy groups in noting that these privacy principles for digital signage (pdf) are necessary because, “currently there is little if any disclosure to consumers that information about behavioral and personal characteristics is being collected and analyzed.”
The groups proposing the framework are: World Privacy Forum; Center for Digital Democracy; Consumer Action; Consumer Federation of America; Patient Privacy Rights; PrivacyActivism; Privacy Lives; and Privacy Rights Clearinghouse.
For more on digital signage:
- The World Privacy Forum issued a report last month on digital signage and the privacy questions it raises.
- Here’s a previous post raising privacy questions connected with digital signage.