In a story about the Apple iPad (an electronic touchscreen tablet that will be released next month), the Financial Times reveals something interesting about Apple and privacy: the company restricts the consumer data it shares with its partners. This has caused problems in the company’s negotiations for content with newspaper and magazine publishers, the Times reports. Content publishers and providers mine and track consumer data in order to target behavioral advertising.
Ownership of subscriber information and pricing have emerged as key issues.
Apple’s practice of sharing with its partners little consumer data beyond sales volume is a problem. “Is it a dealbreaker? It’s pretty damn close,” said one senior media executive of a US metropolitan daily newspaper.
Publishers have spent decades collecting information about subscribers that influences marketing plans and, in some cases, the content of the publication itself. Apple’s policy would separate them from their most valuable asset, publishing executives said.
Behavioral advertising is the practice of tracking a user’s online activity so that ads can be served based on that behavior. Increasingly, electronic information from consumers is collected, compiled, and sold — all without reasonable safeguards. Privacy Lives has urged federal agencies, lawmakers, and the advertising industry to strengthen protections of consumer privacy.
In January, Privacy Lives joined eight groups in submitting comments (pdf) to the Federal Communications Commission saying that “substantial threats to our privacy and related consumer protection issues” can arise from the business practices and policies of broadband, mobile and other advertising companies. The groups said that both sets of self-regulatory guidelines by the U.S. Interactive Advertising Bureau (“IAB”), the online marketing industry’s principal trade and lobbying group, [guidelines here (pdf)] and Network Advertising Initiative [guidelines here (pdf)] have narrow definitions for “sensitive information” and “personally identifiable information,” focusing on the traditional ideas of identification numbers or addresses.
But even the Federal Trade Commission has expanded its idea of “personally identifiable information.” “Indeed, in the context of online behavioral advertising, rapidly changing technologies and other factors have made the line between personally identifiable and non-personally identifiable information increasingly unclear,” FTC staff said in a report (pdf) last year. The consumer advocacy groups said, “Individuals should be protected even if the information collected about them in behavioral tracking cannot be linked to their names, addresses, or other traditional ‘personally identifiable information,’ as long as they can be distinguished as a particular computer user based on their profile.”
And last year, Privacy Lives joined nine groups in informing the public and government officials of important gaps in consumer privacy protection, including targeted behavioral advertising, and detailing recommended solutions. The groups noted that for the past four decades the foundation of U.S. privacy policies has been Fair Information Practices, especially as implemented in the OECD Guidelines on data privacy: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. Congress has reaffirmed its commitment to the Fair Information Practices numerous times. Congress used the Fair Information Practices as the basis of the Privacy Act of 1974, which restricts the amount of personal data that Federal agencies can collect and requires agencies to be transparent in their information practices. When Congress created the Department of Homeland Security’s Privacy Office several years ago, Fair Information Practices were included in the establishing legislation. The groups called on Congress to apply those principles in legislation to protect consumer information and privacy.