June 4th, 2015
Update on June 7: There’s news that the Office of Personnel Management was hacked and the unencrypted personal data of 4.1 million current and former federal employees was accessed. It has been nine years since an unencrypted laptop and hard drive containing sensitive data on 26.5 million current military personnel, veterans, and their spouses were stolen from a Department of Veterans Affairs employee’s home. That security breach led to a push for the use of encryption throughout the federal government, and I hope this breach leads to stronger data protections.
For years, security and privacy professionals have been urging companies to encrypt their data so that when there are security breaches, there is less damage to individuals whose data is accessed. Yet we continue to read reports about companies failing to use this basic tool to secure information.
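To make the point concrete, here is a toy sketch (mine, not drawn from any of the companies discussed below) of why encrypted data at rest is worth so much less to a thief: without the key, a copied disk is just random-looking bytes. The one-time-pad cipher here is illustrative only; real deployments should use full-disk encryption or a vetted cipher such as AES, never a hand-rolled scheme.

```python
# Toy illustration of encryption at rest using a one-time pad built from
# os.urandom. Illustrative only -- real systems should use full-disk
# encryption or a vetted library cipher (e.g. AES-GCM), not this.
import os

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (key, ciphertext); the key must be stored separately."""
    key = os.urandom(len(plaintext))            # random key as long as the data
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR the ciphertext with the same key to recover the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"SSN: 078-05-1120"                    # a sensitive field on a laptop
key, stored = encrypt(record)

# A thief who copies the disk sees only the ciphertext, not the record:
assert stored != record
# The rightful holder of the key recovers the data:
assert decrypt(key, stored) == record
```

A password-protected but unencrypted laptop, by contrast, stores `record` verbatim on disk; pulling the drive bypasses the password entirely.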
For example, California-based U.S. Healthworks recently revealed (pdf) that a password-protected yet unencrypted laptop was stolen from an employee’s vehicle. The health-care service provider told employees, “We determined that the laptop may have contained files that included your name, address, date of birth, job title, and Social Security number.”
Financial services company Sterne Agee and Leach was recently fined $225,000 by the Financial Industry Regulatory Authority and required to review its security protocols after a 2014 incident in which a Sterne Agee employee left an unencrypted laptop behind in a restroom. The laptop included “clients’ account numbers, Social Security numbers and other personal information,” according to a news report. Read more »
May 20th, 2015
We’ve talked before about the various ways in which businesses have been tracking their employees. For a while, there was increasing focus on the practice by some employers of requiring job applicants or employees to hand over their passwords, or otherwise allow access to their private accounts on social-networking sites, so that the employer could gather personal data even when the profiles are closed to the public. States including California, Delaware, Illinois and Maryland passed laws to protect employees from such prying by employers; Maryland’s law includes exemptions for employers for some investigations into possible wrongdoing by employees.
Employers are also using key-logging technology to monitor workers’ keystrokes and Internet-tracking software to log the sites that employees visit. And businesses have also been tracking the movements of their workers. Read more »
April 22nd, 2015
I’ve written before about the increasing use of “digital signage.” What is “digital signage”? Most people have heard the term in connection with billboards or other screens that use cameras (and facial-recognition technology) to watch people watching ads in order to target advertising toward individuals. The data-gathering and surveillance practices raise substantial privacy questions.
The Los Angeles Times reported on the expansion of these digital billboards and their use of facial-recognition biometric technology in casinos, Chicago-area bars and more. USA Today and the New York Times have detailed safety problems that can arise from these digital billboards. BBC News has reported on the use of digital billboards in the United Kingdom. The Wall Street Journal has reported on digital signage use in Japan.
Now, Wired reports on the more widespread use of software from the artificial-intelligence startup Affectiva that “will read your emotional reactions” in real time. “Already, CBS has used it to determine how new shows might go down with viewers. And during the 2012 Presidential election, [Affectiva's chief science officer Rana el Kaliouby’s] team experimented with using it to track a sample of voters during a debate.” Read more »
March 26th, 2015
As the use of license-plate-recognition camera technology to gather and record drivers’ movements started becoming widespread in the United States, people asked a number of questions about the privacy, civil liberty and security implications of the surveillance technology. Last year, the Center for Investigative Reporting looked into privacy questions concerning the use of license-plate readers and found that “a leading maker of license-plate readers wants to merge the vehicle identification technology with other sources of identifying information.” A couple of years ago, the American Civil Liberties Union released a report (pdf) on license-plate readers and how they are used as surveillance devices.
Law enforcement is concerned about how such technology affects privacy rights, as well. In 2009, the International Association of Chiefs of Police issued a report on license-plate-recognition technology and said, “Recording driving habits could implicate First Amendment concerns. [...] Mobile LPR units could read and collect the license plate numbers of vehicles parked at addiction counseling meetings, doctors’ offices, health clinics, or even staging areas for political protests.” The privacy and civil liberty questions have led to the cancellation of some license-plate-recognition surveillance programs, including ones in Boston and by the Department of Homeland Security.
One of the biggest questions is: What happens to all the data on innocent individuals? Often, we don’t know what the restrictions are on the collection and use of the data. We have learned some information about what some groups do with the data. Last year, the Washington Post reported that commercial databases gather such location data to sell. In 2013, the ACLU review of license-plate-reader camera technology found that “the approach in Pittsburg, Calif., is typical: a police policy document there says that license plate readers can be used for ‘any routine patrol operation or criminal investigation,’ adding, ‘reasonable suspicion or probable cause is not required.’ [...] As New York’s Scarsdale Police Department put it in one document, the use of license plate readers ‘is only limited by the officer’s imagination.’” In 2011, the Washington Post reported that Virginia used the license-plate scanning technology for tax collection.
Now, as a result of a public records request, Ars Technica has received the entire license-plate-reader dataset of the Oakland Police Department, “including more than 4.6 million reads of over 1.1 million unique plates between December 23, 2010 and May 31, 2014.” And it’s interesting to see what personal information can be gleaned from the surveillance data.
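As a toy sketch of the kind of inference the Ars Technica piece describes: with nothing but timestamped reads for a single plate, the camera location where a car most often sits overnight is a strong guess at the driver’s home. The reads below are invented for illustration; they are not from the Oakland dataset.

```python
# Sketch of what can be gleaned from a license-plate-reader dataset:
# the block where one plate is most often read overnight likely reveals
# where its driver lives. All reads below are invented.
from collections import Counter
from datetime import datetime

reads = [  # (timestamp, camera location) for a single plate
    ("2014-03-01 02:14", "Elm St & 5th Ave"),
    ("2014-03-02 23:40", "Elm St & 5th Ave"),
    ("2014-03-03 08:05", "Downtown garage"),
    ("2014-03-05 01:52", "Elm St & 5th Ave"),
    ("2014-03-06 12:30", "Health clinic lot"),
]

def likely_home(reads):
    """Most frequent camera location seen between 9 p.m. and 6 a.m."""
    overnight = Counter()
    for ts, loc in reads:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
        if hour >= 21 or hour < 6:
            overnight[loc] += 1
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home(reads))  # -> Elm St & 5th Ave
```

Note that the daytime reads are revealing too: a single “Health clinic lot” read is exactly the kind of sensitive visit the IACP report warned about.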
Read more »
March 2nd, 2015
The Obama White House recently released its draft Consumer Privacy Bill of Rights Act (pdf) and a fact sheet. Parts of the draft legislation date to a 2012 white paper (pdf) that laid out a plan to better protect consumer privacy. And last year, the big data group that the White House convened also issued recommendations on privacy (pdf).
The White House has taken important steps in highlighting that individuals need strong privacy protections for their data and in creating the draft legislation. And it is important that the draft legislation attempts to implement the Fair Information Practices: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. For example, the draft legislation gives several options for responding to companies that would violate the bill’s provisions, including allowing individuals and state attorneys general to file lawsuits.
But there are several significant problems with the proposal that need to be addressed before it can move forward. (The draft does not yet have a legislative sponsor, which it would need in order to be introduced and debated in Congress.)
One problem with the legislation: It would preempt state laws.
SEC. 401. Preemption.
(a) In General.—This Act preempts any provision of a statute, regulation, or rule of a State or local government, with respect to those entities covered pursuant to this Act, to the extent that the provision imposes requirements on covered entities with respect to personal data processing.
Read more »
February 6th, 2015
In a recent article for Science, researchers Yves-Alexandre de Montjoye, Laura Radaelli, Vivek Kumar Singh, and Alex “Sandy” Pentland showed that the “anonymization” of personal data is not a guarantee of privacy for individuals. Before we discuss their study, let’s note that for almost two decades, researchers have been telling us that anonymization, or “de-identification,” of private information has significant problems, and that individuals can be re-identified and have their privacy breached.
Latanya Sweeney has been researching the issue of de-anonymization or re-identification of data for years. (She has taught at Harvard and Carnegie Mellon and has been the chief technologist for the Federal Trade Commission.) In 1998, she explained how a former governor of Massachusetts had his full medical record re-identified by cross-referencing Census information with de-identified health data. Sweeney also found that, with birth date alone, 12 percent of a population of voters can be re-identified. With birth date and gender, that number increases to 29 percent, and with birth date and Zip code it increases to 69 percent. In 2000, using 1990 Census data, Sweeney found that 87 percent of the U.S. population could be identified with birth date, gender and Zip code.
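Sweeney’s uniqueness numbers are easy to reproduce in miniature. The six-person roster below is invented, but the counting is the same idea: the more “harmless” fields you combine, the more people become unique in the dataset.

```python
# Sketch of quasi-identifier uniqueness: what share of a (made-up) roster
# is uniquely pinned down by combinations of innocuous-looking fields?
# Sweeney's real studies used Census and voter data.
from collections import Counter

people = [  # (birth date, gender, ZIP)
    ("1960-07-31", "M", "02138"),
    ("1960-07-31", "F", "02138"),
    ("1971-02-14", "F", "02139"),
    ("1971-02-14", "F", "02140"),
    ("1985-11-02", "M", "02139"),
    ("1985-11-02", "M", "02139"),
]

def pct_unique(rows, fields):
    """Percentage of rows whose chosen fields match no other row."""
    counts = Counter(tuple(r[i] for i in fields) for r in rows)
    unique = sum(1 for r in rows if counts[tuple(r[i] for i in fields)] == 1)
    return 100 * unique / len(rows)

print(pct_unique(people, (0,)))       # birth date alone: nobody is unique here
print(pct_unique(people, (0, 1, 2)))  # birth date + gender + ZIP: most are
```

In this toy roster, no one is unique on birth date alone, but four of the six become unique once gender and ZIP are added, which is the same qualitative jump Sweeney measured at national scale.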
In 2008, University of Texas researchers Arvind Narayanan and Vitaly Shmatikov were able to re-identify (pdf) individuals from a dataset that Netflix had released, data that the video-rental and -streaming service had said was anonymized. The researchers said, “Using the Internet Movie Database as the source of background knowledge, we successfully identified the Netflix records of known users, uncovering their apparent political preferences and other potentially sensitive information.” Read more »
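The Netflix attack boils down to record linkage: match an “anonymized” ratings record against public profiles by overlapping ratings. A toy sketch, with all names, titles and ratings invented (the real attack also used rating dates and tolerated noisy, partial matches):

```python
# Sketch of a linkage attack in the spirit of Narayanan-Shmatikov:
# link an "anonymized" record to a named public profile via the
# (title, rating) pairs they share. All data below is invented.

anonymized = {"user_8472": {("Brazil", 5), ("Gattaca", 4), ("Clue", 3)}}

public_imdb = {  # public reviews posted under real names
    "alice": {("Brazil", 5), ("Gattaca", 4), ("Heat", 2)},
    "bob":   {("Clue", 3), ("Speed", 4)},
}

def best_match(anon_ratings, profiles):
    """Name of the public profile sharing the most (title, rating) pairs."""
    return max(profiles, key=lambda name: len(profiles[name] & anon_ratings))

print(best_match(anonymized["user_8472"], public_imdb))  # -> alice
```

Once the link is made, everything in the “anonymized” record (here, the ratings the public profile did not show) attaches to the real name, which is how the researchers uncovered apparent political preferences.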