March 16th, 2017
In 2008, President George W. Bush signed the Genetic Information Nondiscrimination Act (Pub. L. 110-233). GINA restricts the collection and use of genetic information in a number of ways: it prohibits health insurance providers and employers from requiring genetic testing, and under the federal law, genetic data cannot be used to determine insurance premiums, eligibility for insurance, or employment.
States have also passed laws to protect individuals’ genetic privacy. Shortly after the passage of GINA, Illinois passed what would become Public Act 095-0927 (pdf), “An Act concerning health,” which strengthened privacy protections already in place under the Illinois Genetic Information Privacy Act of 1998. And in 2011, California Gov. Jerry Brown (D) signed SB 559, the California Genetic Information Nondiscrimination Act (CalGINA) (pdf). Going beyond the federal GINA, CalGINA also prohibits genetic discrimination in housing, mortgage lending, employment, health insurance coverage, life insurance coverage, education, public accommodations, and elections.
These laws are meant to protect employees’ privacy from employer access and to shield them from discrimination based on their genetic data, but the federal GINA could be undermined if a bill being considered in Congress becomes law. Read more »
February 27th, 2017
Good security is difficult. There are insider and outsider threats to prepare for, and the best defense includes continuous upgrades of security systems. A recent federal indictment concerning an alleged 18-year drug-smuggling operation among airport and Transportation Security Administration employees shows the value of strong security protocols that are changed and upgraded often enough that they cannot be easily circumvented by knowledgeable insiders.
The use of airport and airline employees to smuggle drugs and other illicit contraband is not new. For example, a decade ago there was a scandal at an airport in Florida because airline baggage handlers were able to smuggle guns and drugs onto a plane. According to court documents, in 2007, two Comair baggage handlers carried a duffel bag containing 14 guns and 8 pounds of marijuana onto a commercial plane in Orlando that was headed for San Juan, Puerto Rico. The men avoided detection because, as airline baggage handlers, they used their uniforms and legally issued identification cards to bypass security screeners and enter a restricted area before loading the contraband onto the plane. The men, who had passed federal background checks, exploited their knowledge of airport security protocols. The security protocols failed, and the men were caught only because a source called in a tip to the police.
Earlier that year, CBS News had revealed that “unlike passengers, pilots and flight attendants, some 700,000 airport workers with ID badges are allowed to completely bypass airport screening areas at virtually all our nation’s 452 commercial airports.” Shortly after the Comair arrests, airports in Florida strengthened security protocols for employees, and the Transportation Security Administration also heightened screening requirements. Read more »
January 24th, 2017
International Data Privacy Day is Saturday. There are a variety of events occurring this week to celebrate, including a live-streamed event from Twitter in San Francisco on Friday. Visit the official site to find events in your area. Take the time to think about how privacy matters in your life and how you can protect your rights from being infringed upon. Please also consider donating to one of the many organizations working to protect your privacy rights.
January 17th, 2017
People use a lot of services and devices to transmit and retain sensitive personal information. In a single day, a person might use a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smart watch, and an artificial intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted on these services and devices could include sensitive medical or other information, personal photos, financial data, and more.
There’s also the issue of the collection of information that could lead to other data being learned. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: She could be pregnant. Did you know that heart-rate changes could signal a pregnancy?
Currently, there’s ongoing controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served a warrant to Amazon, as they seek information recorded by a suspect’s Echo. Amazon has refused to comply with the warrant. Read more »
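The Post’s description — streaming begins at the wake word but includes a fraction of a second of audio from before it — implies the device continuously buffers a short, rolling window of audio even while idle. A minimal sketch of such a pre-roll buffer (frame size and buffer length are made-up values for illustration, not Amazon’s actual implementation):

```python
from collections import deque

# Hypothetical pre-roll buffer: keep only the most recent fraction of a
# second of audio frames; on wake-word detection, those buffered frames
# are sent ahead of the live stream. Older frames fall off automatically.
PREROLL_FRAMES = 25  # e.g. ~0.5 s of audio at 20 ms per frame (assumed)

class PrerollBuffer:
    def __init__(self, frames=PREROLL_FRAMES):
        # deque with maxlen discards the oldest frame once full
        self.buf = deque(maxlen=frames)

    def feed(self, frame):
        """Called for every captured audio frame, wake word or not."""
        self.buf.append(frame)

    def flush(self):
        """On wake-word detection: return the buffered pre-roll, then clear."""
        frames = list(self.buf)
        self.buf.clear()
        return frames

# With a 3-frame buffer, only the last 3 frames before detection survive.
buf = PrerollBuffer(frames=3)
for f in ["f1", "f2", "f3", "f4", "f5"]:
    buf.feed(f)
print(buf.flush())  # ['f3', 'f4', 'f5']
```

The privacy implication is the point: the microphone is always feeding the buffer, so a short slice of pre-wake-word audio exists on the device at all times, even if most of it is continuously discarded.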
November 23rd, 2016
The idea of secret surveillance from a distance isn’t new. For centuries, there have been undercover agents. Subsequently, there came hidden cameras and microphones. But there were limitations to this secret surveillance — such as the physical constraints of a human or camera located far from the person being watched. As surveillance technology has become more sophisticated, however, it is becoming easier to identify, watch, listen to, and judge people from a distance.
The judgment portion is, in part, based on biometric facial-recognition technology that incorporates expression recognition. For the unseen eyes, it’s no longer just about identifying a person, but also about watching their emotional responses. This type of facial-recognition tech gained attention a few years ago when Microsoft filed a patent for technology that would track individuals’ emotions and target advertising and marketing based on a person’s mood.
“Degrees of emotion can vary — a user can be ‘very angry’ or ‘slightly angry’ — as well as the duration of the mood. Advertisers can target people ‘happy for one hour’ or ‘happy for 24 hours,’” the Toronto Star reported in 2012. Four years later, the mood-identification technology can be bought off the shelf, as NBC News explains in a story about “a new immersive experience for moviegoers.” Read more »
October 17th, 2016
For years, companies and institutions have been using “anonymization” or “deidentification” techniques and processes to release data concerning individuals, saying that the techniques will protect personal privacy and prevent the sensitive information from being linked back to an individual. Yet we have seen time and again that these processes haven’t worked.
For almost two decades, researchers have told us that anonymization of private information has significant problems, and individuals can be re-identified and have their privacy breached. (I wrote a blog post last year detailing some of the research concerning re-identification of anonymized data sets.)
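The basic mechanism behind many of these re-identification results is a linkage attack: a “de-identified” dataset keeps quasi-identifiers (like ZIP code, birth date, and sex) that can be joined against a public dataset that includes names. A toy sketch with entirely fabricated records:

```python
# Sketch of a linkage (re-identification) attack. All records below are
# fabricated for illustration; the point is that quasi-identifiers shared
# between a "de-identified" dataset and a public one can re-link names
# to supposedly anonymous records.

# "De-identified" medical records: names removed, quasi-identifiers kept.
medical = [
    {"zip": "60610", "birth": "1975-03-02", "sex": "F", "diagnosis": "asthma"},
    {"zip": "60611", "birth": "1982-11-19", "sex": "M", "diagnosis": "diabetes"},
]

# Public records (e.g. a voter roll) containing the same quasi-identifiers.
voters = [
    {"name": "J. Doe", "zip": "60610", "birth": "1975-03-02", "sex": "F"},
    {"name": "R. Roe", "zip": "60611", "birth": "1982-11-19", "sex": "M"},
]

def reidentify(medical, voters):
    """Join the two datasets on the quasi-identifier triple."""
    index = {(v["zip"], v["birth"], v["sex"]): v["name"] for v in voters}
    matches = []
    for rec in medical:
        key = (rec["zip"], rec["birth"], rec["sex"])
        if key in index:  # a unique match re-identifies the record
            matches.append((index[key], rec["diagnosis"]))
    return matches

print(reidentify(medical, voters))
# [('J. Doe', 'asthma'), ('R. Roe', 'diabetes')]
```

Real attacks work the same way at scale: the more fine-grained the quasi-identifiers left in a released dataset, the more likely each combination is unique, and the easier the join.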
Recently, Australian Attorney General George Brandis announced that he would seek to amend the country’s Privacy Act to “create a new criminal offence of re-identifying de-identified government data. It will also be an offence to counsel, procure, facilitate, or encourage anyone to do this, and to publish or communicate any re-identified dataset.”
According to the Guardian, the “impetus” for this announcement was a recent privacy problem with deidentified Medicare data, a problem uncovered by researchers. “A copy of an article published by the researchers outlines how every single Medicare data code was able to be reidentified by linking the dataset with other available information,” the Guardian reported. Read more »