    Archive for the ‘Medical data’ Category

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease-screening, which includes about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The participants in the study have presumably chosen to take part with full knowledge of the answers to these questions. But consider what would happen if screening for 1,800 conditions, rather than the current 30, became the legal standard. That is a significant amount of highly personal information, and it raises substantial privacy issues.

    BabySeq co-director Dr. Robert Green has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict the information’s accurate.” Read more »

    You Could Be Penalized for Refusing to Give Genetic Data to Your Employer

    Thursday, March 16th, 2017

    In 2008, President George W. Bush signed the Genetic Information Nondiscrimination Act (GINA, Pub. L. 110-233). GINA restricts the collection and use of genetic information in a number of ways. It prohibits health insurance providers and employers from requiring genetic testing, and under the federal law, genetic data cannot be used to determine insurance premiums, eligibility for insurance, or employment.

    States have also passed laws to protect individuals’ genetic privacy. Shortly after the passage of GINA, Illinois passed what would become Public Act 095-0927 (pdf), “An Act concerning health,” which strengthened privacy protections already in place under the Illinois Genetic Information Privacy Act of 1998. And in 2011, California Gov. Jerry Brown (D) signed SB 559, the California Genetic Information Nondiscrimination Act (CalGINA) (pdf). Going beyond the federal GINA, CalGINA also prohibits genetic discrimination in housing, mortgage lending, employment, health insurance coverage, life insurance coverage, education, public accommodations, and elections.

    These laws are meant to protect employees’ privacy from employer access and to shield them from discrimination based on their genetic data, but the federal GINA could be undermined if a bill being considered in Congress becomes law. Read more »

    New Year? Time for a New Assessment of Your Privacy Setup.

    Tuesday, January 17th, 2017

    People use a lot of services and devices to transmit and retain sensitive personal information. In a single day, a person could use a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smartwatch, and an artificial-intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted by these services and devices could include sensitive medical or other information, personal photos, financial data, and more.

    There’s also the issue of the collection of information that could lead to other data being learned. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: She could be pregnant. Did you know that heart-rate changes could signal a pregnancy?

    Currently, there’s ongoing controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served a warrant to Amazon, as they seek information recorded by a suspect’s Echo. Amazon has refused to comply with the warrant.  Read more »
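    To make the pre-wake-word detail concrete, here is a minimal sketch of how a voice assistant might keep a short rolling buffer so that audio captured just before the wake word can be included in what is streamed to the cloud. The chunk length, buffer size, and callback names are assumptions for illustration, not Amazon’s implementation.

```python
# Minimal sketch of a wake-word pre-roll buffer (illustrative only; not
# Amazon's implementation). The device keeps a short rolling buffer of recent
# audio so the upload can include a fraction of a second captured *before*
# the wake word was recognized.
from collections import deque

CHUNK_MS = 50        # length of each audio chunk in milliseconds (assumed)
PRE_ROLL_MS = 500    # how much pre-wake-word audio to retain (assumed)

# Rolling window of the most recent chunks; old audio falls off the end.
pre_roll = deque(maxlen=PRE_ROLL_MS // CHUNK_MS)

def on_audio_chunk(chunk, wake_word_detected, stream_to_cloud):
    """Handle one chunk of microphone audio.

    `wake_word_detected` and `stream_to_cloud` are hypothetical callables:
    the first decides whether the wake word occurred in this chunk, the
    second uploads audio for processing.
    """
    if wake_word_detected(chunk):
        # Upload the buffered pre-roll first, then the chunk containing the
        # wake word; later chunks would keep streaming until the request ends.
        for buffered in pre_roll:
            stream_to_cloud(buffered)
        stream_to_cloud(chunk)
        pre_roll.clear()
    else:
        # Otherwise the chunk lives only briefly in the local buffer and is
        # overwritten as new audio arrives.
        pre_roll.append(chunk)
```

    The point of the design is that the device must already be buffering audio locally in order to include what came just before the wake word, which is why the recordings it retains attract so much interest from investigators and privacy advocates alike.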

    As biometrics use expands, privacy questions continue to fester

    Tuesday, April 19th, 2016

    As the costs of the technologies fall, biometric identification tools — such as fingerprint, iris or voice-recognition scanners — are increasingly being used in everyday life. There are significant privacy questions that arise as biometric data is collected and used, sometimes without the knowledge or consent of the individuals being scanned.

    Biometrics use has become more commonplace. Many smartphones, including iPhones, have fingerprint “touch” ID scanners that people can use instead of numeric passcodes. And law enforcement personnel have been using fingerprint scanners for years, both domestically and internationally. In the past few years, we’ve seen banks capturing customers’ voice prints in order, the institutions say, to fight fraud, and gyms asking members to identify themselves using their fingerprints. Reuters recently reported that companies are seeking to expand fingerprint-identification systems to credit cards and railway commuters.

    And the voluntariness of a person submitting his or her biometric data has also been questioned. Do you realize when you’re calling your bank that you’re handing over your voice print? Another situation a few years ago in Washington, D.C., also raised the issue of voluntariness. The District considered requiring that all visitors to its jail “have their fingerprints scanned and checked against law enforcement databases for outstanding warrants.” So if you wanted to visit a friend or relative who was in the D.C. jail, you would have had to volunteer to submit your biometric data. The plan was dropped after strong criticism from the public and civil rights groups.

    Your biometric data can be gathered for any number of innocuous reasons. For example, I had to submit my fingerprints to obtain my law license, not because of a crime. Family members, roommates and business colleagues of crime victims have submitted fingerprints in order to rule out “innocent” fingerprints at a crime scene in a home or workplace. Some “trusted traveler” airport programs gather iris scans. Some companies use iris-recognition technology for their security systems. Read more »

    Who sees your health-app data? It’s hard to know.

    Thursday, March 24th, 2016

    Lots of people use personal health devices, such as Fitbits, or mobile health or wellness apps (there are a variety offered through Apple’s and Google’s app stores). There are important privacy and security questions about the devices and apps, because the data that they can gather can be sensitive — disease status, medication usage, glucose levels, fertility data, or location information as the devices track your every step on the way to your 10,000 steps-per-day goal. And the medical diagnoses drawn from such information can surprise people, especially the individuals using the apps and devices.

    For example, one man was concerned after reviewing his wife’s Fitbit data. He “noticed her heart rate was well above normal.” He thought the device might be malfunctioning, so he posted the data on the message-board site Reddit and asked for analyses. One person theorized that his wife might be pregnant. The couple made a doctor’s appointment and confirmed the pregnancy.

    This case illustrates the sensitive medical data gathered by personal medical devices and apps that a person might not even realize is possible. Did you know that heart-rate changes could signal a pregnancy?
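    For readers curious how a sustained change in heart rate could even be spotted in this kind of data, here is a minimal sketch that compares a recent window of resting heart-rate readings against a prior baseline. The thresholds, window sizes, and data layout are assumptions for the example; this is not Fitbit’s method, and a flag like this is a prompt to see a doctor, not a diagnosis.

```python
# Illustrative sketch: flag a sustained rise in resting heart rate from
# day-level readings. Thresholds and data layout are assumptions for the
# example, not how Fitbit (or a physician) actually interprets the data.
from statistics import mean

def sustained_elevation(daily_rhr, baseline_days=30, recent_days=7, delta_bpm=8):
    """Return True if the average resting heart rate over the last
    `recent_days` days is at least `delta_bpm` beats per minute above the
    average of the `baseline_days` days before that.

    daily_rhr: one resting-heart-rate reading per day, oldest first.
    """
    if len(daily_rhr) < baseline_days + recent_days:
        return False  # not enough history to compare
    baseline = mean(daily_rhr[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_rhr[-recent_days:])
    return recent - baseline >= delta_bpm

# Example: a baseline in the mid-60s followed by a week in the high 70s.
history = [65] * 30 + [78] * 7
print(sustained_elevation(history))  # True -- a change worth asking a doctor about
```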

    And this isn’t the first time that sensitive information of Fitbit users has been inadvertently revealed. Five years ago, there was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. Read more »

    Organizations Must Protect Against Insider Threats to Security

    Thursday, January 21st, 2016

    As personal information becomes more accessible and shareable through massive databases, questions of security arise. Agencies and companies build protections against outside threats, but insider threats pose a distinct problem: often, people are misusing or abusing legitimate access privileges to private data rather than attempting to gain access to the information illegally.

    We’ve seen the problems that arise when insiders abuse or misuse their access privileges to individuals’ data and violate those individuals’ privacy rights. Last week, the Florida Times-Union reported that the city of Jacksonville and a Highway Patrol trooper reached a settlement after the trooper sued, accusing police officers of misusing their access to a driver’s license database to gather information about her and harass her.

    A similar situation is said to have occurred in Minnesota, where 104 officers from 18 agencies in the state accessed one woman’s “driver’s license record 425 times in what could be one of the largest private data breaches by law enforcement in history.” A state report later found such misuse was common.
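    Misuse like this tends to show up in audit logs long before it makes headlines. As a rough illustration of the kind of check an agency could run, here is a sketch that scans an access log for records looked up an unusual number of times or by an unusual number of distinct users. The log format and thresholds are assumptions for the example, not any agency’s actual audit system.

```python
# Illustrative sketch of auditing database access logs for the pattern
# described above: many lookups of a single record across many users.
# Log fields and thresholds are assumed for the example.
from collections import defaultdict

def flag_overqueried_records(access_log, max_lookups=25, max_distinct_users=5):
    """access_log: iterable of (user_id, record_id) pairs from an audit trail.

    Returns record IDs whose total lookups or distinct-user counts exceed the
    thresholds and therefore deserve human review.
    """
    lookups = defaultdict(int)
    users = defaultdict(set)
    for user_id, record_id in access_log:
        lookups[record_id] += 1
        users[record_id].add(user_id)
    return [
        rec for rec in lookups
        if lookups[rec] > max_lookups or len(users[rec]) > max_distinct_users
    ]

# Example: one driver's record queried 425 times by 104 different users,
# alongside a single routine lookup of another record.
log = [(f"officer{i % 104}", "driver-123") for i in range(425)]
log.append(("officer1", "driver-456"))
print(flag_overqueried_records(log))  # ['driver-123']
```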

    Federal databases also have the problem of insiders misusing or abusing their data-access privileges. A recent ProPublica investigation found a variety of privacy violations at Department of Veterans Affairs facilities. “Some VA employees have used their access to medical records as a weapon in disputes or for personal gain, incident reports show,” such as one case where health data was improperly accessed and used in a divorce proceeding. Other individuals misused their authority to access medical information after suicides or suicide attempts by fellow employees. Read more »