
    Archive for the ‘Medical data’ Category

    One Insurance Company Is Betting Big on Customers Giving Up Personal Health-Tracking Data

    Tuesday, September 25th, 2018

    As people increasingly use personal fitness devices, such as Fitbits, or health-tracking apps, such as Strava, concern has grown about individual medical privacy as the data is gathered and used, sometimes for purposes of which runners or cyclists were unaware. People have questioned where this data collection could lead.

    Recently, U.S. life insurance giant John Hancock announced one path for fitness tracking: to cut life insurance rates. Beginning next year, John Hancock, in partnership with Vitality Group, “will stop underwriting traditional life insurance and instead sell only interactive policies that track fitness and health data through wearable devices and smartphones,” Reuters reported. “Policyholders score premium discounts for hitting exercise targets tracked on wearable devices such as a Fitbit or Apple Watch and get gift cards for retail stores and other perks by logging their workouts and healthy food purchases in an app.”

    Currently, John Hancock’s program is voluntary, and numerous other life insurance companies still offer traditional policies, which do not involve constantly tracking individuals’ health and fitness information through wearable devices. But how long before that changes, and more and more people feel pressured to give up such personal, daily data in order to have policies that protect their families?

    After Death, Who Can Access Your Fingerprints for Security Issues?

    Thursday, April 26th, 2018

    Two Florida detectives tried to use a dead man’s fingerprints to unlock his phone, the Tampa Bay Times reported, and that act raised privacy questions.

    Linus F. Phillip “was shot and killed [by a Largo, Fla., police officer] March 23 at a Wawa gas station after police said he tried to drive away when an officer was about to search him,” the Times reported. Later, two detectives came to the Sylvan Abbey Funeral Home in Clearwater with Phillip’s phone, according to Phillip’s fiancée, Victoria Armstrong. “They were taken to Phillip’s corpse. Then, they tried to unlock the phone by holding the body’s hands up to the phone’s fingerprint sensor,” the Times reported.

    Phillip’s fiancée is upset: she was not notified that the detectives would be coming to the funeral home, and the police did not get a warrant for their actions.

    Although the detectives’ actions have been criticized as unethical, they are legal because dead people have fewer rights than the living, especially concerning privacy and search and seizure. The courts have split on whether living defendants can be forced to use biometrics such as fingerprints or facial scans to unlock their mobile devices. (Another difference from the Phillip case is that those court cases involved warrants.)

    Fitness Apps Can Be Fun, But Who Else Is Seeing Your Personal Data?

    Wednesday, March 28th, 2018

    Recently, an Australian student publicized that Strava, a fitness app, had published online a Global Heat Map that “uses satellite information to map the locations and movements of subscribers to the company’s fitness service over a two-year period, by illuminating areas of activity,” according to the Washington Post. Strava “allows millions of users to time and map their workouts and to post them online for friends to see, and it can track their movements at other times,” the New York Times reports.

    The data, culled from Strava’s 27 million users (who own Fitbits and other wearable fitness devices), is not updated in real time. Yet the map still raised privacy and security questions for Strava’s users.

    A similar case in 2011 concerning wearable device Fitbit also raised privacy questions about searchable fitness data. There was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. And in 2014, Jawbone faced criticism after it published data about how many people wearing its fitness tracker woke up during an earthquake in Northern California. People questioned whether Jawbone’s privacy and data-sharing policies had disclosed such use of their health data.

    Fitness devices, including smartwatches, and mobile health or wellness apps are used by tens of millions of people worldwide. There are many such apps available in Apple’s and Google’s app stores. The data gathered can reveal much personal information about individuals. In the case of Strava, you could track patterns of activity over the two years’ worth of data.

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease-screening, which includes about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The study’s participants presumably enrolled with full knowledge of the answers to these questions. But consider if screening for 1,800 conditions, rather than the current 30, became the legal standard: that is a significant amount of highly personal information, with substantial privacy implications.

    BabySeq co-director Dr. Robert Green has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict the information’s accurate.”

    You Could Be Penalized for Refusing to Give Genetic Data to Your Employer

    Thursday, March 16th, 2017

    In 2008, President George W. Bush signed the Genetic Information Nondiscrimination Act (Pub. L. 110-233). GINA restricts the collection and use of genetic information in a number of ways: it prohibits health insurance providers and employers from requiring genetic testing, and under the federal law, genetic data cannot be used to determine insurance premiums, eligibility for insurance, or employment.

    States have also passed laws to protect individuals’ genetic privacy. Shortly after the passage of GINA, Illinois passed what would become Public Act 095-0927 (pdf), “An Act concerning health,” which strengthened privacy protections already in place under the Illinois Genetic Information Privacy Act of 1998. And in 2011, California Gov. Jerry Brown (D) signed SB 559, the California Genetic Information Nondiscrimination Act (CalGINA) (pdf). Going beyond the federal GINA, CalGINA also prohibits genetic discrimination in housing, mortgage lending, employment, health insurance coverage, life insurance coverage, education, public accommodations, and elections.

    These laws are meant to protect employees’ privacy from employer access and to shield them from discrimination based on their genetic data, but the federal GINA could be undermined if a bill being considered in Congress becomes law.

    New Year? Time for a New Assessment of Your Privacy Setup.

    Tuesday, January 17th, 2017

    People use a lot of services and devices to transmit and retain sensitive personal information. In a single day, a person might use: a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smartwatch, and an artificial-intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted on these services and devices could include sensitive medical or other information, personal photos, financial data, and more.

    There’s also the issue of the collection of information that could lead to other data being learned. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: She could be pregnant. Did you know that heart-rate changes could signal a pregnancy?

    Currently, there’s ongoing controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served a warrant to Amazon, as they seek information recorded by a suspect’s Echo. Amazon has refused to comply with the warrant.