    Archive for the ‘Medical data’ Category

    After Death, Who Can Access Your Fingerprints for Security Issues?

    Thursday, April 26th, 2018

    Two Florida detectives tried to use a dead man’s fingerprints to unlock his phone, the Tampa Bay Times reported, and that act raised privacy questions.

    Linus F. Phillip “was shot and killed [by a Largo, Fla., police officer] March 23 at a Wawa gas station after police said he tried to drive away when an officer was about to search him,” the Times reported. Later, two detectives came to the Sylvan Abbey Funeral Home in Clearwater with Phillip’s phone, according to Phillip’s fiancee, Victoria Armstrong. “They were taken to Phillip’s corpse. Then, they tried to unlock the phone by holding the body’s hands up to the phone’s fingerprint sensor,” the Times reported.

    Phillip’s fiancee is upset. She was not notified that the detectives would be coming to the funeral home, and the police did not get a warrant for their actions.

    Although the detectives’ actions have been criticized as unethical, they are legal because dead people have fewer rights than the living, especially concerning privacy and search and seizure. The courts have split on whether living defendants can be forced to use biometrics such as fingerprints or facial scans to unlock their mobile devices. (Another difference from the Phillip case is that those court cases involved warrants.) Read more »

    Fitness Apps Can Be Fun, But Who Else Is Seeing Your Personal Data?

    Wednesday, March 28th, 2018

    Recently, an Australian student publicized that Strava, a fitness app, had published a Global Heat Map online that “uses satellite information to map the locations and movements of subscribers to the company’s fitness service over a two-year period, by illuminating areas of activity,” according to the Washington Post. Strava “allows millions of users to time and map their workouts and to post them online for friends to see, and it can track their movements at other times,” the New York Times reports.
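
    To make the mechanism concrete, here is a minimal Python sketch of how activity points can be binned into a heat map. The coordinates, grid size, and function name are hypothetical; this illustrates the general technique, not Strava’s actual pipeline.

        from collections import Counter

        # Hypothetical sample of (latitude, longitude) points from workouts.
        activity_points = [
            (38.8977, -77.0365), (38.8978, -77.0366), (38.8977, -77.0365),
            (34.0522, -118.2437),
        ]

        def heat_map(points, cell_size=0.001):
            """Count activity per grid cell; busier cells render brighter.

            Rounding coordinates onto a grid is what makes repeated
            activity, such as a daily jog, stand out on the map.
            """
            cells = Counter()
            for lat, lon in points:
                cell = (round(lat / cell_size), round(lon / cell_size))
                cells[cell] += 1
            return cells

        for cell, count in heat_map(activity_points).most_common(3):
            print(cell, count)

    Even with coarse cells, repeated workouts by a handful of users are enough to light up an otherwise quiet region of the map.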

    The data, culled from Strava’s 27 million users (who own Fitbits and other wearable fitness devices), is not updated in real time. Yet the map still raised privacy and security questions for Strava’s users.

    A similar case in 2011, concerning the wearable device Fitbit, also raised privacy questions about searchable fitness data. There was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. And in 2014, Jawbone faced criticism after it published data about how many people wearing its fitness tracker woke up during an earthquake in Northern California. People questioned whether Jawbone’s privacy and data-sharing policies had disclosed such use of their health data.

    Fitness devices, including smartwatches, and mobile health or wellness apps are used by tens of millions of people worldwide, and many such apps are available in Apple’s and Google’s app stores. The data gathered can reveal much personal information about individuals. In the case of Strava, patterns of activity could be tracked across the full two years of published data. Read more »

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease screening, which covers about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The participants in the study have chosen to participate with, presumably, complete knowledge of the answers to these questions. But consider if screening for 1,800 conditions, rather than the current 30, became the legal standard: that would generate a significant amount of highly personal information about every newborn, raising substantial privacy issues.

    BabySeq co-director Dr. Robert Green has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict the information’s accurate.” Read more »

    You Could Be Penalized for Refusing to Give Genetic Data to Your Employer

    Thursday, March 16th, 2017

    In 2008, President George W. Bush signed the Genetic Information Nondiscrimination Act (GINA), Pub. L. 110-233. GINA restricts the collection and use of genetic information in a number of ways: it prohibits health insurance providers and employers from requiring genetic testing, and under the federal law, genetic data cannot be used to determine insurance premiums, eligibility for insurance, or employment.

    States have also passed laws to protect individuals’ genetic privacy. Shortly after the passage of GINA, Illinois passed what would become Public Act 095-0927 (pdf), “An Act concerning health,” which strengthened privacy protections already in place under the Illinois Genetic Information Privacy Act of 1998. And in 2011, California Gov. Jerry Brown (D) signed SB 559, the California Genetic Information Nondiscrimination Act (CalGINA) (pdf). Going beyond the federal GINA, CalGINA also prohibits genetic discrimination in housing, mortgage lending, employment, health insurance coverage, life insurance coverage, education, public accommodations, and elections.

    These laws are meant to protect employees’ privacy from employer access and to shield them from discrimination based on their genetic data, but the federal GINA could be undermined if a bill being considered in Congress becomes law. Read more »

    New Year? Time for a New Assessment of Your Privacy Setup.

    Tuesday, January 17th, 2017

    People use a lot of services and devices to transmit and retain sensitive personal information. In a single day, a person might use a work computer, a personal computer, multiple email addresses, a work cellphone, a personal cellphone, an e-reader or tablet, a fitness tracker or smartwatch, and an artificial-intelligence assistant (Amazon’s Echo, Apple’s Siri, Google’s Assistant, or Microsoft’s Cortana). The data retained or transmitted on these services and devices could include sensitive medical or other information, personal photos, financial data, and more.

    There’s also the issue of inference: data collected for one purpose can reveal other facts entirely. For example, I wrote recently about health-app data and the surprising results of scrutinizing it. A man was alarmed by his wife’s heart rate data, as collected by her Fitbit, and asked others for assistance analyzing it. One theory: she could be pregnant. Did you know that heart-rate changes could signal a pregnancy?
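
    As a rough illustration of how easily such an inference can fall out of the data, the Python sketch below flags a sustained rise in resting heart rate against a rolling baseline. The readings, window, and threshold are invented for the example; a real analysis would be far more careful.

        # Hypothetical daily resting heart rates (beats per minute).
        resting_hr = [62, 61, 63, 62, 61, 62, 70, 72, 73, 74, 75, 74]

        def flag_sustained_rise(series, baseline_days=5, threshold_bpm=8):
            """Flag days whose resting heart rate runs well above the
            average of the preceding baseline_days readings."""
            flags = []
            for i in range(baseline_days, len(series)):
                baseline = sum(series[i - baseline_days:i]) / baseline_days
                if series[i] - baseline >= threshold_bpm:
                    flags.append(i)
            return flags

        print(flag_sustained_rise(resting_hr))  # indices of unusual days

    The point is not medical accuracy; it is that a few lines of arithmetic over passively collected data can surface something as sensitive as a possible pregnancy.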

    There’s also ongoing controversy concerning the data possibly collected by an Amazon Echo. The Washington Post explains, “The Echo is equipped with seven microphones and responds to a ‘wake word,’ most commonly ‘Alexa.’ When it detects the wake word, it begins streaming audio to the cloud, including a fraction of a second of audio before the wake word, according to the Amazon website. A recording and transcription of the audio is logged and stored in the Amazon Alexa app and must be manually deleted later.” Arkansas police have served Amazon with a warrant seeking information recorded by a suspect’s Echo; Amazon has refused to comply. Read more »
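
    The “fraction of a second of audio before the wake word” is possible because such a device keeps a short rolling buffer of recent audio at all times. Here is a minimal Python sketch of that pre-roll pattern; the frame size, buffer length, and detector are assumptions for illustration, not Amazon’s implementation.

        from collections import deque

        PRE_ROLL_FRAMES = 10  # e.g., roughly 0.5 s at 50 ms per frame

        def stream_with_preroll(frames, is_wake_word):
            """Yield audio for upload only after the wake word, prefixed
            with the frames buffered just before the detection."""
            buffer = deque(maxlen=PRE_ROLL_FRAMES)  # rolling pre-roll buffer
            triggered = False
            for frame in frames:
                if triggered:
                    yield frame           # live streaming after detection
                elif is_wake_word(frame):
                    triggered = True
                    yield from buffer     # flush the audio from just before
                    yield frame
                else:
                    buffer.append(frame)  # keep only the most recent frames

    The privacy-relevant detail is the final branch: the microphone is captured continuously, and the only question is how much of that audio is retained and when it leaves the device.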

    As biometrics use expands, privacy questions continue to fester

    Tuesday, April 19th, 2016

    As the costs of the technologies fall, biometric identification tools — such as fingerprint, iris or voice-recognition scanners — are increasingly being used in everyday life. Significant privacy questions arise as biometric data is collected and used, sometimes without the knowledge or consent of the individuals being scanned.

    Biometrics use has become more commonplace. Many smartphones, including iPhones, have fingerprint “touch” ID scanners that people can use instead of numeric passcodes. And law enforcement personnel have been using fingerprint scanners for years, both domestically and internationally. In the past few years, we’ve seen banks capture customers’ voice prints in order, the institutions say, to fight fraud, and gyms ask members to identify themselves using their fingerprints. Reuters recently reported that companies are seeking to expand fingerprint-identification systems to credit cards and railway commuters.

    And the voluntariness of a person submitting his or her biometric data has also been questioned. Do you realize when you’re calling your bank that you’re handing over your voice print? Another situation a few years ago in Washington, D.C., also raised the issue of voluntariness. The District considered requiring that all visitors to its jail “have their fingerprints scanned and checked against law enforcement databases for outstanding warrants.” So if you wanted to visit a friend or relative in the D.C. jail, you would have had to submit your biometric data. The plan was dropped after strong criticism from the public and civil rights groups.

    Your biometric data can be gathered for any number of innocuous reasons. For example, I had to submit my fingerprints to obtain my law license, not because of a crime. Family members, roommates and business colleagues of crime victims have submitted fingerprints in order to rule out “innocent” fingerprints at a crime scene in a home or workplace. Some “trusted traveler” airport programs gather iris scans. Some companies use iris-recognition technology for their security systems. Read more »