
    Archive for the ‘Medical data’ Category

    Insider Access to Sensitive Data Must Be Carefully Controlled to Avoid Security Threats

    Friday, May 31st, 2019

    Recently, a news report said employees of multimedia messaging app Snapchat were using internal tools to violate the privacy rights of users, shining a light on the security threat that can arise from knowledgeable insiders. But the problem of insiders misusing or abusing their access privileges in order to invade the privacy rights of individuals is not new. 

    In Snapchat’s case, Motherboard reported: “Several departments inside social media giant Snap have dedicated tools for accessing user data, and multiple employees have abused their privileged access to spy on Snapchat users.” Sources and emails obtained by the news outlet, “described internal tools that allowed Snap employees at the time to access user data, including in some cases location information, their own saved Snaps and personal information such as phone numbers and email addresses. Snaps are photos or videos that, if not saved, typically disappear after being received (or after 24 hours if posted to a user’s Story).”

    But Snapchat is hardly the first private company to face problems with employees abusing or misusing their security access privileges to violate customers’ privacy. And it is not just technology companies facing these issues. 

In 2014, the Indiana Court of Appeals upheld a jury’s verdict against Walgreen over a pharmacy employee who accessed the medical record of a customer and gave the prescription information to the customer’s ex-boyfriend, whom the employee was dating. In the case, Hinchy v. Walgreen Co., et al. (pdf), Walgreen was found liable for negligent supervision and retention and for invasion of privacy. In 2015, the court, upon rehearing, affirmed the original decision (pdf).


    States Are Taking Privacy Into Their Own Hands

    Tuesday, April 30th, 2019

When people consider data protection officers and privacy regulators, they mostly think about foreign agencies that have made headlines with their battles to protect sensitive personal information from misuse or abuse, such as the U.K. Information Commissioner’s Office (ICO) or France’s Commission nationale de l’informatique et des libertés (CNIL). In January, the CNIL fined Google 50 million euros “in accordance with the General Data Protection Regulation (GDPR), for lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.” And earlier this month, the ICO fined Bounty UK Limited 400,000 pounds because the pregnancy and parenting club “illegally shar[ed] personal information belonging to more than 14 million people.” Last year, the Hong Kong privacy commissioner launched an investigation into “the massive data breach at Cathay Pacific Airways that affected millions of its passengers.”

Although these data protection agencies face limits on their efforts in many ways, and there are questions about the adequacy of some of them, it is notable that these countries have a national agency to handle the privacy and security of sensitive personal data. They also have data protection officers at lower levels of government.

In the United States, there is no single data protection agency at the federal level. The responsibility is splintered, and the agencies’ power can be handicapped. Some of the agencies include the Privacy and Civil Liberties Oversight Board (PCLOB), the Department of Homeland Security’s Privacy Office, the Department of Health and Human Services, and the Federal Trade Commission.

The PCLOB was recommended by the 9/11 Commission, and the board was created in 2004 and placed within the White House. In 2008, Congress passed and President Bush signed the “Implementing the 9/11 Commission Recommendations Act of 2007,” which took the Privacy and Civil Liberties Oversight Board out of the White House and established it “as an independent agency within the executive branch.” Although it has been hobbled throughout its history by vacancies, it has released reports on the National Security Agency’s bulk telephone records surveillance program and on the surveillance program conducted under Section 702 of FISA.


    One Insurance Company Is Betting Big on Customers Giving Up Personal Health-Tracking Data

    Tuesday, September 25th, 2018

As more people use personal fitness devices, such as Fitbits, or health-tracking apps, such as Strava, concern about individual medical privacy has grown as the data is gathered and used, sometimes for purposes of which the runners or cyclists were unaware. People have questioned where this data collection could lead.

Recently, U.S. life insurance giant John Hancock announced one path for fitness-tracking data: setting life insurance rates. Beginning next year, John Hancock, in partnership with Vitality Group, “will stop underwriting traditional life insurance and instead sell only interactive policies that track fitness and health data through wearable devices and smartphones,” Reuters reported. “Policyholders score premium discounts for hitting exercise targets tracked on wearable devices such as a Fitbit or Apple Watch and get gift cards for retail stores and other perks by logging their workouts and healthy food purchases in an app.”

Currently, John Hancock’s program is voluntary, and numerous other life insurance companies offer traditional policies, which do not involve constantly tracking individuals’ health and fitness information through wearable devices. But how long before that changes, and more and more people are pressured to give up such personal, daily data in order to have policies that protect their families?

    After Death, Who Can Access Your Fingerprints for Security Issues?

    Thursday, April 26th, 2018

    Two Florida detectives tried to use a dead man’s fingerprints to unlock his phone, the Tampa Bay Times reported, and that act raised privacy questions.

    Linus F. Phillip “was shot and killed [by a Largo, Fla., police officer] March 23 at a Wawa gas station after police said he tried to drive away when an officer was about to search him,” the Times reported. Later, two detectives came to the Sylvan Abbey Funeral Home in Clearwater with Phillip’s phone, according to Phillip’s fiancee, Victoria Armstrong. “They were taken to Phillip’s corpse. Then, they tried to unlock the phone by holding the body’s hands up to the phone’s fingerprint sensor,” the Times reported.

Phillip’s fiancee was upset. She was not notified that the detectives would be coming to the funeral home, and the police did not get a warrant for their actions.

Although the detectives’ actions have been criticized as unethical, they are legal because dead people have fewer rights than the living, especially concerning privacy and search and seizure. The courts have split on whether living defendants can be forced to use biometrics such as fingerprints or facial scans to unlock their mobile devices. (Another difference from the Phillip case is that these court cases involved warrants.)

    Fitness Apps Can Be Fun, But Who Else Is Seeing Your Personal Data?

    Wednesday, March 28th, 2018

    Recently, an Australian student publicized that Strava, a fitness app, had published online a Global Heat Map that “uses satellite information to map the locations and movements of subscribers to the company’s fitness service over a two-year period, by illuminating areas of activity,” according to the Washington Post. Strava “allows millions of users to time and map their workouts and to post them online for friends to see, and it can track their movements at other times,” the New York Times reports.

The data, culled from Strava’s 27 million users (who own Fitbits and other wearable fitness devices), is not updated in real time. Yet the map still raised privacy and security questions for Strava’s users.

    A similar case in 2011 concerning wearable device Fitbit also raised privacy questions about searchable fitness data. There was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. And in 2014, Jawbone faced criticism after it published data about how many people wearing its fitness tracker woke up during an earthquake in Northern California. People questioned whether Jawbone’s privacy and data-sharing policies had disclosed such use of their health data.

Fitness devices, including smartwatches, and mobile health or wellness apps are used by tens of millions of people worldwide. There are many such apps available in Apple’s and Google’s app stores. The data gathered can reveal much personal information about individuals. In Strava’s case, one could trace patterns of activity across the two years’ worth of data.

    What If the Rules About Newborn Blood Screenings Changed?

    Thursday, October 26th, 2017

    There has been an ongoing privacy and ethics debate over the unauthorized or undisclosed use of newborns’ blood samples for purposes other than the standard disease-screening, which includes about 30 conditions. Now, there’s a trial study, called BabySeq, from Brigham and Women’s Hospital that “uses genomic sequencing to screen for about 1,800 conditions, including some cancers,” CBS Miami reports.

    The privacy questions are clear: What happens to the DNA data — who keeps it, in what form, for how long — and who has access to it? The participants in the study have chosen to participate with, presumably, complete knowledge of the answers to these questions. But consider if the screening of 1,800 conditions, rather than the current 30, became the legal standard. This is a significant amount of highly personal information and there are substantial privacy issues.

BabySeq co-director, Dr. Robert Green, has raised some of these issues. “We can’t predict what kind of discrimination is going to be occurring by the time your child grows up,” Green said. “We can’t predict whether there’s some sort of privacy breaches, this information gets out and is used against your child in some sort of future scenario. And we, most importantly, we can’t predict the information’s accurate.”