Lots of people use personal health devices, such as Fitbits, or mobile health or wellness apps (there are a variety offered through Apple’s and Google’s app stores). There are important privacy and security questions about these devices and apps, because the data they gather can be sensitive — disease status, medication usage, glucose levels, fertility data, or location information as the devices track your every step on the way to your 10,000-steps-per-day goal. And the medical inferences that can be drawn from such information may surprise people, even the individuals using the apps and devices.
For example, one man was concerned after reviewing his wife’s Fitbit data. He “noticed her heart rate was well above normal.” He thought the device might be malfunctioning, so he posted the data on the message-board site Reddit and asked for help analyzing it. One person theorized that his wife might be pregnant. The couple made a doctor’s appointment and confirmed the pregnancy.
This case illustrates the sensitive medical data gathered by personal medical devices and apps that a person might not even realize is possible. Did you know that heart-rate changes could signal a pregnancy?
And this isn’t the first time that sensitive information of Fitbit users has been inadvertently revealed. Five years ago, there was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches.
Forbes reported that Fitbit “has historically made users’ profiles and activity public by default, to encourage social sharing and competitiveness,” which led to the sexual activity data being public. Fitbit raced to make changes, telling Forbes, “We have also updated our default settings for new users for activity sharing to ‘private.’ ”
There’s also the question about who else knows the personal medical information gathered by your health-tracking device or app. It’s important to know what the privacy policies are for these devices and apps.
Recently, researchers from the Illinois Institute of Technology Chicago-Kent College of Law published a study of 211 diabetes apps in the Google Play store, “Privacy Policies of Android Diabetes Apps and Sharing of Health Information.” The research, conducted in 2014, found: “Most of the 211 apps (81%) did not have privacy policies. Of the 41 apps (19%) with privacy policies, not all of the provisions actually protected privacy (eg, 80.5% collected user data and 48.8% shared data). … Only 4 policies said they would ask users for permission to share data.” In some cases, the researchers found, the user data could be shared with partners and/or third parties and used for advertising purposes. The researchers also did a transmission analysis of 65 of the 211 apps. They found: “Sensitive health information from diabetes apps (eg, insulin and blood glucose levels) was routinely collected and shared with third parties, with 56 of 65 apps (86.2%) placing tracking cookies.”
And in 2013, the Privacy Rights Clearinghouse reviewed the privacy policies of 43 health and fitness apps. PRC concluded: “It is clear that there are considerable privacy risks for users. … Consumers should not assume any of their data is private in the mobile app environment — even health data that they consider sensitive.” In its technical review, the organization found:
- Many apps send data in the clear (unencrypted) without user knowledge.
- Many apps connect to several third-party sites without user knowledge.
- Unencrypted connections potentially expose sensitive and embarrassing data to everyone on a network.
- Nearly three-fourths, or 72%, of the apps we assessed presented medium (32%) to high (40%) risk regarding personal privacy.
- The apps that presented the lowest privacy risk to users were paid apps, primarily because they don’t rely solely on advertising to make money, which means the data is less likely to be available to other parties.
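To see why “sending data in the clear” matters, here is a minimal illustration (not from the PRC report; the endpoint and field names are hypothetical) of what an unencrypted app request looks like on the wire. Anyone on the same network can read these bytes verbatim; only an encrypted (HTTPS/TLS) connection would hide them.

```python
import json

# Hypothetical health readings a tracking app might upload
health_data = {"glucose_mg_dl": 142, "insulin_units": 6}

body = json.dumps(health_data)

# A plain-HTTP POST request, as it would appear to anyone
# observing traffic on the network (host is made up)
raw_request = (
    "POST /api/v1/log HTTP/1.1\r\n"
    "Host: example-health-app.test\r\n"
    "Content-Type: application/json\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n"
    f"{body}"
)

# The sensitive values are visible in plaintext:
print("glucose_mg_dl" in raw_request)  # True
print("142" in raw_request)            # True
```

With HTTPS, the same request would be wrapped in TLS before leaving the device, so an on-network observer would see only encrypted bytes, not the glucose reading itself.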
Privacy policies can also change over time, sometimes to permit a wider range of data sharing. One fitness app’s updated policy, for example, says: “We may share information, including personally identifying information, with our Affiliates (companies that are part of our corporate groups of companies, including but not limited to Facebook) to help provide, understand, and improve our Services.”
What about privacy laws? The problem is that the federal Health Insurance Portability and Accountability Act of 1996 does not apply to many health-tracking apps (and some fitness trackers), because the sensitive medical information is generated by users, not by entities covered by HIPAA (insurance plans, most medical providers and health-care clearinghouses) or by businesses associated with covered entities. (By the way, about six months ago, Fitbit announced that it had become HIPAA-compliant, noting that it could now “more effectively integrate with HIPAA-covered entities, including corporate wellness partners, health plans and self-insured employers.”)
Privacy advocates and government agencies have been wrestling with the problem of such data not being covered under HIPAA for years. In 2014, Federal Trade Commissioner Julie Brill said at an FTC event (PDF) on consumer-generated and -controlled health information: “There are some risks, I believe … with respect to health data flows that are appearing outside of HIPAA, outside of the medical context, and therefore outside of any regulatory regime that focuses specifically on health information.”
With the increasing usage of such personal fitness trackers and health apps, it is beyond time to update privacy laws to protect this sensitive information. People who are attempting to improve their health should not have to put their privacy at risk.