Recently, an Australian student revealed that Strava, a fitness app, had published online a Global Heat Map that “uses satellite information to map the locations and movements of subscribers to the company’s fitness service over a two-year period, by illuminating areas of activity,” according to the Washington Post. Strava “allows millions of users to time and map their workouts and to post them online for friends to see, and it can track their movements at other times,” the New York Times reports.
The data, culled from Strava’s 27 million users (owners of Fitbits and other wearable fitness devices), is not updated in real time. Yet the map still raised privacy and security questions for Strava’s users.
A similar case in 2011, concerning the wearable device Fitbit, also raised privacy questions about searchable fitness data. There was an uproar over Fitbit’s privacy settings when people who were logging their sexual activity as a form of exercise learned that the data was showing up in Google searches. And in 2014, Jawbone faced criticism after it published data about how many people wearing its fitness tracker woke up during an earthquake in Northern California. People questioned whether Jawbone’s privacy and data-sharing policies had disclosed such use of their health data.
Fitness devices, including smartwatches, and mobile health or wellness apps are used by tens of millions of people worldwide. There are many such apps available in Apple’s and Google’s app stores. The data gathered can reveal much personal information about individuals. In Strava’s case, viewers could trace patterns of activity across the map’s two years’ worth of data.
“Concentrations of light inside a base may indicate where troops live, eat or work, suggesting possible targets for enemies,” the Post reported. “Some analysts … warn that, although the map does not name the people who traced its squiggles and lines, individual users can easily be tracked, by cross-referencing their Strava data with other social media use. That could put individual members of the military at risk, even when they are not in war zones,” the Times said.
Some questioned whether Strava’s map had revealed previously unknown or sensitive military sites. After the public criticism, the company changed its heat map. Strava CEO James Quarles told Reuters the retooled map “will bar access to street-level details to anyone but registered Strava users.” Also, “Roads and trails with little activity will not show up on the revised map until several different users upload workouts in that area, the company said. The map will also be refreshed monthly to remove data people have made private.”
Aside from Strava’s map, there are other privacy and security implications for such health-tracking information. The data gathered can be sensitive – disease status, medication usage, glucose levels, fertility information, or location data as your steps are tracked throughout your day.
A 2016 case concerning Fitbit data showed just how revealing data from personal medical devices and apps can be. One man grew concerned after reviewing his wife’s Fitbit data: he “noticed her heart rate was well above normal.” Thinking the device might be malfunctioning, he posted the information on the message-board site Reddit and asked for analysis. One commenter theorized that his wife might be pregnant. The couple made a doctor’s appointment and confirmed the pregnancy. People who read about this sleuthing learned that heart-rate changes can signal a pregnancy.
Strava is just the latest example to show that people need to be well-informed about who is seeing their health data from these devices and apps. But that can be difficult to determine. In 2016, researchers from the Illinois Institute of Technology Chicago-Kent College of Law published a study of 211 diabetes apps in the Google Play store, “Privacy Policies of Android Diabetes Apps and Sharing of Health Information.” The research, conducted in 2014, found: “Most of the 211 apps (81%) did not have privacy policies. Of the 41 apps (19%) with privacy policies, not all of the provisions actually protected privacy (eg, 80.5% collected user data and 48.8% shared data). … Only 4 policies said they would ask users for permission to share data.” In some cases, the researchers found, the user data could be shared with partners and/or third parties and used for advertising purposes.
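To make those percentages concrete, here is a quick back-of-the-envelope conversion into approximate app counts. This is a sketch based only on the figures quoted above, not a calculation from the study itself; the rounded counts are approximations.

```python
# Rough counts derived from the diabetes-app study's reported percentages.
# (Approximations only; the study reports percentages, not all raw counts.)
total_apps = 211
with_policy = 41  # reported directly in the study (19%)

no_policy = round(total_apps * 0.81)        # apps with no privacy policy
collected = round(with_policy * 0.805)      # of the 41 WITH a policy, collected user data
shared = round(with_policy * 0.488)         # of the 41 WITH a policy, shared data

print(no_policy)   # about 171 of the 211 apps lacked a privacy policy
print(collected)   # about 33 of the 41 policies covered apps that collected data
print(shared)      # about 20 covered apps that shared data
```

In other words, only a small minority of the apps had any policy at all, and even among those, most still collected user data and roughly half shared it.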
There is also the problem that such apps and devices can change their privacy or data-sharing policies, so the burden is on the user to stay up to date. Federal privacy law does not offer much assistance, because in most cases involving wellness apps and some fitness devices, the federal Health Insurance Portability and Accountability Act of 1996 does not apply. This is because the sensitive medical information is generated by users, not by entities covered by HIPAA (insurance plans, most medical providers and health-care clearinghouses) or by businesses associated with covered entities.
Until laws are passed to better protect the privacy of this sensitive information, people should try to be vigilant about determining who sees their data and how it is being used or shared.