Most of the 22 million people who reportedly use Fitbits or other fitness trackers probably aren’t thinking about what their daily jog or morning walk to work might mean for international security. But a recent, startling revelation about what people’s fitness stats are revealing to the world is poised to change that — and maybe even to permanently alter our relationship to data privacy.
Over the weekend, news broke that a fitness-centered social media app called Strava had quietly, inadvertently divulged the locations of secret military bases around the world via its recently updated global “heat map.” Anyone who wears a fitness tracker like a Fitbit and uses the Strava app can make their location data public through the app, which allows that data to be included in the company’s global heat map.
As it turns out, when you put enough soldiers in one place and they exercise in the same locations every day, their aggregated heat map can reveal things over time that no one was expecting. For example, here’s a Strava-generated heat map (note the blue lines) of soldiers using the app on either side of the Korean Demilitarized Zone:
The location of soldiers stationed in sensitive areas of the world isn’t the type of info that any country’s military would typically want to have readily available online, for any curious onlooker to see. But that’s exactly what happened with Strava, and exactly why many technology researchers are now calling for a serious reevaluation of how we think about sharing our personal data.
It’s a development tied to concerns about privacy, data collection, and aggregation that researchers have been raising for years. But the revelation that data sharing that may have seemed harmless in isolation could, in aggregate, upend closely guarded government secrets was a wake-up call to many people who had never considered the larger ramifications of sharing their location data with apps like Strava.
Strava’s heat map has been online for months. But something curious suddenly emerged when a student military analyst took a closer look.
Strava was launched in 2008 by two Harvard graduates who wanted to combine social media sharing with advances in fitness data tracking. The app allows users to compare their stats against those of other Strava users who’ve worked out in the same location. It’s been written about swooningly in the past as a boon to athletes looking to challenge themselves while competing in a kind of real-life online role-playing game; your opponents are other athletes like you, and the game is your daily exercise routine.
Strava first made its global heat map public in 2015 and then updated it this past November, complete with boasts that it now contained six times as much data as before, as well as stunning images of human-powered activity rendered as heat. (You can see Burning Man, the company blog proudly pointed out.) Even if you downloaded Strava months ago, used it once, and then never opened it again, your fitness data might have been part of the 3 trillion GPS points that went into Strava’s heat map update, culled from the app’s reported 27 million users. And since making your location data public is something you have to opt out of rather than opt into (and opting out is somewhat difficult, at that), it’s possible that many Strava users never knew their location data was being collected to begin with.
At first, Strava’s heat map didn’t garner much notice; it was just a nifty feature of the app that was fun to play around with, while the rest of the app did the heavy lifting, allowing people to connect with each other through fitness challenges and location-based leaderboards.
But then a 20-year-old international security student named Nathan Ruser realized that the 3 trillion data points used to build the heat map were so specific that they could reveal sensitive military information. Discussing his findings on Twitter, Ruser pointed out that the map revealed the apparent locations of secret military bases owned by the US and other governments in places like Russia, Afghanistan, and Turkey, noting that it seemed to pose an unprecedented security liability to soldiers:
Security analysts quickly picked up the thread:
Anyone, it seemed, could check out Strava’s heat map and find clear images of locations that had been restricted on other satellite maps:
In fact, Strava’s heat map was so granular that people could glean significant details about such compounds, going well beyond their general location. As the Guardian pointed out, “Zooming in on one of the larger bases clearly reveals its internal layout, as mapped out by the tracked jogging routes of numerous soldiers.”
Meanwhile, the Daily Beast speculated that because Strava tracks individual user data, the ability to tie an individual soldier’s daily fitness routine to their location could aid international military or espionage operations to a previously unseen degree.
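The underlying technique is well known in privacy research: one person’s repeated traces tend to cluster at the places where they sleep and work. As a purely hypothetical illustration (invented data and a made-up helper function, not anyone’s real exploit), consider how simply tallying the start points of a single user’s runs could expose their likely home base:

```python
from collections import Counter

# Hypothetical sketch of route-endpoint analysis, using invented data:
# where one user's workouts repeatedly begin is probably where they live
# or, for a deployed soldier, where they are stationed.

def likely_base(tracks, precision=3):
    """Return the most common rounded start point across a user's tracks."""
    starts = Counter(
        (round(lat, precision), round(lon, precision))
        for (lat, lon), *_ in tracks  # first GPS point of each track
    )
    return starts.most_common(1)[0]

# Three runs that all begin near the same (made-up) coordinates.
runs = [
    [(34.5101, 69.1802), (34.5150, 69.1850)],
    [(34.5099, 69.1798), (34.5160, 69.1860)],
    [(34.5102, 69.1801), (34.5140, 69.1840)],
]
print(likely_base(runs))  # ((34.51, 69.18), 3): same start point, three times
```

Rounding to three decimal places groups points that sit within roughly a hundred meters of one another, which is more than enough to flag a particular building or compound.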
In reaction to the news, the international coalition against ISIS issued a statement to the press declaring that it would be revising its guidelines regarding the use of fitness trackers.
“The rapid development of new and innovative information technologies enhances the quality of our lives but also poses potential challenges to operational security and force protection,” the coalition told the Washington Post.
Strava made it clear, both in its initial announcement about the heat map update in November and in a later statement released to CNN, that the app “excludes activities that have been marked as private and user-defined privacy zones.” It also published a blog post urging users to double-check their privacy settings and, after a full day had passed, issued a follow-up blog post in which it explicitly took responsibility for the way it used “the data you share with us.” Strava further emphasized in that post that it is “committed to working with military and government officials to address potentially sensitive data.”
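For context, a privacy zone of the kind Strava describes amounts to filtering out GPS points near a user-designated location before anything is shared. The sketch below is a minimal, assumed version of such a filter, not Strava’s actual implementation; the distance function is a rough equirectangular approximation that is adequate at these scales:

```python
import math

# Assumed sketch of a "privacy zone" filter, not Strava's implementation:
# drop any GPS point that falls within a given radius of a protected spot.

EARTH_RADIUS_M = 6_371_000

def distance_m(p, q):
    """Approximate ground distance in meters between two (lat, lon) points,
    using the equirectangular approximation (fine at city scales)."""
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * EARTH_RADIUS_M

def strip_privacy_zones(track, zones, radius_m=500):
    """Keep only the points that lie outside every privacy zone."""
    return [
        point for point in track
        if all(distance_m(point, zone) > radius_m for zone in zones)
    ]

home = (40.7410, -73.9896)  # invented sensitive location
track = [(40.7410 + i * 0.001, -73.9896) for i in range(10)]
print(len(strip_privacy_zones(track, [home])))  # 5 points survive the filter
```

Of course, a filter like this only protects locations a user has thought to designate in advance, which is part of why “check your privacy settings” struck many critics as an inadequate answer.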
But while Strava’s response and the push for action from international coalitions might suggest there’s no need for panic, the entire scenario highlights a huge problem with the current state of data collection — one that some technology researchers have been debating for years.
The Strava case demonstrates a real potential problem with using individual consent for collective data mapping
The basic concept of data privacy is built on the assumption that it’s possible to consent to the way your data is being used. The Strava case makes it clear that this assumption may be outdated, and it has strengthened a belief held by many in the data privacy sphere: individuals can’t meaningfully consent to uses of their data they don’t yet know about.
To generate its heat map, Strava used every piece of location data that had ever been shared with it, by every user who’d ever downloaded the app and consented to its privacy policies and data collection. (And because the data was already being stored and compiled by the app, the map itself cost only a few hundred dollars to produce.)
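It helps to see how cheap and mechanical that aggregation step is. The following is a minimal sketch in Python (hypothetical, not Strava’s pipeline): snap every GPS point to a grid cell and count traversals, and routes shared by many individuals light up even though no single user’s track looks revealing on its own.

```python
from collections import Counter

# Hypothetical illustration of heat-map aggregation; not Strava's real code.
# Each user contributes tracks: lists of (lat, lon) GPS points from workouts.

CELL_SIZE = 0.0001  # grid resolution in degrees, roughly 10 meters

def to_cell(lat, lon, cell_size=CELL_SIZE):
    """Snap a GPS point to a discrete grid cell."""
    return (round(lat / cell_size), round(lon / cell_size))

def build_heat_map(all_user_tracks):
    """Count how many times any user passed through each grid cell."""
    heat = Counter()
    for track in all_user_tracks:
        for lat, lon in track:
            heat[to_cell(lat, lon)] += 1
    return heat

# Two invented "soldiers" jogging the same perimeter: each track is mundane
# on its own, but the shared route dominates the aggregate.
track_a = [(34.5100 + i * 0.0001, 69.1800) for i in range(50)]
track_b = [(34.5100 + i * 0.0001, 69.1800) for i in range(50)]
heat = build_heat_map([track_a, track_b])
hot = [cell for cell, count in heat.items() if count > 1]
print(f"{len(hot)} grid cells traversed by more than one user")
```

At global scale the counts feed a map renderer rather than a print statement, but the privacy dynamic is already visible here: the signal comes from the overlap between users, not from any one person’s contribution.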
Think about that for a moment. Most of us probably don’t remember what apps we had on our phones a decade ago, or even five years ago. If you downloaded Strava when it launched in 2008 and never really got into it, today you may not even remember that you were ever part of its user base. What’s more, most of us probably aren’t considering, whenever we think, ‘Sure, share my location data,’ that 10 years out, that location data might be used in totally unexpected ways.
But that’s just what happened with Strava. As a team of data privacy researchers explained succinctly in a 2015 article, the problem is that data you agreed to share in 2008 could increasingly be repurposed and repackaged for uses you could never have foreseen in 2018.
And as tech writer and academic Zeynep Tufekci spelled out on Twitter, the Strava incident is a perfect case study for the implications of such unknowable outcomes, as it highlights a major flaw in thinking about what data our smartphone and wearable apps are collecting only on an individual level:
Tufekci also blasted Strava’s response as deflection, declaring that “‘Check your privacy settings’ — which is your standard Silicon Valley response — is no response.”
“In the digital age, data privacy simply cannot be negotiated and consented to at the individual level,” Tufekci said.
By the same token, the Strava case shouldn’t be viewed simply as a problem for Strava. This particular app’s data collection methods and use-case scenarios are no more or less ominous than those of any other app out there. Regardless of the purpose and original scope of the app, and the perfectly innocent reason the app could want to collect your data on an individual level, it’s essentially impossible to predict what that data might reveal once you line it up alongside similar data from millions of other users — or how that information might wind up being used many years into the future.
What is clear is that incidents like this one highlight the need for enhanced scrutiny and critical thinking from everyone involved: app developers, researchers, and everyday people agreeing to share their data. Our technology is growing ever more sophisticated, while our ways of thinking about that technology are lagging. The questions to ponder, as you think about how apps might be using your private info, aren’t limited to “How is my data being used?” They now have to include “How could my data be used in the future?”
And if the future potential isn’t clear (and it probably won’t be), it might be best to opt out.