The recent Strava story, in which the company inadvertently revealed exercise data from users of its app, underlined the importance of good data privacy practices for both users and developers.
Strava is an exercise app that allows users to track their fitness activities via GPS devices (such as phones, watches, head units and heart rate monitors), analyse performance and compare their activity with that of their friends.
The app is also interactive: the Premium safety feature Strava Beacon lets users share their location in real time, and a feature called ‘use popularity’ lets them build a route of their own or have someone else’s activity sent to their phone or GPS device.
Heatmaps, the feature behind ‘fit leaking’ (a term coined by researcher John Scott-Railton for situations where fitness activities, recorded for personal benefit, emit signals that reveal sensitive and confidential information), are also built on user-generated data. They give users an instant view of the routes that attract the most activity.
Despite offering users activity and profile privacy controls, including the ability to opt out of heatmaps altogether, and despite aggregating and de-identifying information, sensitive locations can still be identified inadvertently when Strava members share their locations in areas with low activity density.
The heatmap visualisation shows patterns of movement, but it also reveals unusual activity in places, such as military facilities, that would otherwise appear quiet.
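To see why aggregation alone does not protect users in low-density areas, consider a minimal sketch (this is a hypothetical illustration, not Strava's actual pipeline): GPS points are counted into a coarse grid, as a heatmap does, yet in a remote region the "aggregate" is really just one person's route.

```python
# Hypothetical sketch of heatmap aggregation, not Strava's real algorithm.
from collections import Counter

def heatmap(points, cell=0.01):
    """Count activity points per grid cell (the 'aggregated' view)."""
    grid = Counter()
    for lat, lon in points:
        grid[(round(lat / cell), round(lon / cell))] += 1
    return grid

# A busy city: many users' points pile into the same few cells,
# so no single cell can be traced back to one person.
city = [(51.5000 + i * 0.0001, -0.1000) for i in range(200)]

# A remote site: one user's run is the only activity for miles around.
remote = [(34.0000 + i * 0.0100, 65.0000 + i * 0.0100) for i in range(5)]

grid = heatmap(city + remote)

# Cells touched only once betray an individual's exact path.
lone_cells = [c for c, n in grid.items() if n == 1]
print(len(lone_cells))  # prints 5: the remote route appears as a chain of lone cells
```

Even though the output is only per-cell counts, with no usernames or timestamps, the chain of low-count cells in an otherwise empty region traces the remote user's route exactly.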
In situations such as this, managing data privacy settings properly concerns not only individual privacy but also national security.
Strava’s response to the leak was to highlight the app’s privacy and safety tools, and to simplify its privacy and safety features so that users can better control their data.
However, this shifts responsibility onto the users who failed to manage their data privacy well. User responsibility matters, but it is the company’s job to offer easy-to-understand, user-friendly tools and to ensure that its outputs follow data ethics. A lack of transparency about how the service operates deters users from adopting good data privacy practices.
Revealing as little information as possible seems to be a common problem in the digital tech sector, especially where trade secrets are treated as sources of profit.
In the 1980s, when the free and open source software movement started, the fight was against proprietary software, which is black-boxed and locks users into a particular product.
Free software activists believe that if users can read, study, modify and share source code freely, they can take better control over their computers.
After 30 years, the free software movement has shifted its focus towards developing an Internet-based computing environment that puts users’ privacy first, advocating that the liberty of the data owner should take priority over the profit of the service provider.
The concept of ‘taking back control’ applies to the Strava situation. If users understand how a piece of technology works and how their data powers the heatmap service, they are more likely to be motivated to take better control of the technology they use and the data they own. The hacker ethic, which holds that sharing information and data responsibly is beneficial and helpful, also comes into play here.
Hackers challenge the status quo by using alternative methods to ‘break’ technologies. If the company respects and recognises this disruption, it opens up a constructive dialogue for improvement.
What the Strava incident has taught us is that no service is flawless. Crowd-sourced, user-generated data, despite being anonymised, aggregated, de-identified and processed, can still be exploited by anyone who applies common sense to the visualised heatmaps.
We need to continue to test and question the boundaries of privacy and security. When technologies are black-boxed and proprietary, and when information about a service is not transparent, challenges from ‘citizen hackers’ are more important than ever to reveal the dark sides of such social technologies.
Yuwei Lin is a senior lecturer in the Division of Communications, Media and Culture in the Faculty of Arts and Humanities at the University of Stirling.