How far away are we from controlling our own personal data stores, allowing us to choose to share all or part of our data as we see fit?
It might seem a long way off, but technology will inexorably take us there. However, the pace of change will be affected by the public appetite for it and by the tech giants, whose business models are based on owning and using our data – and whose support for a wholesale shift in the control of data might bring turkeys and Christmas to mind.
Gillian Docherty, CEO of The Data Lab, thinks the Cambridge Analytica/Facebook scandal might have accelerated the pace of change by getting more people to think more carefully about who controls their data. “I think we are still exploring the consequences for the use of our data as individuals, as patients, as participants in society and so on,” she says.
“Different people have different risk profiles for the use of their data. One of the benefits, if that’s the right word, of the Cambridge Analytica/Facebook scandal is that people are now having the dialogue and are engaged in a conversation which would never have happened before. It made it real for people and made them ask questions.”
Docherty thinks publicity around the use of data has made more people consider their digital actions: “We’ve all ticked the ‘terms and conditions’ or ‘I accept cookies’ box without fully understanding the implications. More people have started asking ‘What does this mean?’”
Docherty explored the idea of a personal data store in a TED Talk in Glasgow in 2017, where she looked forward 20 years to a time when technology would dominate every facet of the life of her daughter Charley, who was six (and three-quarters) at the time. “She will have a personal data store and only she has access to that data and what it is used for,” Docherty says, suggesting the store could include very detailed personal health data – about diet, exercise and details of heart rate, blood pressure and other key medical information – which would be derived from a series of implanted chips and/or biometric tattoos.
Charley would have an AI personal assistant to help her access and use her personal data in a way that fitted in with her life and what mattered to her. “This might mean granting access to different elements of data to specific organisations,” says Docherty. “That might be banks, utility businesses, mobile phone providers, your local authority, your doctor and so on. We are already seeing elements of this in open banking, where personal data is being shared.”
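As a rough illustration of the idea Docherty describes (all class, field and organisation names below are invented for this sketch, not taken from any real system), a personal data store that grants per-organisation access to selected elements of data might look like:

```python
# Hypothetical sketch of a personal data store with per-organisation
# access grants. Names are illustrative only, not from a real system.

class PersonalDataStore:
    def __init__(self, data):
        self.data = data    # e.g. {"heart_rate": 72, "postcode": "G1 1AA"}
        self.grants = {}    # organisation -> set of permitted fields

    def grant(self, organisation, fields):
        """Allow an organisation to read the named fields."""
        self.grants.setdefault(organisation, set()).update(fields)

    def revoke(self, organisation):
        """Withdraw all access previously granted to an organisation."""
        self.grants.pop(organisation, None)

    def read(self, organisation, field):
        """Return a field only if the organisation holds a grant for it."""
        if field in self.grants.get(organisation, set()):
            return self.data[field]
        raise PermissionError(f"{organisation} has no grant for {field}")

# Usage: share heart rate with a doctor; the bank gets nothing.
store = PersonalDataStore({"heart_rate": 72, "postcode": "G1 1AA"})
store.grant("doctor", {"heart_rate"})
print(store.read("doctor", "heart_rate"))
```

The point of the sketch is that the individual, not the platform, holds the data and issues (or revokes) each grant – which is the reversal of today's model that Docherty goes on to discuss.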
So how close are we to personal data stores – or sovereign data profiles as they are sometimes called? “There is lots of interesting work going on around data trust, about putting data back in the hands of individuals and enabling them to authorise access to portions of their data, provided there is a value exchange where the consumer derives some benefit. I think we are likely to see much more of that,” says Docherty.
However, she thinks a wholesale change in approach to data access will take time. “That change in approach would mean putting data back in the hands (and data stores) of the individual – and moving from where we are now to that point involves quite a major shift. I think we will see elements of this and innovation in this space – but I’m not sure we have really reached out to the general public yet and that’s why I think there is a long way to go.
“Also, the big guys, like Facebook, are unlikely to support that – because at the moment, your data is their product, their business model. Facebook as a product is free, because our data is the product. Will we see changing business models – a future where we pay for Facebook but own our own data? I’m not sure, as it would be a complete pivot in terms of the business model. We are working through these conversations at the moment.”
Callum Sinclair, Partner and Head of Technology at legal firm Burness Paull, has expressed scepticism about the public’s desire to control their own data. He says: “Scandals involving big social media players like Facebook mean that the misuse of data is very much in the public consciousness – but are consumers prepared to do anything about that... to stop using a platform?
“In the case of social media giants, the answer at the moment seems to be no. I am concerned that the trust deficit is ebbing away with the next generation of digital natives, who are focused on convenience and don’t seem too worried about what’s going on with their data behind the scenes.
“There seems to be a lack of critical thinking and challenge, and I fear that it might take a major data scandal and actual harm before my children’s generation really wake up to that.”
Docherty agrees there is a tough balancing act between data privacy and day-to-day convenience – but argues a personal data store could actually be more convenient. “When you are changing suppliers – for a phone or banking or your utilities – it can still be quite time-consuming and you might need to remember multiple passwords or look for documents,” she says. “If all your information was in a single data store or ‘pod’, it would be massively more convenient. Ironically, we have seen pressure to make things easier for consumers coming from the regulatory side, especially around utilities and banking. It’s a complex web of issues around convenience, fresh opportunities and privacy.”
In terms of the work going on in this area, Docherty highlights Mydex, which says its mission is “to empower individuals to manage their lives more effectively through convenient, trustworthy access and control of their personal data and how it is used by them and others”.
Solid, a project involving Sir Tim Berners-Lee, is working to help users and organisations separate their data from applications that use it – “to allow people to look at the same data with different apps at the same time [and] open brand new avenues for creativity, problem-solving, and commerce.”
Docherty continues: “There is a big appetite for it. We will start to see it becoming more available and more opportunities to participate, but people will need to vote with their feet if we are to see this wholesale change.”
In ODI we trust
The Open Data Institute (ODI) has carried out the first detailed research project into data trusts – “legal structures that provide independent stewardship of data” – which are effectively an organisational version of personal data stores.
Jeni Tennison, chief executive of the ODI, says: “We only unlock the full value of data when it gets used, so we need to find good ways to share it more widely without putting people at risk.”
The trustees of a data trust are responsible for deciding what data to share and with whom – to support the trust’s purpose and its intended benefits.
ODI research, funded by Westminster, launched three pilots in January to examine whether a data trust could increase access to data while retaining trust. They focused on diverse challenges: tackling the illegal wildlife trade, reducing food waste, and analysing public services in a London borough.
The research found enthusiasm from the private, public and third sector – and suggested there might be circumstances where governmental or philanthropic organisations should mandate or fund data trusts for specific global, national or local challenges, such as climate change, UN sustainable development goals or understanding the impact of online advertising.
It also called for data trusts to have robust governance processes that balance accountability to users with effective, timely decision-making.
Tennison says: “We’ve learnt a huge amount about how data trusts can help, but there’s more to do. We need to learn more about how they should be monitored, audited and regulated. We also need more research into data access models, such as data co-operatives, data commons and people-led data trusts, which may sometimes be more appropriate.”