Data Conference 2020: Unlocking confidence is crucial

Trust in the use of personal data is fragile and the public must be involved in discussions about how their information is utilised, The Scotsman’s Data Conference 2020 heard.
The public needs greater assurance over their data privacy, so that they can benefit from innovations that can improve their lives.

Professor Shannon Vallor, Baillie Gifford chair in the ethics of data and AI at Edinburgh Futures Institute, was a keynote speaker at Doing Data Together: Ethical Collaboration through Covid-19 and Beyond.

She said: “Public trust is one of the most vital issues for us to address. It is an invaluable and fragile thing.”


Highlighting a 2019 Scottish survey showing just 15 per cent of people trusted private sector organisations to use their personal data only for acceptable purposes, she observed: “This rose to 58 per cent for public sector organisations, but that’s a bare majority. We need much higher levels of public trust. Data-driven innovation is only going to work if the public is part of the journey.”

Vallor illustrated the fragility of data trust with three UK examples from 2020, including protests in England that algorithms had failed A-level students. “People were not identifying data-driven solutions as a source of public benefit, but as a source of injustice,” she said.

Vallor also noted that a “simple data management error” by Public Health England led to 16,000 positive Covid-19 tests being lost – as well as a possible 50,000 missed self-isolation notices – and that the England contact tracing app had been criticised for setting its exposure notification threshold incorrectly.

“These undermine public confidence in the potential for data-driven innovation to work for them.

“Mistakes happen but it’s about [having] ethical expertise, oversight and stress-testing in place to quickly detect problems and fix them.”

She also mentioned data used for political profiling, targeting and disinformation: “Cambridge Analytica [where millions of Facebook members’ personal data was used without their knowledge] was just the start.

“The public has become more aware of, and worried about, the way personal data has been used by state and commercial actors to manipulate them, spread falsehoods and groundless conspiracies, and divide them politically.”


There was also concern about algorithms that unfairly discriminate against certain groups by reproducing human biases, she added.

However, Vallor said: “There are many examples of data-driven innovation which is beneficial, helpful and improves people’s lives – but people need reassurance that there is a safety net to protect them from abuses and misuses of data.” And she said it was crucial to remember that technology and data were not neutral.

She said: “[They] can change the environment in certain ways; shift power structures, transform human habits. Technology is always a reflection of what humans value. It reflects our judgements of what is worth building, accelerating or amplifying and who is worth helping.”

Data can never be objective snapshots of reality, Vallor insisted. “They are human-curated observations and measures. There is always a lot missing from data and we need to recognise and make visible its limitations as well as its power and benefits.

“It’s never enough to say, ‘Just trust us, trust the data’. It has to be made trustworthy. That’s what it means to do data right; to put in the work, systems and expertise needed to provide that basis for public trust.”

Deploying technical acumen or computing power was never enough, she said. “We need increasingly sophisticated social, political and ethical skills and knowledge to do data right – put it in context, stress-test it, fix failures, curate it.”

Vallor went on to discuss how we could hold on to public trust around data in Scotland. She said this had to involve participation: “It must be part of [our] journey as a nation… not something imposed for the benefit of the few. What do the people of Scotland want data to do for them and what do they want to do with it?”

It also had to be just and equitable, she said. Scotland had to ensure vulnerable citizens did not bear the greatest costs of technological innovation, but received the greatest share of its benefits.


“We must avoid making a false choice between technological progress and human rights and freedoms,” she said. “If we sacrifice the latter for innovation, that’s not progress at all.”

Finally, Vallor said, data-driven innovation had to be equally ambitious and sustainable: “If we do data right, we will not pursue short-term gains that drain our long-term potential, but support social, political and material conditions to plant seeds for future generations to flourish.

“Humanity faces unprecedented challenges to sustaining the types of societies we want to share. Covid is just the beginning. Climate change is perhaps the largest challenge looming on our horizon.”

Data science and artificial intelligence are powerful new tools for addressing complex global challenges, but they must be used wisely, Vallor said.

She concluded: “Used poorly, they will obscure more than they reveal. Used irresponsibly, they will undermine public trust and degrade our environment. Used unjustly, they will deprive people of their rights and opportunities to flourish.

“So data and AI ethics, doing data right, is what ensures data and AI can do the work for us that we need them to do. Ethics is not a roadblock to innovation, it’s our way forward.”

‘You can keep technology under democratic control’

Author and comedian Timandra Harkness told the conference that it is possible to keep technology under democratic control.

She used the example of Oakland, California, where a privacy advisory commission was set up after authorities received funding for a surveillance programme, including CCTV, number plate recognition and more.


“There were good reasons for doing it – like tsunamis and earthquakes, and high crime rates – but it’s a city with a lot of political demos and many people weren’t happy and asked, ‘Have you thought about the privacy ramifications?’”

The city agreed to set up a privacy commission, to draft laws to govern the technology it could use – with citizens playing a part to ensure the tech used was proportionate and accountable.

Harkness, who was addressing the question Does Big Data Mean Big Brother?, concluded: “You don’t have to turn your back on tech but you can keep it under democratic control.”

Build trust by getting people to work together

Prof Sandy Pentland says individuals need control over their data

A leading data scientist told the conference that one way to build “data trust” was for personal information to be stored, anonymously, in data co-operatives.

Professor Sandy Pentland from Massachusetts Institute of Technology said that data co-ops could be established by specific communities to aggregate anonymous data to make better policy decisions. This could include designing transport networks or taking action to tackle poverty and inequality.

He said: “The core problem is that a small number of players [tech and social media corporations] control an enormous amount of data. People don’t trust what’s being done with it.

“The general feeling is that this new environment is not acting in the interests of citizens, but big players – so what are we going to do about that? People want control of data, and trust in what it is used for.”

Pentland argued data was now “a primary means of production”. He said a new deal on data was needed to re-balance the relationship and give individuals more power over how their data was stored and what it was used for, just as the arrival of trade unions and then credit unions had re-balanced the relationship between labour and capital.

“Data is currently too concentrated in a few hands,” he said. “Normally, we would break up the monopoly, but that doesn’t work in this case. If you break up Facebook, you would just get two Facebooks.”

Pentland suggested a new structure could be created around three key pillars – a focus on digital identity and ownership rights, accountability, and penalties for not following the rules.

He stressed data co-ops would not own the data, just hold it: “A data co-op is like a store, but it can also help a community use that data. It gives people the chance to come together and get the sort of things they want.”

This could include looking at how to deliver better education, employment or health outcomes. The building block for data co-ops could be census data, which gives details on specific neighbourhoods.

“[If] we have safe data, aggregated in the correct way, can we then add the flows? Where do people work, shop and go out? How much do they spend? Who comes into your neighbourhood for work? It’s not individual data, it’s aggregated together.

“You could use data from your telephone companies, your apps [to feed in]. You need a co-op in charge, not a government body, and you could do amazing things with all of this fine-grained data.”
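To make the idea of aggregation concrete – this is purely an illustrative sketch, not anything presented at the conference, and the neighbourhood names and field names are hypothetical – the kind of grouped “flows” Pentland describes might be computed along these lines in Python:

```python
# Illustrative only: collapse hypothetical individual journey records into
# neighbourhood-level counts, so insights come from aggregates, not individuals.
from collections import Counter

# Hypothetical individual records; in a data co-op these would never be published.
journeys = [
    {"home": "Leith", "work": "Old Town"},
    {"home": "Leith", "work": "Old Town"},
    {"home": "Portobello", "work": "Leith"},
]

# Count journeys per (home neighbourhood, work neighbourhood) pair.
flows = Counter((j["home"], j["work"]) for j in journeys)

# Suppress flows too small to release safely (threshold chosen for illustration).
MIN_COUNT = 2
published = {pair: n for pair, n in flows.items() if n >= MIN_COUNT}

for (home, work), count in published.items():
    print(f"{home} -> {work}: {count} commuters")
```

Only the grouped counts would ever leave the co-op; the individual records stay under its control.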


Pentland said the data flows would be useful for public health, including during pandemics, as you could see where people had been, and how they interacted.

Asked whether it would be challenging to overcome long-standing fears about privacy, Pentland said: “You are not sharing personal data; you are storing it, aggregating it and drawing insights from it. The way to build trust around data is to get people working together.”

