Data Capital: renowned author Stephanie Hare on identifying risks at the cutting edge of state surveillance

David Lee meets technology, politics and history researcher, broadcaster and author Stephanie Hare – known as the ‘human Swiss Army Knife’ due to her comprehensive skill set – to discuss how government uses our data and the dangers of complacency
Stephanie Hare: “I wanted to look at some of my concerns around facial recognition technology and how it has been rolled out in ways that are unaccountable democratically” Picture – Adobe.

We Begin As Data is the title of a chapter in Stephanie Hare’s book, Technology Is Not Neutral. It explores the idea that, from our very DNA, human beings are pieces of information.

“At the most basic level, I’m talking about our own genetic code,” says Hare. “How can I prove I am my parents’ child? It isn’t just that I look identical to my father. It’s about taking a blood sample from me and my dad, running a test and seeing the DNA match.”

Hare, an American who lives in London, says this goes much further: “Our fingerprints and voices, our face, our eyes, the way we walk – everything can be turned into code.”

Tech, politics and history researcher Stephanie Hare on how government uses our data and the dangers of complacency. Picture – supplied.

She extends this through to our online selves: “All our photos, chat, emails – everything has been turned into code.”

The way data is collected and used, and where the power in this data value exchange lies, fascinates and appals Hare – and was a springboard for Technology Is Not Neutral.

“When I became an independent researcher, I wanted to look at some of my concerns – particularly around facial recognition technology – and how it has been rolled out in ways that are unaccountable democratically,” she explains.

“British people haven’t really been given a say in this; it’s been imposed upon them. Parliament has done nothing to legislate on this. Yet my research colleagues here and in the US have shown how inaccurate this technology is. So you’ve got law enforcement bodies like the Metropolitan Police [and other forces] using a tool that researchers say doesn’t work well on people with darker skin, on women, older people, trans people. It works really well on white men – but that’s not good enough in a democracy.”

Facial recognition tech, and how it has been rolled out in ways that are unaccountable democratically. Picture – supplied.

This opened up some fundamental questions for Hare. Was it about making facial recognition tools more accurate, by collecting more data and refining algorithms? Or was there a deeper question here about the limits of privacy?

“Let’s imagine a world where you can accurately identify everyone the minute they step outside their house,” she says. “You’d be safe from surveillance in your house, but what about your car, your church, your psychotherapist’s office, your AA meeting? What about protected vulnerable populations? How does this work for kids?”

Hare’s thought process is grounded in her research past: “I take problems and flip them, I look at them and open them up, take them apart, rebuild them. By the time I’m done, I have a book, an essay, or a presentation.”

With facial recognition, she started writing articles and “sending up balloons” to get responses. She met with civil liberties organisations, other researchers and the Met Police.

“Of course I too want to stop crime and make sure we don’t have terror attacks – but the Met is already in special measures for its behaviour, and I don’t think the way to rebuild lost trust is to use facial recognition tools that discriminate. We had to agree to disagree.”

Hare also spoke to the Royal Society and the All-Party Parliamentary Group on AI at Westminster, and went into schools as part of her “building from the ground up” research.

She says this approach can be very different to how businesses work: “When some corporations do research, what they’re often doing is reinforcing their marketing messages. For example, if the message is ‘we need AI embedded across the entire organisation to generate value’, they will design research to support that. I’m more about seeing problems, tensions, paradoxes, and asking ‘Why?’”

Hare was well down the road with Technology Is Not Neutral when the pandemic intervened. Big government initiatives such as the Covid app reinforced her concerns about how the state uses our data.

“Matt Hancock, then UK health secretary, said it was our duty to use the app. Duty for me is a trigger word – my doctorate examined the career of a French civil servant who tried, and failed, to justify his crimes against humanity in the Second World War by citing the ‘duty to obey’ the government. His trial raised provocative questions about the duty to obey vs the duty to disobey.”

Hare explores big questions around who has power over technology and data, and how that power manifests itself – as well as what, exactly, counts as neutral.

“What’s been amazing is seeing the moment when a group of people gets it. We do a theoretical exercise of what a completely neutral technology might look like and I ask lots of questions. What electricity is being used to power it? What minerals have been mined to create it? What about the huge amounts of water used to cool the data centres where generative AI is answering our questions, and the enormous quantities of energy required to run these models?

“Since last November, the media has been going crazy about generative AI as the new hot tool, but it is literally hot – very energy-intensive when we already have a climate crisis! When new technologies come along, why aren’t we passing them through these lenses first?”

As a researcher, Hare will always ask these questions, but her approach isn’t always welcomed in tech circles.

“There is sometimes a sense that to work in tech, you have to have studied maths, physics, computing, or other hard sciences at a high level. If you come into tech, like me, with a humanities background and call yourself a tech researcher, you can feel like an interloper.

“Sometimes, I feel I have to do much more to demonstrate my value; I can’t simply say I did computer science at MIT or Stanford. So I focus on saying: ‘I will partner really well with you and I’m going to make your life easier.’ I can bring stuff that you can’t see to this work, and it will be better for it. Not everybody is convinced right away, so you have to be gritty, have perseverance and not take it personally.”

What different perspectives do you bring?

“One is thinking about how to get a group of users to use a specific tool – to see the world from where these people are, yet also understand how the engineers want to approach it, and then build a bridge.

“To do that, you need cultural sensitivity, to be willing to put the time in and build relationships and trust. Are the end-users utilising the tool in the way you assumed?

“If you’re designing something for a hospital, or an occupational therapist – in other words, not someone in a beautiful tech design room – you’ve got five to ten seconds for them to decide if they’re going to adopt this new tool or if they’re happier using the old thing that isn’t very good but is familiar. Getting people to change is hard.”

So who is Stephanie Hare? Researcher, historian, writer, broadcaster, technologist? Or all of the above?

“I never know what to say when people ask ‘What’s your core function?’ The best response I’ve come up with is ‘What is a Swiss Army Knife’s core function?’ It’s a tool you can adapt or change for specific purposes.

“So I guess I try to work like a human Swiss Army Knife and to learn like a polymath rather than specialise in one function. Why only have one core function? Why limit yourself?”

Among her many functions, Hare is proud to call herself a researcher. “To me, a researcher is one of the most powerful people because they’re asking questions and finding answers. Then they share them and submit their work to peer review, which takes courage, because you will get things wrong and have your arguments taken apart or even rejected.

“To be a good researcher, you have to be fearless – and really responsible and ethical, because research can shape other people’s thinking and decisions.”

And, Hare says, researchers must be able to challenge accepted norms. “I’m sensitive to and concerned about protecting civil liberties and human rights in liberal democracies. I think most people assume that that is baked in and it’s fine. I know it is not.”

Stephanie Hare will speak on Technology Is Not Neutral and join a panel discussion on AI at The DDI/Scotsman Conference – DataFutures, AIFutures – on Wednesday, 27 September at the Royal College of Physicians of Edinburgh. For tickets, go to the Scotsman Data Conference 2023 website.
