Conference report: Taking gender pay gap to task

Data has the power to change the world for the better by “cutting through platitudes” and holding organisations to account for their public statements on diversity and equality.

Francesca Lawson and Ali Fensome, of PayGapApp, told the conference how they shone a light on the gender pay gap on International Women’s Day (IWD) on 8 March this year.

The pair designed the PayGapApp bot, which identified tweets using key IWD phrases on Twitter that day and automatically responded with the tweeting organisation's gender pay gap, drawn from publicly available data.
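In outline, a bot like this needs only two public pieces: the government's pay gap dataset and the Twitter API. Below is a minimal sketch of that idea in Python, assuming the tweepy library and a CSV download of the official data; the filename, column names, search phrase and response wording are all illustrative assumptions, not PayGapApp's actual code.

```python
# A minimal sketch of the idea, not PayGapApp's actual code. Assumes the
# tweepy library and a CSV download of the UK government's pay gap data;
# the filename, column names, query and wording are illustrative.
import csv
import tweepy

# Map employer name -> median hourly pay gap (%) from the public dataset.
pay_gaps = {}
with open("gender_pay_gap_2020_21.csv", newline="") as f:
    for row in csv.DictReader(f):
        pay_gaps[row["EmployerName"].lower()] = float(row["DiffMedianHourlyPercent"])

client = tweepy.Client(
    bearer_token="...",          # API credentials elided
    consumer_key="...", consumer_secret="...",
    access_token="...", access_token_secret="...",
)

# Look for recent tweets using a typical IWD phrase.
results = client.search_recent_tweets(
    query='"International Women\'s Day" -is:retweet',
    expansions=["author_id"],
)
users = {u.id: u for u in (results.includes or {}).get("users", [])}

for tweet in results.data or []:
    author = users.get(tweet.author_id)
    gap = pay_gaps.get(author.name.lower()) if author else None
    if gap is not None and gap > 0:
        # Respond with the organisation's own reported figure.
        client.create_tweet(
            text=f"In this organisation, women's median hourly pay is "
                 f"{gap}% lower than men's.",
            in_reply_to_tweet_id=tweet.id,
        )
```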


“It allowed members of the public to see the truth and decide what they think – whether organisations were being honest and taking [genuine] responsibility for equality,” said Lawson.

Francesca Lawson and Ali Fensome. Image: Lisa Ferguson

“One repeated statement on IWD was ‘Here’s to strong women. May you know them, be them, and raise them’,” said Lawson.

“It’s a nice sentiment, but when it’s the only thing you’re hearing, it obscures the real picture. It’s just words which don’t reflect the reality we live in.”

That “reality” is that almost 80 per cent of organisations reported that women’s median hourly pay was less than men’s in 2020-21. All organisations employing 250 or more people have to report their gender pay gap, but the reality revealed by PayGapApp was “news to a lot of people,” said Fensome, who built the technology, with Lawson writing the text.
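The headline figure employers report is, broadly, the difference between men’s and women’s median hourly pay, expressed as a percentage of men’s. A minimal illustration with invented numbers:

```python
# Illustrative only: how the reported "median pay gap" figure is defined.
# A positive percentage means women's median hourly pay is below men's.
from statistics import median

men = [9.50, 12.00, 14.25, 21.00, 30.00]     # sample hourly rates (made up)
women = [9.00, 11.00, 13.50, 18.00, 24.00]

gap = (median(men) - median(women)) / median(men) * 100
print(f"Median gender pay gap: {gap:.1f}%")  # 5.3% in this example
```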

“[The information] was on the UK Government website, but not many people were looking at it,” Fensome told the event. “The data was publicly available, but not in the public eye.

“We challenged those warm, fuzzy sentiments to make people sit up and look at their own data and pay gap. Data cuts straight through the cut-and-paste platitudes – all too often, employers’ attitudes did not match their words.”

The PayGapApp Twitter account received 135 million impressions in one month around International Women’s Day, with high-profile endorsements from people like writer Caitlin Moran, who described it as “deadly genius”.

Lawson said she thought the app had been so successful because of its simplicity.


“There is no emotion in it, we are just churning out the data. People react and attach their own emotions to it, and that’s when it becomes a story. We were not trying to shame companies or get them to stop communicating. It was about transparency, accountability and responsibility for employers – about using real data to inform their actions.”

So what next? “It’s about trying to keep the data alive,” said Fensome. “There are many other inequalities we can’t see based on the government data, and we’d like to see improvements to the government reporting requirements.”

Lawson added that some of the organisations whose pay gaps were highlighted on IWD had come out fighting, while others – like English Heritage – had explained the practical steps they were taking to tackle their pay gaps.

She expressed the hope that organisations would genuinely try to make changes, but recognised it would be a couple of years before any real impact of those efforts could be seen.

Lawson and Fensome took part in a panel discussion on using data and artificial intelligence to shine a light on inequality and bias. They were joined by Gavin Abercrombie, of Heriot-Watt University, who described his work on the use of feminised names and voices in conversational AI systems – such as Apple’s Siri or Amazon’s Alexa – and whether there is a connection between those design choices and sexist abuse online.

“When we analyse conversational systems’ data, we find that much of the language is sexist,” Abercrombie said. However, he stressed that it was not always easy to detect unacceptable language, or to agree what might – or might not – constitute online abuse.


A forthcoming project of his will look at how to involve a more diverse group of people, including victims of gender-based violence, in constructing data sets to better detect and mitigate online abuse.

One challenge identified by Abercrombie is that the people creating these systems are often “invisible workers”, with little information available about the diversity of those having an input. He also said it was difficult to access significant quantities of real-life conversational data, as organisations are often reluctant to share it.


Lucy Havens, a PhD student at the University of Edinburgh, described her work on identifying and classifying bias in cultural heritage data – including hate speech and abusive language that would now be considered offensive.

She is working on a methodology, and labelling data sets manually, in order to develop an AI model to identify different types of gender bias in historical cultural catalogues.
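In outline, that kind of workflow pairs hand-labelled examples with a standard text classifier. The sketch below is a generic illustration of the approach, not Havens’ actual pipeline; the label scheme and catalogue snippets are invented.

```python
# Generic sketch of training a classifier on manually labelled catalogue
# descriptions -- not Havens' actual methodology. Labels and texts are
# invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labelled examples: description text -> bias category.
texts = [
    "Papers of Mr J. Smith, engineer, and his wife",
    "Letters from the explorer's devoted spinster sister",
    "Minutes of the society's annual general meeting",
]
labels = ["omission", "stereotype", "none"]  # illustrative label scheme

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Flag a new catalogue entry with the most likely bias category.
print(model.predict(["Diaries of the colonel and his lady wife"]))
```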

The “messiness” of historic data was a challenge, said Havens: “It’s never as clean as you hope it would be, and that limits analysis.”
