Data conference: Sampling expert opinion on future tech moves
Dr Michael Salter, from the leadership team at Childlight, the Global Child Safety Institute – launched earlier this year to tackle child sexual abuse and exploitation – told the event’s Safer Futures panel that technology had in many ways made his work much more complicated.
He explained: “The structures of the internet over the last 20 years have resulted in dramatic expansion of the scale of child sexual exploitation.
“To realise the goal of reducing child sexual abuse and exploitation, we need to co-ordinate activities and be aware where we get most impact in terms of preventative interventions.
“That requires data to [understand] the determinants of child sexual abuse – social, economic, political and technological – and what levers are available to reduce its overall prevalence.
This involves identifying what data is available and how to access it.

“Law enforcement and criminal justice data, for example, is extremely sensitive legally,” Salter said. “So are there innovative ways we can partner with law enforcement to work with that data within legal and regulatory constraints?
“The technology sector has vast amounts of data on child sexual abuse, but don’t particularly want to share because it casts their service in a negative light. How do we work with the data available to understand what’s going on online? How do we gather and map data, fill data gaps and make an impact? It’s one thing to have data resources available, it’s another to leverage them for real change.”
Salter said there was optimism around the potential of AI, specifically large language models, to help catalogue and analyse material, including instances of child sexual abuse. This raised ethical questions around the legality of collecting such material and around “controversial technology products” used to identify children, which could in turn help identify perpetrators, he said.
Alex Hutchison, director of the Data for Children Collaborative (DCC), described how his specialised unit runs data and data science projects designed to improve children’s outcomes by partnering with a variety of organisations: “We ask, ‘What are the challenges you want to try to solve that might have a data-driven solution? Where do you not understand what data to use? Where do you lack expertise to use data science to deliver improved outcomes for children?’”
DCC builds teams across academia and the public, private and third sectors to try to answer these challenges, with a focus on “the risks associated with data and the extra vulnerabilities related to children”.
Hutchison highlighted two projects, one on Covid-19’s impact on children’s access to sport in Scotland, and another on crop yields, climate, and child malnutrition in sub-Saharan Africa and Bangladesh.
Lisa Farrell, from The National Robotarium, highlighted how combining robotics with data and AI could help older people enjoy safer futures for longer.
A humanoid robot that accurately mimics Parkinson’s disease tremors is being used to train a computer model to help diagnose the condition earlier and more effectively, Farrell reported, while conversational robots are being used to support dementia patients.
She stressed: “We’re not looking to replace humans with algorithms or robots. It’s about assisting humans to give them as much information as possible to make the best decisions for the patients.”
The ethics of using data remains a challenge in human-robot interaction, but people tend to be happy to share personal data “for the greater good”, Farrell said.
Alex Reissig told the panel she and her co-founder started digital solutions firm Smplicare when her older relatives began to experience falls. The firm uses wearable technology to collect a range of data on heart rate, sleep patterns, breathing, hydration and more, to help predict when people are more at risk of falls, which lead to 220,000 visits to A&E annually. “It’s about following behaviour day in, day out,” she explained.
Reissig said some people were concerned about using personal health data from vulnerable adults, but stressed that there were “fantastic standards for how we can use data responsibly”.
She added: “Although this is a commercial research study, we decided to go through ethics approval as if it were a clinical trial, because we are stewards of very important data.”
There were also issues of older people distrusting technology, Reissig admitted. However, she said that working with partners – such as housing associations – to explain why the data is being collected could go a long way towards reducing people’s concerns.