Deepfakes pose 'security risks' to Scottish Parliament that 'threaten trust in democracy'
Holyrood officials have been warned over potential “security risks that could threaten public trust in democracy” amid fears Scottish Parliament TV could be harnessed for AI deepfakes.
Researchers have concluded the “ramifications of ‘chamberfakes’ are potentially severe”, but have voiced confidence that Scottish Parliament officials can counter any threats.
The Scottish Parliament commissioned researchers at the Scottish Centre for Crime & Justice Research and the University of Edinburgh to evaluate the potential threat of artificial intelligence (AI) technologies to parliamentary business and to investigate how deepfake videos, which one participant dubbed ‘chamberfakes’, could undermine the integrity of Parliament TV.
The researchers are calling on the Scottish Parliament to put a formal process in place to respond to deepfake threats and allocate more staff to manage the risks of attacks.
Deepfakes are videos, images or audio files altered using AI to distort someone's words or actions, and advancing technology has made convincing versions easier to create.
Several MSPs, including former first minister Humza Yousaf and Greens politicians Maggie Chapman and Patrick Harvie, have been the subject of deepfake postings on social media.
The research team identified three main risks: the video livestream being hacked, the dissemination of deepfakes on social media, and the many hours of parliamentary footage being used as feedstock to create harmful and abusive deepfakes of MSPs.
Report co-author Dr Ben Collier, of the Scottish Centre for Crime & Justice Research and the University of Edinburgh, said Holyrood’s decision “to experiment with live streaming” had also “introduced new security risks that could threaten public trust in democracy”.
He said: “While the risk of a widespread attack is technically low, we have seen one-off examples of Scottish politicians falling victim to these kinds of attacks, such as Maggie Chapman MSP, who was the subject of a deepfake audio clip posted on X that parodied chamber business.
“The ramifications of ‘chamberfakes’ are potentially severe, but we are optimistic about the Scottish Parliament’s ability to respond to these sorts of attacks, in large part due to the skilled broadcasting team who have technical processes and human checks already in place.”
Co-author Dr Morgan Currie said “at the moment the Scottish Parliament has no formal processes in place to respond to deepfake threats”.
Dr Currie added: “We recommend a formal intervention plan be put in place, and a reporting procedure for deepfakes and misinformation introduced and managed by a specific member of staff.
“Consideration should also be given to developing simple ways of authenticating witnesses and other contributors who dial in remotely to give evidence to parliamentary committees.”
Ms Chapman welcomed the research and the fact that Holyrood officials were “taking this technology seriously”.
She said: "The video that was made of me may have looked like me and sounded like me, but it was clearly designed to mock me rather than to make people think it was real.
"The truth is that we are still in the early days of what could be a very dangerous and disruptive technology. We don't know what its capabilities will be in a few years’ time, but we do know that it could be used to spread distrust, promote and share malicious and purposefully misleading content and undermine our democracy."
A Scottish Parliament spokesperson said: “As part of our risk management, we commissioned the University of Edinburgh to explore the risks of deepfake technology to the Scottish Parliament’s video coverage.
“Its report shows we already have good measures in place. We’ll give careful thought to its wider recommendations.”