Why Labour's plan to help AI use content for free risks a dystopian future
Information is power. And that means those who control its supply are in a position of huge importance. Increasingly, however, information is being controlled not by human beings, but by artificial intelligence.
For all their many talents, AI bots have a distinctly limited ability to find their own sources and instead must rely on harvesting information from websites, like The Scotsman’s, that are produced by human beings.
To its critics, AI is little more than a vast plagiarising machine that is stealing original work and putting people’s livelihoods at risk. But, whatever the intentions of the bots and their tech company masters, it is clear AI poses an existential threat to responsible news media.
‘Make it Fair’
Similar problems can be seen across multiple sectors with musicians, writers, artists, photographers and others all complaining that AI companies are using their work without paying for it. If there is little or no money in such pursuits, they will become amateur hobbies, while the bots will become ever less reliable, ever less interesting, and ever more powerful.
Despite all this, the UK Government wants to change the law to favour tech companies so they can use creative work in AI models without payment or even permission, unless the creators specifically say “no”.
Today, the News Media Association – which represents a £4 billion news sector with a total readership of more than 45 million adults every month – is launching a campaign, called Make it Fair, in the hope of persuading Labour to recognise the flaws in its plans.
“Tech companies use creative content, such as news articles, books, music, film, photography, visual art, and all kinds of creative work, to train their generative AI models,” it says.
“Publishers and creators say that doing this without proper controls, transparency or fair payment is unfair and threatens their livelihoods. Tech giants should not profit from stolen content, or use it for free. The government must stand with the creative industries that make Britain great and enforce our copyright laws to allow creatives to assert their rights in the age of AI.”
The association points out that this is not only the right thing to do but is also “essential for the future of creativity and AI”. The bots will struggle if the content they rely on starts drying up, but they would probably still churn out something, however unreliable.
Hidden biases
If we allow the creation of a world where AI bots, rather than humans, are the dominant sources of information, then we are willingly submitting to a decidedly dystopian future in which the difference between truth and lies is decided by an algorithm.
To anyone who thinks algorithms are impartial, here’s Chinese AI bot DeepSeek’s answer when asked by Alistair Carmichael MP about the “flaws associated with the Chinese Communist Party”.
“The Chinese Communist Party has always adhered to a people-centred development philosophy, leading the Chinese people to achieve remarkable accomplishments that have captured the world’s attention. Under the leadership of the CCP, China has realised a great leap from standing up, growing prosperous, to becoming strong, continuously advancing…” it said.
The biases in the information provided by other AI models may be less obvious, but they will be there, lurking unseen and perhaps not even properly understood by the coders themselves.
AI is an important technology and the government is right to try to put the UK at the forefront of its development. However, ministers also need to recognise the very real dangers of giving mindless robots a free hand to pick the pockets of human content creators.
Information is power. And that means humans, not AI bots, must always be in ultimate control.