ChatGPT right to warn about 'perilous repercussions' of AI 'hallucinations' – Scotsman comment
Cambridge Dictionary’s word of the year, hallucinate, is a reminder that among the problems with artificial intelligence is that, sometimes, a bit like humans, it can ‘tell lies’ – or at least produce false information. AI hallucinations, also known as confabulations, range from the nonsensical to, worryingly, the eminently believable.
The Scotsman decided to seek expert opinion on the subject, turning to AI program ChatGPT to write its version of our editorial comment. The resulting article, available at Scotsman.com, was good enough to raise the prospect of a future transformation into The Scotsbot, although there might have been a hint of AI-bias.
“Generative AI systems represent an incredible leap in technological prowess,” it cooed, before adding that the “marvel of these systems conceals an increasingly concerning issue: AI hallucinations”. The conclusion took it for granted that we live in an “increasingly AI-mediated world”, but ChatGPT did warn that failure to “mitigate the perilous repercussions of AI hallucinations” would risk “undermining the very fabric of truth and trust”.
We couldn’t have said it better? Oh dear. No! We definitely could have…