Kate royal photo: Why media and all of us need to take responsibility for information creation amid AI concerns

Emily Asgari, a PhD researcher at the University of Edinburgh, writes about the concerns surrounding information and image manipulation in the wake of the royal Mother’s Day photo

As the self-designated photographer for my own family, I know on those occasions when I need to ask my significant other to get a snap of me, there is a significant amount of set-up, test-shot taking, and “here, this is how I want you to frame it, and be sure to get my shoes”.

In the end, my spouse is pretty much just pressing a button. With Princess Catherine being the avid photographer that she is and with Prince William behind the lens, I expect he received similarly detailed instructions in the production of the royal Mother’s Day family photo.

Despite all the preparation for the perfect shot, there is usually a single image that is almost perfect. So, as many of us do, Princess Catherine whipped out a handy editing tool, Adobe Photoshop, to make it just right.

The interest in the Mother's Day royal photo is about the changing roles and duties the public plays as not just consumers, but also producers, of news, writes Emily Asgari

But there was something about this rather relatable story, beyond the speculation surrounding the Princess’s recent surgery, that sparked intense debate – debate that has much to do with the new era of AI we are now living in.

What is fascinating about this story is not what editing was or was not done, but the discussions the photograph generated about artificial intelligence and photo-editing.

Edited or manipulated images are certainly not new. Perhaps most famously, Joseph Stalin weaponised photographic manipulation to smooth his pockmarked skin and methodically erase his enemies from history by removing them from photographic records.

And this was all done well before the invention of Photoshop. While once it may have been the case that cameras, let alone editing tools, were only accessible to professional photographers, the digital transformation of society made advanced cameras and editing tools easier to use and available to everyone at a relatively low cost.

The interest in this story has less to do with the royal family or the editing incident itself. It’s really about the evolving news infrastructure, the erosion of the press as the gatekeeper of information, and the changing roles and duties the public plays as not just consumers, but also producers, of news.

Historically, news agencies would hire their own photojournalists or independent professional photographers, who were bound by a mix of contractual obligations and social trust. But there has been a shift towards reliance on user-generated content and contributions from members of the public – a symptom of all those cameras on smartphones in everyone’s pockets.

Now all those members of the public have easy access not just to Photoshop, but to generative AI tools. This means there now needs to be a shift in our shared understanding of our responsibilities, new ground rules, and a re-evaluation of whom we can trust.

The law will have a role in how this shift plays out, and new regulatory proposals, like those in the EU AI Act, may make disclosure of the use of some types of manipulation mandatory to ensure they become more readily identifiable.

What we should take away from this story is twofold: a recognition that established media need to re-evaluate their relationship with content produced by, and for, the general public; and a need for all of us to reflect on how we now share responsibility for creating and maintaining a trustworthy information ecosystem.

- Emily Asgari is a PhD researcher at Edinburgh Law School, University of Edinburgh, whose research areas include AI, intellectual property, free speech, and law and technology. She is also a California-licensed attorney specialising in intellectual property law.
