Photographers and artists have been fudging, faking, manipulating, and tinkering with photographs since the invention of photography.
Take Camille Silvy's image The Streets of London, made in 1859 by combining four separate negatives, only a few decades after the invention of photography and more than 130 years before Adobe Photoshop 1.0 was released.
While it required more skill than writing a prompt for a generative AI engine, the final image is as fake as anything conjured up in DALL·E 3. Had Camille Silvy entered his photomontage into a 'Photographer of the Year' competition, he might have gotten into a spot of bother.
More than a century and a half after Silvy's meddling, AI imagery is the hot topic du jour in the world of photo competitions. There are plenty of stories of AI-generated images fooling judges and winning awards, and it’s easy to see how, as some of the results can be utterly convincing.
German artist Boris Eldagsen revealed that he had won a Sony World Photography Award with an AI-generated image (see below). He pulled the wool over various eyes, and made a bit of a scene about it at the awards ceremony.
Many photo competitions have since updated their terms and conditions, and reevaluated their stance on where AI sits in the context of their agenda and, more importantly, where to draw the line.
How photo competitions are adapting to AI
For most competitions, the line sits somewhere around the acceptable use of some AI tools to modify an image in editing software such as Adobe Photoshop, typically with the caveat that you must declare this during the application process.
Many competitions will ask to see raw files before making a final decision. I'm a former creative director of Landscape Photographer of the Year, and we would do this to ensure that images hadn't been manipulated too much.
This sentiment was echoed by Dan Calder of Close-up Photographer of the Year, who says, “Close-up Photographer of the Year is first and foremost a photography competition, so we won't be accepting any images made with generative AI. During the latter rounds of judging, we will ask for raw files. And each entrant will confirm a declaration that no generative AI has been used”.
On the subject of image editing using AI tools, Calder adds, “We are now open to creative post-processing techniques as long as we are informed about what has been done and that natural behavior is not misrepresented. We will continue to draw the line at AI-generated objects, animals, flora, landscape motifs that are added to the original picture”.
On a final note, Calder concludes that “over time, if photographers wish to start incorporating AI-generated elements to their pictures, then we will follow their lead and look to create a new category for them”.
Hugo Donnithorne-Tait, Awards Director of the British Photography Awards, has some interesting thoughts too, and says, “I'm convinced AI will be a watershed moment for so many industries and the great mega-trend of our generation – that said, there are important things to protect from it. Image integrity is one of those things, especially if you run a purist photography award as we do."
"We built a secondary upload function so those who are considered for shortlisting can upload an original file or raw format file side by side with their image. While this used only to be requested when we suspected doctoring of imagery, we are going to implement it for all imagery due to AI. This will help us police the imagery and maintain the integrity of our shortlist."
"In the future, hopefully, a marriage of camera advances and blockchain technology can help us verify individual instances, creating an immutable catalogue of when moments were captured. Samsung has led the way with this to some extent. Their new smartphone (the Samsung Galaxy 24 Ultra) does have AI image editing built in, but it also has a function that 'tags' doctored images in the metadata. This is the kind of considered advancement we need, simple systems that let us see what is what”.
Authenticity is at stake
For the World Press Association (WPA), and for photojournalism in general, the stakes around integrity and authenticity are even higher. I asked Andrew Davies from the WPA about its stance on AI, and he told me, “As a starting point, we do not view AI-generated images as photographs. They can simulate the look of a photograph, but by definition, it's not the same. A photograph captures light on a sensor or film; it is a record of a physical moment. So, by definition, this is different from an image made with generative AI. Therefore, as a photography competition, we do not allow AI-generated images”.
Davies goes on to explain how the WPA approaches the prevalence of AI. “We use a multilayered approach to spot fake or manipulated images: We verify the professionality of all entrants and the facts behind each story. The images are judged by industry professionals, and the original files are examined by two independent digital analysts.”
He adds, “I think the bigger issue is around trust in general. Scenes can be staged, the framing of images can be deceptive, photo manipulation is almost as old as photography itself, and digital editing software has been around for decades. Our contest is outside the 24/7 media cycle, and we do not have to work at the speed of social media. We can take time to check each finalist image carefully”.
For Davies, the bigger question is how people can continue to trust press and documentary photography in general, outside the realm of photo competitions. Davies explains that there are some technological solutions for tracking the origin of images, but these can’t tell you about the trustworthiness of the source. The WPA helped co-create a set of principles, compatible with those followed by many news organizations, that individuals and organizations can also adopt or adapt.
Reevaluating what's acceptable
It's clear from my conversations that there's currently a lot of fuss, debate and anxiety around the evolving nature of photography competitions in the new(ish) era of AI. After speaking with a number of competition organizers, I found the consensus to be that while AI tools present new creative possibilities, safeguarding the integrity and authenticity of images remains crucial.
From established competitions like the British Photography Awards to revered institutions like the World Press Association, measures are being implemented to ensure transparency and uphold the essence of photography. As AI continues to shape the future of image-making and photography, truth and integrity will remain the guiding principles for creators and organizers alike, but, really, this has always been the case.
At the end of the day, a purely AI-generated image is not a photograph and therefore has no place in a photography competition; a photograph that's been enhanced using AI tools might, but exactly what is acceptable depends on the competition and its core values, just as it's always been.
Benedict Brain is a UK-based photographer, award-winning journalist and author. He balances his personal practice with writing about photography and running photography workshops and enrichment programmes. He writes a monthly column called The Art of Seeing, and his first book, You Will Be Able To Take Great Photos By The End of This Book, was published in 2023 by Ilex Press in the UK and by Prestel in the USA with translations in Spanish, Bulgarian and German; his second book, A Camera Bag Companion, was published in March 2024. Benedict is often seen on the panels of prestigious photo competitions, and in 2020, he founded Potato Photographer of the Year. Benedict exhibits his work internationally, and travels the world as a public speaker, talking about the art and craft of photography.