In April 2023, photojournalist Michael Christopher Brown posted illustrations from his 90 Miles experiment on Instagram. The illustrations, made with generative artificial intelligence (AI), appear to show the realities of Cuban life that have motivated citizens to cross 90 miles (145 kilometres) of ocean from Havana to Florida.

Brown’s photojournalism pedigree is impressive. His work has appeared in publications like National Geographic, The New York Times Magazine and Time.

But this experiment met with plenty of criticism. These were some of the comments on the posts:

Photography has lost all credibility. Who is ever going to believe what they see anymore?

I respect your work in Cuba and have followed it for a while. This work makes me very nauseous and I honestly can’t explain all the reasons why.

These are terrible. I’m embarrassed for you.

Brown declined an interview, but the project website indicates he kept a running list of subjects he wanted to photograph but couldn’t, often because access was impossible. While he worked on photography projects in Cuba from 2014 to 2016, the list grew to include Cubans fleeing to the United States by water. Generative AI technology made it possible to give life to some of the images on his list.

“I think that kind of experimentation with AI was bound to happen,” said Jesse Winter, an award-winning photojournalist whose words and images have been featured in publications like The Guardian, The Narwhal and The Globe and Mail.

“Good on him for being willing to experiment,” said the Vancouver-based photographer. “But the reaction to that I think was appropriate, which is that this is bad…this is like a weird look through the looking glass warping of reality.”

Depicting truth through images is the most important part of a photojournalist’s job. Now, generative AI technology is upending the craft.

Generative AI tools like OpenAI’s DALL-E and Midjourney create imagery in response to typed prompts. They’re fun to play with but, if used to replace the work of photojournalists, could contribute to the ongoing decline in Canadians’ trust in news. There is also a risk that audiences could mistake AI-generated images for real photographs.

Understanding ethics: What are the ethics of photojournalism?

Darren Calabrese, a Halifax photojournalist and the author of Leaving Good Things Behind: Photographs of Atlantic Canada, describes photojournalists as “visual communicators of the truth.”

Calabrese’s work has appeared in The Globe and Mail, Maclean’s and the National Post, to name a few.

Not only do photojournalists understand the textbook elements of photography, they must also think about the ethical implications of their photos.

The Canadian Press Stylebook is a guide followed by a wide range of Canadian news organizations. It prohibits any photo editing beyond the basics of what can be done in a darkroom.

“Simple burning (making light portions of a print darker), dodging (making dark portions lighter), colour balancing, toning, and cropping are acceptable,” states the stylebook. “Exaggerated use of these features to add, remove or give prominence to details in the photo is not.”

Winter said photojournalists represent subjects accurately, ensure that their images aren’t exploitative or harmful and don’t stage photos.

What makes AI imagery a problem in journalism is that it’s not real, said Winter. He compares it to painting a picture of a location you’ve never been to, then claiming it’s somehow a reflection of the actual location.

“Our job is not to reflect what people already think about the world,” said Winter. “It’s to reflect new and changing perspectives of the world.”

So… how does AI work exactly?

Mark Daley became Western University’s first-ever chief AI officer in October 2023.

Daley said producing an AI-generated image begins with random noise. Picture an old analog television screen that has a snow pattern on it. When you enter your text prompt, you are applying constraints to the random noise, which creates the image.

The technology creates what Daley describes as an “image de novo,” the same way an artist would. If an artist wanted to paint a picture of a dog, they would use their basic understanding of what characteristics dogs possess. Similarly, AI imagery technology relies on information, or what Daley calls “stored statistics,” to generate an image.
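To make the idea concrete, here is a minimal sketch in Python of the loop Daley is describing. It is illustrative only and not the code of any real model: denoise_step is a hypothetical stand-in for the learned neural network an actual diffusion model uses, and prompt_embedding stands in for an encoded text prompt. The point is just the shape of the process: start from random noise and refine it, step by step, under the constraint of the prompt.

```python
# Toy sketch (assumed/illustrative, not any real model's code) of the process
# described above: start from pure random noise -- the "snow" on an old analog
# TV -- then repeatedly nudge the image toward whatever best satisfies the prompt.

import numpy as np

def denoise_step(image, prompt_embedding):
    # Hypothetical stand-in for the learned network a real diffusion model uses:
    # it predicts and removes a little noise, guided by the prompt. Here we simply
    # blend the image toward a target to show the loop's shape.
    return image + 0.05 * (prompt_embedding - image)

rng = np.random.default_rng(0)
image = rng.normal(size=(64, 64, 3))             # pure random noise
prompt_embedding = rng.normal(size=(64, 64, 3))  # stand-in for an encoded text prompt

for _ in range(50):                              # many small refinement steps
    image = denoise_step(image, prompt_embedding)
```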

Early iterations of generative AI imagery tools such as Stable Diffusion and Midjourney weren’t very good, said Daley, but they soon improved drastically.

“The improvement and level of refinement with newer and newer models has been amazing,” said Daley, “and so, it just gets better and better at deciding how close does this image look to this text.”

‘I don’t even want to know’

When asked about AI imagery tools, Calabrese said, “I don’t even want to know.”

“I don’t want to be associated with them. I can’t.”

If AI imagery is used to accompany journalistic writing, some audiences might start to doubt whether the photos are real. But others might not notice the difference at all.

Without it being clear to the reader that a human was there to witness and photograph the scene, said Calabrese, it’s easier for someone to argue that a photo is untrue.

As Winter said, this is the value photojournalists bring to the table – they document real stories in real time.

“Without trying to sound overly dramatic or hyperbolic,” said Calabrese, “those imagery machines have begun a really scary thing by eroding our trust and understanding of what a photograph is.”

Erosion of trust and media literacy

The 2023 Reuters Institute Digital News Report said that trust in news overall is falling among English-speaking Canadians.

The report indicates 40 per cent of Canadians said they trusted “most news, most of the time.” That number was down 15 percentage points from 2016, when 55 per cent of Canadians indicated they trusted “most news, most of the time.”

The report also found people are still concerned about identifying what’s real, and what isn’t, online. The proportion worried about fake digital news rose two percentage points from the previous year, to 56 per cent.

That erosion of trust means photojournalists are more important than ever, said Daley.

“In a sea of deep fakes and fake media, you have a small cohort of people whose entire livelihood and professional ethics is ‘I was in this place, and I took this picture, and this really happened.’”

Newsrooms address AI

Some news organizations, such as the Toronto Star and The Globe and Mail, have communicated directly to their readers about AI and journalism. 

In March 2023, the Toronto Star’s public editor, Donovan Vincent, published a column on the subject. The Globe and Mail published a similar note a few months later.

Both outlets said AI imagery would not be used for news stories. The Star said the new technology might come in handy down the road for feature-type stories. The Globe said AI tools may be used for feature images, but in such instances the images would be labelled “AI-generated image” or “AI-generated illustration.”

Matt Frehner, who issued The Globe’s note, is the publication’s head of visual journalism.

“The contract we have with our readers when they subscribe,” said Frehner, “is what they’re consuming is true.”

Frehner’s team recently published an AI quiz asking readers to spot which image or text was generated by AI.

“Part of our job is to educate people on every topic, right? And so, with something that’s as new as AI, yeah, you can’t expect people who aren’t conversant in the tools and technology to understand what’s possible,” said Frehner.

Journalism has a crisis of trust right now, said Winter. And yes, AI is going to make it worse.

“There was a time when the expectation was that if you were a journalist, we didn’t have to argue for why we deserved trust. We just sort of enjoyed it. And I think that time is over.”

Future photojournalists and photojournalism’s future

There aren’t many newsrooms that employ full-time photojournalists. Most rely on freelancers and handout photos.

“If you look at all your major publications, they’re down to just a handful of photojournalists, if that,” said Randy Kitt, director of media for Unifor, a Canadian union that represents workers at the Toronto Star, SaltWire and The Globe and Mail.

“It’s precarious work, and even our most prestigious, award-winning, amazing, brilliant photojournalists in Canada are struggling to make a living in photojournalism because of the gig economy and the financial pressures put on our publishers.”

Calabrese hopes the conversation around AI imagery and photography spurs newsroom managers to understand the value of using humans to share the stories they publish.

Generative AI will shape the careers of future photojournalists, too.

Winter, who teaches photojournalism at Langara College in Vancouver, said that rather than warning people away from generative AI imagery tools, his approach is to evangelize photojournalism as a storytelling tool that is critical for news outlets.

“If they understand all of that, and they’ve heard my sermon and they agree with me, then I sort of don’t have to worry about AI, because they will view it the same way I do, which is that it’s a non-starter,” Winter said. “It’s not real. It’s not what we do.”
