Journalism is battling for space and trust in an information ecosystem polluted by misinformation.
The blame can be shared widely. There are cynical state actors, unscrupulous politicians, unregulated technology companies, pranksters, bad journalists and careless readers, among others. A new category of billionaire super-spreader has also recently emerged.
The statutes that govern AFP demand that our journalism be accurate, impartial and trustworthy. In short, we must always strive to tell the truth. The drafters of the 1950s-era text could never have imagined the agency would now employ 140 journalists working entirely on lies and manipulation. Such a scenario would have been unimaginable just a decade ago when I was appointed editor-in-chief. But this is where we are in 2023.
When our journalists report now, we know there will be people looking to undermine and manipulate our work. The pressure to get it 100 percent right has never been higher because we know that any slip or error, no matter how innocent, will be ruthlessly exploited. The stakes are raised because the counter-attack is often looming in the mind of the journalist as they report the story.
When three AFP journalists stumbled upon a line of crumpled bodies strewn along Yablonska Street in the Ukrainian town of Bucha on April 2 last year, they knew they were seeing something of immense significance. They knew they were witnessing the aftermath of a possible war crime. And they knew that every word and image would be picked apart in an attempt to discredit their findings. Reporter Danny Kemp found himself repeatedly walking up and down the road counting the bodies, fearful he would miss one or miscount. In the end he took pictures of each body on his phone just to be certain before filing his first news flash. He also wanted to be 100 percent sure how each body looked: which ones had their hands tied behind their back, what type of clothes each was wearing and where the bullet holes were. He needed hard evidence and metadata to prove he was not a liar.
And sure enough the counter-attack came quickly. Posts on social networks from the Russian authorities claimed the Bucha scene was staged. They produced slowed-down video to make it look like two of the bodies were moving. The motion was in fact the effect of a raindrop on the wing mirror of a vehicle from which the video was filmed. Satellite images proved the bodies had been lying on the streets for weeks and dated back to the Russian occupation of the town. Our digital verification team in the office was able to combine with our team on the ground to comprehensively debunk the claims. We published a fact-check, added context to our main Bucha stories and wrote an analysis about Russia’s attempts to distort the narrative.
Bucha was relatively easy to debunk because our journalists were direct witnesses, but we should not kid ourselves that this kind of crude manipulation does not find a sizeable audience. In France, for example, we saw it strike a chord with the large population, including the anti-vaccine communities, that has turned its back on mainstream media. But we also saw it gain traction in parts of Asia, Africa and Latin America where the view of Russia's actions in Ukraine is more benign and where Kremlin disinformation campaigns have sown the seeds of doubt and division over many years. The reality for us now is that large parts of the planet simply do not believe in our journalism regardless of our transparency, sourcing and devotion to facts. This lack of trust in our work is our biggest challenge.
At AFP we felt the first ripple of the current misinformation wave around 2012 when unsourced and unvetted video content from the Syrian conflict began filtering into mainstream media. We were obliged to create a team of digital sleuths to sort fact from fiction. We started to better understand how very crude manipulation of images, including our own, could have a devastating impact on victims and on public narrative. The smear campaign against the White Helmets, the rescuers who pulled people from bombed buildings, was a powerful example. The coverage of the chemical weapons attacks on rebel areas was another example where lies and misinformation could sow enough doubt or confusion to stall concrete action.
But it was in 2016 that the “fake news” debate exploded into public consciousness with the election of Rodrigo Duterte in the Philippines and Donald Trump in the United States on the back of cynical disinformation campaigns on Facebook. It was also the year of the Brexit referendum in the United Kingdom. Since then it has felt as if misinformation has played a significant role in virtually every major story we have covered. During the pandemic years, there were clearly many moments when the battle of Truth vs Lies or Fact vs Opinion was the only story in town.
AFP and many other media responded robustly to the shock of 2016. There was a doubling down on the fundamentals of impartial journalism, the creation of fact-checking teams and disinformation correspondents, and much greater attention to media literacy campaigns. The public is much better informed, but the public remains distrustful. AFP and many Philippine media reported deeply on the massive disinformation smokescreen deployed around last year’s presidential campaign of Ferdinand Marcos Jr to wipe out the historical crimes of his family. Despite much excellent journalism, Marcos was elected by a landslide.
The challenges continue to pile up. For some years now experts have been warning about the lethal potential of "deepfake" synthetic media. We have seen Twitter, a vital journalistic tool where so many individuals, companies and institutions communicate, junk its content moderation policies with predictable results. And now we have the seductive power of generative AI. We know that tools such as ChatGPT are not programmed with an ethical commitment to tell the truth. They can be used by bad actors to generate all kinds of fake information at scale. They can also be used by well-meaning but careless users to spread errors and confusion. Another real concern is the roll-out of AI text-to-image technology. These tools will not aid journalism's ongoing battle to win back trust. As the 2022 Reuters Digital News Report noted, only around four in 10 people globally have broad confidence in our work. In these circumstances, it is not overblown to ask the question: is fact-driven journalism facing an existential threat?