How hard is it to tell ‘fake’ information from the real thing?
Alessandra Galloni, Reuters’ managing editor for global news planning and creation, took up just that question as the moderator of a panel called Misinformation in Markets and Media, held Tuesday afternoon at the World Economic Forum in Davos, Switzerland.
Galloni noted that video of the Ukrainian passenger jet downed by Iranian missiles came from someone who posted it to YouTube. In such a case, it falls to news organizations such as Reuters to determine if the footage is genuine. But to show how difficult that can be, Galloni showed the audience a few videos. One showed a tanker blast, supposedly captured by a vehicle’s dashcam. The audience was about evenly split as to whether the video was genuine, which it was. Another video purported to show an Indian air force strike against a suspected terrorist hideout. The audience largely believed this one to be authentic, but it actually came from a video game.
Even in the short time since these clips were assembled, said Galloni, “the technology has moved even faster in allowing the modification of videos.” The themes of the speed of technological change — and the inability of regulators or self-regulation to properly address it — would recur throughout the panel.
Panelist Rasmus Kleis Nielsen, Director of the Reuters Institute, noted that some of our fundamental problems concerning information will never be solved, because powerful people will always lie and we'll always disagree over big grey areas of fact and fiction. But there's another, deeper issue, he said. “Fundamentally the problem we face is essentially, people don’t trust institutions, and sometimes seem to be right not to trust them,” he said, providing as examples the “deliberate denial and man-made obfuscation around climate change and misinformation in the run-up to the Iraq war.”
“From the point of view of the elite,” Kleis Nielsen continued, “the problem is bad actors doing bad things. From the point of view of the public, much of the public has little or no confidence in established institutions or established media.” The solution is to rebuild trust in our public institutions and to support independent journalists, he added.
Helen Clark, the former Prime Minister of New Zealand, agreed, calling for continued investment in public broadcasters, and asking if educational systems have a larger role to play.
Another panelist, Hao Li, associate professor of computer science at the University of Southern California, spoke convincingly about how easy it is to manipulate video images such as those that began the session. Since 2017, Li said, the open source code to change one’s appearance in a video has been widely available, with limited resources required: a standard gamer PC with a “decent” CPU [processor], which costs about $1,500, and a few thousand images of the person you’re trying to impersonate.
The impact of social media
The panel also discussed two particular examples of the failure of social media to curtail harmful content. One involved India’s Yes Bank, which in September was targeted by a social media campaign claiming the bank was about to shut down. “In a situation like this, where there is a huge amount of information available on social media, no one can edit it,” said panelist Rajnish Kumar, Chairman of the State Bank of India, adding that machine learning or artificial intelligence could take the place of a human editor.
A similar situation developed around the tragic shooting in Christchurch, New Zealand, in which the killer had declared his intentions on Twitter and video of the massacre spread quickly on YouTube. “The traditional regulatory mechanisms don’t adapt to this world at all,” said former Prime Minister Clark. Facebook, she said, claimed no one had ever tried to post such a thing before — so therefore, the site was unprepared. Indeed, what happened at Christchurch launched “an international call for how to combat terrorism on media,” she noted, adding that despite that “we always seem to be playing catch up for what it’s possible to do.”
Most of the panelists, like Kumar, were in favor of some sort of technological fix. Clark said that traditionally, New Zealand broadcasters have been subject to a three-second delay. She suggested that a similar delay could be applied to livestreaming. “Maybe there should be time for some sort of review mechanism to kick in,” she said.
Jimmy Wales, the founder of Wikipedia, agreed. “I can see that a time-delay could be incredibly helpful without imposing undue burdens on someone’s freedom of expression,” Wales said. But he noted that WhatsApp, which came in for particular criticism, is end-to-end encrypted, so the app doesn’t have an easy way of knowing what people are sharing.
Wales contended that Wikipedia doesn’t have that problem, because “everyone can edit it — our community of editors is quite vigilant.” On Wikipedia, he said, fake news is fairly quickly shut down.
The Reuters Institute’s Kleis Nielsen acknowledged the point, but said editing was not the answer in itself. “The response is not to remove everything that is wrong,” he said. “It is to provide people with a trustworthy alternative.”