Written by Richard Lachman, Ryerson University. Photo credit: Shutterstock. Originally published in The Conversation.
Social media has allowed fake news about the Ukraine invasion to proliferate.
As the story goes, in the 1780s, a former lover of the Empress of Russia wanted to impress her with his efforts to build an empire in what would later become part of Ukraine. Grigory Potemkin had workers build a façade of a prosperous village along the riverbanks, visible from passing boats, disassembling and reassembling it further upriver as Catherine the Great sailed by.
A “Potemkin village” has become shorthand for a false veneer designed to hide the truth, but historians tell us the original story doesn’t hold up to scrutiny. In a sense, it’s fake news, 1700s style.
The region is once again the subject of a false front. Social media platforms shield falsehoods behind the trappings of “authenticity,” a problem made starkly visible by the flood of content about the Russian invasion of Ukraine. And just like Potemkin’s villages, if we don’t examine what lies behind these façades, we risk missing the truth.
Videos circulating on TikTok show people fleeing and soldiers fighting to the sound of gunfire, yet over 13,000 of these videos were later found to pair exactly the same audio with different visuals. In another example, 20 million people viewed footage of a paratrooper supposedly filmed during the conflict, only for a reporter to find it had originally been posted in 2016.
Here's a good example of war misinfo that's plaguing TikTok right now.
This video of a parachuting soldier has 20 million views on TikTok.
The top comment? "Bro is recording an invasion."
But he isn't. This video is from 2016. pic.twitter.com/6WsjpWOLVI
— Ben Collins (@oneunderscore__) February 24, 2022
A video clip showing a Top Gun-style aerial dogfight went viral, with over two million views less than three weeks after it was posted. In it, a hotshot Ukrainian pilot nicknamed “The Ghost of Kyiv” in a MiG-29 shoots down a Russian Su-35. According to PolitiFact.com, a non-profit fact-checking project by the Poynter Institute, the clip was from a free online videogame called Digital Combat Simulator.
At the same time as falsehoods spread behind the façade of authenticity, social media is being used to tell stories from ground zero. This content empowers those affected by the conflict to tell stories from their perspective, without the clipped tones of a news anchor.
Ukrainians listening to bombs fall; a child singing Disney songs in a bunker; a soldier in full battle armour moonwalking to “Smooth Criminal”; a teenager drying her hair in a bomb shelter.
The pursuit of authenticity
There is a value placed on authenticity, and the characteristics of amateur videos posted online present like the unfiltered truth: shaky cameras, bad lighting, patchy audio. These traits, which can be the hallmark of a real dispatch from the front, also make them easy to simulate.
Media literacy programs teach all of us how to identify and combat fake news online. Responsible social media users are supposed to check sources, search for corroboration from trusted parties, check time-stamps and assess whether the content is too good, or too bad, to be true.
But the design of social media platforms discourages these behaviours. TikTok, Instagram Reels, Snapchat Spotlight and YouTube Shorts favour ultra-short videos that don’t lend themselves to deep engagement: we watch, experience a few seconds of emotional impact and keep scrolling. Yet these same platforms are where news now circulates, and where people turn for information about the Russian invasion.
Holding on to attention
Social media sites encourage sharing and re-posting, which means the original source of a clip is hard to track down. The platforms are designed to keep users on-site and in front of advertisers for as long as possible. Opening extra tabs to cross-check information is just not part of the experience, which helps false information spread.
This in turn leads to another danger: that we start to doubt everything we see, convinced that it is all opinion, bias or simply someone’s point of view. Both credulity and blanket doubt are dangerous to the functioning of civil society.
So what can be done? The platforms need more human moderators who can take down demonstrably false or harmful content quickly. And as crises erupt around the world, those moderators will need regional knowledge and language expertise.
While this will be more expensive than the algorithmic approaches the platforms prefer, it will need to become part of the cost of doing business. We need governments to collaborate in establishing regulations, fines and other forms of accountability at a scale that forces the platforms to change.
Media literacy programs need to teach a healthy dose of skepticism to audiences of all ages, yet they also need to make clear that doubting everything can be just as dangerous.
To help social media fulfil the promise of the early internet as a pro-social communications tool, one that brings us together and lets us share our individual stories, we need governments, companies and individuals to take responsibility.
If we want to see the truth behind the Potemkin village, we can’t keep sailing past: we have to slow down and look at things more closely.