Friendly Sirens and Deadly Shores
By Muhammad Idrees Ahmad
As the US prepares for another high-stakes election, the outcome is once again likely to be influenced by a third party: Russia. But only if the electorate cooperates.
Ahead of the 2016 election there were frequent mentions of Russian interference, but its possible impact was generally dismissed. Democrats were convinced their candidate would win; Republicans, resigned to the same outcome, treated Russia as a side issue. The result jolted everyone. Because of this, no one is discounting the threat this time. But the underlying vulnerabilities that helped foreign actors succeed have only deepened. There is now greater awareness of Russian tactics, but a stronger resolve will be needed to resist them.
In the myriad investigations, few stones have been left unturned regarding the methods and scope of Russia’s intervention. But while Russia has shown ingenuity in using digital propaganda, its success derives less from methodological sophistication than from structural vulnerabilities. To have any hope of countering Russian “active measures”, it is therefore important to understand not just the dissemination of propaganda but also its reception. Propaganda, ultimately, is a cooperative enterprise. It feeds on existing biases. It requires both an active audience, which already shares the propagandist’s assumptions, and a larger, passive audience, which imbibes it on the strength of the legitimacy the active audience accords it. People are susceptible to propaganda because it offers affective rewards and reduces cognitive labor. That is why any discussion of how it functions needs to begin with why it works.
- Disinformation depends on humans being human. Because of the way the human brain has evolved, its primitive, reflexive systems respond to information more rapidly than its evolved, reflective parts. Intuition precedes deliberation. This makes belief effortless and skepticism laborious. Successful propaganda deploys affective triggers to short-circuit the brain’s reflective systems. It uses emotive language and evocative symbols to confirm our biases, eliciting reflexive responses and preempting careful deliberation. In this manner, disinformation hijacks human autonomy: it inspires or inhibits action in accord with the propagandist’s aims. Its deceptions are aimed at getting people to act differently than they otherwise would. In a time of heightened emotions, Republicans may be encouraged to promote #Pizzagate or #QAnon, and Democrats may be induced to adopt my-candidate-or-bust positions.
- Disinformation feeds on discontent. The renowned propaganda theorist Jacques Ellul drew a distinction between sociological and political propaganda. Our worldviews are functions of our sociology: we are conditioned by family, class, community, and education. This shapes the parameters of our political beliefs. Political propaganda is effective when it works within those parameters. We are more receptive to messages that resonate with our beliefs, and our fears are more easily provoked than our hopes. This is why racial anxieties and fear of immigrants are fertile ground for manipulation. The aim of computational propaganda is to manipulate us through our existing concerns. That is why the psychometric profiles created by Cambridge Analytica were so insidious: they could target individuals according to their anxieties, fears, and prejudices. Facebook’s advertising features already allow a significant degree of micro-targeting. This capability was used in the 2016 election cycle to depress the non-white vote by playing on voters’ disenchantment with the political system.
- Disinformation exploits social networks. Disinformation spreads faster than information, and computational propaganda has made such manipulation easier. Disinformation cascades through our social networks and snowballs into an unruly political force. It deepens polarisation because people are isolated in circles of affirmation, reinforcing each other’s beliefs and resisting contrary opinion. People aged 65 and older are particularly prone to sharing fake news, but younger people have also shown an inability to distinguish news from nonsense.
Once we understand why disinformation works, its means and motivations become less mystifying.
- Disinformation manufactures doubt. The manufacture of doubt has been a standard feature of propaganda since the tobacco industry first used it to thwart government regulation. Russia has given it institutional force, from the presidency down to the national broadcaster, erecting a phalanx of lies around every inconvenient fact. Russia Today (RT) uses the motto “Question More” to blur the distinction between scepticism and cynicism, encouraging viewers to believe nothing instead. It is a spur to motivated reasoning, the cognitive bias that makes us demand a higher threshold of evidence for claims that go against our intuitions. Doubt is politically immobilising; it induces inertia. Manufactured doubt is distinct from the legitimate scepticism that inhibits unqualified belief. It is functional: it relativizes truth and encourages cynicism. Such doubts have been useful in suppressing voter turnout; they have also created distrust of the media, leaving audiences susceptible to “alternative” propaganda.
- Disinformation fogs the past and disrupts the future. Disinformation smokescreens hostile intent. It ties up the public in tangential debates while its sponsors proceed unimpeded on their subversive course. Energy companies have used residual doubts about anthropogenic climate change to thwart environmental regulation. Russia has created doubt about chemical attacks by its Syrian ally to buy impunity for further violations. And to conceal its massive interference in US elections, Russia has exploited the notion of #Russiagate (a mocking reference to concerns about Russian interference) to tie up America in a debate over whether the interference even happened. While people are busy litigating the past, Russia proceeds unimpeded with another intervention.
- Disinformation can be lethal. In Syria, Russia deployed disinformation for the specific purpose of justifying war crimes. It inverted reality, painting rescuers and medics as terrorists, to justify its systematic attacks on first responders. Disinformation can kill at home, too, as we learned from the #Pizzagate conspiracy theory, which induced one agitated avenger to shoot up a pizza shop where he believed Hillary Clinton was running a paedophile ring. For Russia, disinformation is not just a communication tool but a weapon in a hybrid war. The aim is to create chaos. For democracy, which relies on an informed citizenry, its effects can be fatal.
To understand human susceptibility to malicious messaging is to recognize the multi-dimensional challenge that disinformation poses. Digital literacy can only partially mitigate the problem: the solution has to be broader, involving the media, state and society.
The media has already made some adjustments, with a renewed emphasis on fact-checking. But the distinction between fact and opinion needs to be more strictly enforced. The expansion of the media ecosystem with dedicated fact-checking organizations is a welcome development. Little of this will matter, however, until social media platforms like Facebook and Twitter show greater willingness to police malicious content. The state can demand greater responsibility from social media giants by classifying them as publishers rather than tech companies. But this requires political will.
In 2016, the means existed to thwart Russian interference before the election, but no action was taken because its significance was underestimated. In 2020, there are no doubts about disinformation’s potential to influence the election, but no action will be taken because the government in power stands to benefit from it. It falls to society, then, to create a climate of intolerance toward disinformation, treating its purveyors with suspicion and imposing a reputational cost on them. In the longer term, our education systems will need to be overhauled, with digital literacy introduced as a core component.
As individuals, we may not be able to shed our biases, but we can certainly do more to resist being manipulated through them. The psychologist Daniel Kahneman makes a distinction between two types of thinking: fast, reflexive, intuitive; and slow, reflective, deliberative. If the aim of disinformation is to appeal to our intuition and short-circuit careful reflection, then we will have to force ourselves to slow down and scrutinize our intuitive responses to new information. We will have to learn to examine sources, verify claims, and assess arguments, especially when they affirm our beliefs. In Homer’s Odyssey, when the eponymous hero is warned of the Sirens, whose songs lure seafarers to shipwreck, he has his shipmates plug their ears and tie him to the mast so he can resist the enchantment. As we get closer to November, like Odysseus, we will have to fasten ourselves to our slow, deliberative, skeptical minds so we can resist the sirens and avoid the rocks.