Over a year ago, Russia launched its full-scale invasion of Ukraine, killing as many as 8,500 civilians and injuring another 14,000, according to United Nations estimates. Beyond the artillery and bombs, Russia has been waging an information war in the Balkan countries to promote falsehoods that legitimize its war on Ukraine. The propaganda includes false claims that Russia is protecting Ukraine from Nazi insurgents, that Ukraine possesses biological weapons, and that Ukraine is corrupt.
On April 11, the Biden Administration met with Balkan leaders to pledge support for their efforts to shut down Russian news sites disseminating disinformation. U.S. support could go a long way toward helping Balkan countries in their attempts to remain autonomous from Russia, which uses disinformation to try to undermine democratic governments in the post-Soviet space.
For the past three years, intelligence analysts and disinformation researchers at my company have been closely following the tactics and strategies of Russian disinformation operations. Here are some steps the U.S. could take that would make an even greater impact in countering the anti-Ukraine disinformation being disseminated by Russia.
The State Department’s suggestions to block websites are a start, but will likely only affect the low-hanging fruit and the activities of unsophisticated actors. Countries should target not only the IP addresses and domain names of sites spreading falsehoods, but also the companies that host and support those web addresses. Other State Department-recommended efforts – such as labeling foreign government accounts and enacting regulations that require transparency around foreign ownership of media properties – will take time to implement. In the meantime, there are additional steps the U.S. could take to help counter the effects of Russian disinformation in the Balkans.
We know that Russia routinely uses bots – automated accounts that pose as humans in disinformation campaigns – to exaggerate the “impact” of its messaging. A lot of Russian disinformation is actually remarkably crude and unconvincing to increasingly skeptical internet audiences, so the Kremlin seeks to create the illusion of impact instead. In today’s hyper-networked media environment, having a “successful” comms strategy is an important component of great power status.
Therefore, it would be worthwhile to increase our capacity to detect bots and trolls by investing in the resources, personnel and software needed to find them. Fact-checkers can use AI-based tools and large language models to recognize disinformation, but the lack of automation and the increasing sophistication of generative AI-created disinformation make it challenging to keep up.
Current U.S. resources are insufficient to be competitive in the fight against disinformation. Meanwhile, Russia has invested heavily in state-aligned international broadcasters, RT (Russia Today), RT Balkan and Sputnik News, purveyors of disinformation, as well as social media manipulation and hacking. Although the U.S. created the Global Engagement Center dedicated to fighting disinformation in 2016, countering disinformation has not been a budgetary priority.
For the past few years, the Kremlin has paid “influencers” and popular bloggers to spread propaganda and disinformation. Although major social media platforms restricted many official Russian government accounts, these so-called “influencers” are still operational. Social media accounts owned by the Russian government, lawmakers, affiliated media, and influencers should be deleted to block this propaganda channel.
Another important policy the U.S. should consider pursuing is sanctioning fake news outlets and pro-Russian disinformation actors who spread propaganda under the guise of journalism. The EU has sanctioned RT and Sputnik News, but U.S. sanctions would hold even more sway. Sanctions are one of the most powerful tools the U.S. has in fighting disinformation.
The U.S. and European governments should urge all Western allies and partners, especially eastern European countries like Georgia and Moldova, to take similar steps as those listed above.
It’s important to understand that many of the disinformation narratives pertaining to Ukraine that circulate in the U.S. don’t originate from Russia, but come from far-right actors living in the U.S. who are anti-Biden and anti-Ukraine. Thus, U.S. officials need to recognize that pro-Russian disinformation is not purely a foreign policy issue, but also a domestic one.
With its global influence and resources, the U.S. has the capability to make a significant difference in the battle against Russian disinformation and establish a model for other world leaders to follow. These recommendations would go a long way toward leveling the disinformation playing field.
Noam Schwartz is CEO and co-founder of ActiveFence, an Israel-based provider of threat intelligence and safety services for online content platforms.
Source: Federal Times