An Anti-Propaganda Group & Their Information Warfront

IT#StandForUkraine, a volunteer group of Ukrainians working to curb Russian propaganda, has made the widely used Russian social media platform VKontakte its base of operations.

May 2022

A screenshot of Stand for Ukraine's website.

In the first few days of Russia’s invasion of Ukraine, Oksana Moroz published a post on her Facebook page. “Leave a comment,” the post read, if anyone would like to share fact-based news of the war on Russia’s largest social network, VKontakte (VK). Within 10 minutes, the post had more than 100 likes. By the end of the first day, it had nearly 500 likes and almost half as many shares.

What the effort would entail, though, Moroz was not yet sure. There was no team, no structure, and no playbook, only the idea. What she did have was hundreds of volunteers willing to reach out, each given a simple instruction: speak to Russians like Russians on VK.

IT#StandForUkraine is a volunteer organization that now includes over 1,000 Ukrainian specialists in technical and creative fields. They build databases, create infographics, and code bots for the messaging platform Telegram, which is popular in Russia. The effort has grown beyond the founders’ personal channels and has garnered support from the Armed Forces of Ukraine, the Security Service of Ukraine, the Ministry of Internal Affairs of Ukraine and the Ministry of Digital Transformation of Ukraine.

Nika Tamaio Flores, another leader of the group, has worked in the digital sphere with expertise in machine learning, artificial intelligence and other areas of IT; she met Moroz through Instagram after someone shared a screenshot of the post. Moroz has published multiple books on “information hygiene,” according to her website. The pair were well equipped to help lead an organization of this scale.

Throughout March and April, while volunteers worked remotely across Ukraine, Moroz connected with an IT specialist who developed the group’s main program and the heart of its operation: VKSpammer. VKSpammer is downloadable and accessible through the group’s @stop_info_war Telegram bot, which also provides directions for those who wish to participate in the anti-propaganda mission. Through the bot, users can access a list of public groups to post and comment in, receive fake accounts, and get a list of accounts to message. The creator of the program, Ruslan Isaev, could not be reached for comment.

After analyzing Russian users, the group identified a way to reach them, and Ukrainians, at scale: Telegram bots. Telegram is a messaging platform that advertises strong security and offers end-to-end encryption in its “secret chats”; during the war it has become a primary news source and communication channel for both Ukrainians and Russians, despite the censorship of information inside Russia. Telegram bots run as third-party applications within Telegram, and “users can interact with bots by sending them messages, commands, and inline requests,” according to Telegram’s website. Bot creators define how those requests are handled, so a user can send a message or command and the bot replies with information.
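
For readers unfamiliar with the mechanics, the sketch below shows, in broad strokes, how any Telegram bot receives commands and replies through Telegram’s public Bot API. It is a generic illustration, not the group’s code; the token and reply text are placeholders.

```python
import time
import requests

# Placeholder token issued by Telegram's @BotFather; not a real credential.
TOKEN = "123456:EXAMPLE-TOKEN"
API = f"https://api.telegram.org/bot{TOKEN}"

def run():
    offset = None  # id of the next update to fetch, so nothing is handled twice
    while True:
        # Long-poll Telegram for new messages sent to the bot.
        updates = requests.get(
            f"{API}/getUpdates", params={"offset": offset, "timeout": 30}
        ).json().get("result", [])
        for update in updates:
            offset = update["update_id"] + 1
            message = update.get("message") or {}
            text = message.get("text", "")
            # Users interact with the bot by sending it messages and commands;
            # here the /start command simply gets a canned reply.
            if text.startswith("/start"):
                requests.post(f"{API}/sendMessage", json={
                    "chat_id": message["chat"]["id"],
                    "text": "Hello! Send /help to see what this bot can do.",
                })
        time.sleep(1)

if __name__ == "__main__":
    run()
```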

One of the group’s first actions was to create four assistance-based Telegram bots. At the core of its mission on the information warfront is the Propaganda bot, which was “created for the war on the information front.” It urges users to assist Ukraine by filing complaints to get war supporters blocked, and it collects data about propaganda and other “disseminators of disinformation,” though moderators add only verified information. The group also identified other needs inside Ukraine: the SaveUA bot connects volunteers who want to help with people who need food, water, housing or medicine; the First Aid Robot provides first aid instructions during martial law; the STOP Russian War bot lets users send the location, movement and other details of the Russian army directly to the Headquarters of the Armed Forces of Ukraine; and the UA Veteran Bot provides veterans and their families with information about evacuations and counseling, among other things.

The bots’ effectiveness is measured the way a typical mobile app or webpage would be: by monthly active users and daily active users. The group can also track every button clicked within the bots, says Flores. When tracking usage of the SaveUA bot, for example, they measure the number of requests received, the number of volunteers, and the number of requests for assistance that were successfully fulfilled.
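
The article does not detail the group’s analytics stack; as a hedged sketch of how such numbers could be computed, the snippet below assumes a hypothetical log file in which each button tap is recorded as one JSON line with a user id, button name and timestamp, and rolls it up into daily and monthly active users.

```python
import json
from datetime import datetime, timedelta

# Hypothetical log: one JSON record per button tap, e.g.
# {"user_id": 42, "button": "accounts", "ts": "2022-04-18T09:30:00"}
def load_events(path="events.log"):
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

def active_users(events, since):
    """Count distinct users with at least one tap at or after `since`."""
    return len({e["user_id"] for e in events
                if datetime.fromisoformat(e["ts"]) >= since})

def clicks_per_button(events):
    """Tally how many times each button was pressed."""
    counts = {}
    for e in events:
        counts[e["button"]] = counts.get(e["button"], 0) + 1
    return counts

if __name__ == "__main__":
    events = load_events()
    now = datetime.now()
    print("DAU:", active_users(events, now - timedelta(days=1)))
    print("MAU:", active_users(events, now - timedelta(days=30)))
    print("Per-button clicks:", clicks_per_button(events))
```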

After a massive spam attack, the chat the group now uses to coordinate is private, and members can be added only if they are vouched for by another member of the group.

“[People were added] who did nothing, but [whose] activity interfered,” Moroz said. “It was clearly not volunteers, but enemies.” The rebuilt chat was filled with new volunteers and now includes more than 2,500 people. “This is routine work; people get tired quickly. That’s why we have a constant process — new people have joined, old people have stopped working, and so on.”

Concerns about privacy are always at the forefront of digital activism, and while Flores says she is not very active on social media and hasn’t faced any danger herself, some volunteers have.

“I know some of the people who are responsible for social media ads and public relations focused on Russia have been exposed in pro-Russian Telegram channels or smaller news outlets,” she said. “Most of them use fake accounts in Telegram and update their privacy settings. However, I strongly believe that Russians cannot really harm our activists and volunteers apart from online bullying.”

Olga Lautman, co-host of The Kremlin File podcast and a fellow at the Center for European Policy Analysis (CEPA) whose research focuses on hybrid warfare, including cyberattacks, disinformation campaigns and the other tools Russia uses to undermine democracies, says VK is tightly controlled by the Putin regime. After the initial assault on Ukraine, Putin signed a decree making the use of the word “war,” the sharing of pictures, or any truthful discussion “outside of their very limited Special Operation” punishable by up to 15 years in prison, Lautman said.

“VKontakte is against offensive and illegal content, including hate speech and calls for violence,” VK’s press team wrote in an official statement. VK says they delete content immediately when receiving reports from users, and that every report is reviewed. “We not only respond to reports from users, but we also conduct proactive internal monitoring,” which they say includes the “use of digital fingerprints” to identify illegal or offensive content.

IT#StandForUkraine volunteers have used backdoor routes to reach the Russian audience. On March 20, an anonymous user hacked VK and sent a message to over 12 million people in Russia explaining the true casualties of the war. According to Lautman’s Substack page, which she later confirmed, the message told Russian users the truth about the invasion of Ukraine and how civilian infrastructure had been “destroyed and damaged due to artillery and airstrikes.”

“None of the assigned tasks of the special operation has been completed, the Ukrainians have rallied and in a single impulse rebuff the aggressor,” the message says, according to Lautman’s translation. Lautman adds that the message also said the Russian army has suffered major troop losses, but that Russian propaganda outlets have not conveyed the gravity of the situation.

“It clearly wasn’t a Russian speaker; whoever wrote the message [probably] used Google Translate,” Lautman said. “They actually wrote the truth on how many [Russian] soldiers to date, approximately, have died. [They also] wrote how much military equipment has been destroyed [and that] Russia is specifically targeting civilians because in Russia, they are saying that they are only targeting military infrastructure, and not that they’re touching millions of civilian infrastructures.”

Through censorship and punishment, Putin has made it clear just how easy it is for his administration to filter information, but hackers and information groups like IT#StandForUkraine are fighting against his propaganda narratives.

“The world overestimates Russian propaganda as well as [the] Russian army. What a joke,” the PR department of the Ministry of Digital Transformation of Ukraine wrote in a statement. The ministry supports IT#StandForUkraine by recruiting and encouraging people to join the group through its Special Task Force of the Internet Army of Ukraine Telegram channel. “Messages they are spreading just ridiculous: ‘biolaboratories around Ukraine; birds-soldiers; murdered bodies of people in Bucha — fakes, etc.’ Anyone with access to the internet and critical thinking can check the real facts and disprove it.”

“However, Russian propaganda deeply damaged Russians’ minds. The only people who believe in Russians fake news are Russians … We feel sorry for these people,” they added.

Targeting Russian VK Profiles

To create its “bot army,” the accounts accessible through the @stop_info_war Telegram bot, the group started by purchasing real VK accounts from an online marketplace. The organizers declined to say which website they used, but many exist. One such site sells VK accounts for as little as 27 cents each, though it is unclear who created the marketplace or how it obtains the accounts.

In this marketplace, the group can search for the specific account parameters it is looking for, much as Facebook ads target specific groups of people. If it wants a woman around the age of 40 from Moscow, it can buy such an account. When accounts get banned, which happens often, the group simply goes back to the market to purchase more; they are relatively inexpensive.

Moroz says they choose accounts that have many friends, since accounts with few or no friends do not serve the overall goal: reaching as many Russians as possible. Specific parameters matter to IT#StandForUkraine’s strategy because its messaging is tailored to the recipient, whether that is the mother of a soldier, someone in a particular job, or another target.

The messaging falls into two main categories: economic hardship and conversations related to the war. The economic hardship category, which the group calls “fridge,” focuses on price changes caused by the war, companies that have ceased operating, and the disappearance of goods. Flores and Moroz believe that focusing on the war itself simply turns people away; through trial and error, they have found that Russians respond to messages about the economy and job loss.

One example of IT#StandForUkraine’s counterpropaganda falls within the “fridge” category: a purchased Russian account messages one of its friends, asking whether they still have a job and telling a story about a friend who recently received a final paycheck. The aim is to push Russian VK users to ask questions about the state of their economy and whether job losses are becoming more common.

The war-related category is the group’s “militaristic” focus, adjusted to the audience. It is used to message the family and loved ones of suspected Russian soldiers in order to force some kind of conversation. Part of a message the group sent to an account on VK, translated from Russian, reads: “My son has never had anything to do with military affairs! Why should he be taken to this f*cking Ukraine?”

The group may not always receive the kindest responses, but they know people are seeing the content that they’re putting out, Flores says.

To access these purchased accounts for targeted messaging, volunteers use the @stop_info_war bot. Each person who clicks the “accounts” button receives instructions and three purchased accounts to work with.
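
How the bot hands out accounts internally is not public; purely as an illustration of that dispensing step, the sketch below assumes a made-up JSON pool of purchased accounts and marks three unassigned ones as issued to a volunteer.

```python
import json

# Hypothetical pool file: a JSON list of records such as
# {"login": "user@example.com", "password": "...", "assigned_to": null}
POOL_PATH = "vk_accounts.json"

def issue_accounts(volunteer_id, count=3, path=POOL_PATH):
    """Return up to `count` unassigned accounts and mark them as issued."""
    with open(path, encoding="utf-8") as f:
        pool = json.load(f)

    issued = []
    for account in pool:
        if len(issued) == count:
            break
        if account["assigned_to"] is None:
            account["assigned_to"] = volunteer_id  # record who holds the account
            issued.append(account)

    with open(path, "w", encoding="utf-8") as f:
        json.dump(pool, f, ensure_ascii=False, indent=2)

    return issued
```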

After the initial rollout of contacting people, Moroz and Flores decided to escalate the campaign. On March 1, Vovka Tadjik saw an opportunity for a new front in the digital war: contributing information on 120,000 Russian soldiers to Pravda, a Ukrainian newspaper. The information included the soldiers’ birthdays, addresses, unit affiliations and passport numbers. The group used semi-automated methods to find these soldiers and their relatives on VK, and tailored messages accordingly.

In this messaging, volunteers point family members and alleged relatives of soldiers to @rf200_nooow, a Telegram channel, and 200ru.net, a website — created by the Ministry of Internal Affairs of Ukraine in partnership with IT#StandForUkraine — both of which provide information on Russian soldiers who have either been killed or captured since the beginning of the invasion. The site, however, has been blocked by the Russian Federation, so those in Russia were redirected to the constantly updated Telegram channel.

These types of disinformation campaigns are difficult to counter, as leaders and autocrats like Vladimir Putin have learned how to use the tools of the internet, says Ann Cooper, a former NPR Moscow bureau chief and professor at Columbia University. In the internet’s early days, she says, it was seen as a great new tool for democracy, to hold governments accountable and to get around censorship efforts, but that is no longer the case.

Combatting Russian Trolls — Putin’s Internet Army

Flores says Russia is doing its own work on the information warfare front using paid online “trolls,” fake commenters on VK posts who push back against truthful information in an attempt to trick readers into believing another narrative. But compared with IT#StandForUkraine’s tailored messages, Flores says, the trolls’ comments are so crude that she is surprised anyone could believe them.

“They’re in VK, Twitter, Telegram. You see ‘war’ or some other words that trigger them and then those trolls attack you with the exact same messages and this is so strange,” Flores said. “They don’t even give a bit of human touch — they’re just copycatting each other, and I don’t [understand] how it works.”

VK says its moderation team constantly monitors content on the platform and “blocks users and bot networks that send mass mailings on a single topic,” but it is unclear whether those tactics can reliably identify Russian trolls. IT#StandForUkraine’s messaging falls outside this purview: because each message is written individually and worded differently, the accounts are not flagged as spam or bots.

Darren Linvill, an associate professor at Clemson University and co-leader of its Media Forensics Hub, has worked extensively on analyzing how Russian trolls spread disinformation on Twitter. He says the most famous Russian troll farm is the Internet Research Agency (IRA), based in St. Petersburg, Russia, and led by Yevgeny Prigozhin, a Russian oligarch with close ties to Putin. Linvill explains that these trolls build accounts “up from the ground in a very artisanal fashion,” gain followers organically, and work within specific communities online.

“They then use the influence that they gain over time on those accounts to pull those communities in a direction that supports whatever message it is that the Kremlin wants to propagate,” Linvill said.

While Flores says these trolls are “copycatting” each other, Linvill says their methods have shifted in response to moderation practices. At first, the trolls copied and pasted often, but they discovered that doing so made it easy to be identified as a troll and lose the account. “It’s copying and pasting ideas, not sentences,” he said.

To curb propaganda more effectively and to try to counteract the trolls, Flores says she contacted an old acquaintance, the wife of Leonid Volkov. Volkov is the chief of staff of Navalny’s Foundation for Fighting Corruption, a Russian opposition network that has since been censored and deemed “extremist.” Flores asked for insight into Russians’ psychology and for feedback on the ads IT#StandForUkraine shared during the first three weeks of the war.

“They said ‘Imagine people live in a fake world so everything that they hear is like fake propaganda or someone else saying it’s propaganda or propaganda saying the West is going to destroy us because we are eternal enemies,’” she said. “We even tried to promote Navalny’s antiwar meetings, but with no success. So, after a while we stopped using any hints on Russian opposition.”

Trolls are more common on platforms where the Russian government exerts less control; they serve as a safeguard against the factual information users find there, Linvill says. Putin knows that banning all social media platforms would push his people to find other ways to access information, so the trolls are widely found on Twitter, for example.

“The accounts that we believe are likely state-run accounts are operating on platforms that are currently banned in Russia and are still running and putting out messages right now,” Linvill said. “There’s no reason to stop. It’s also really inexpensive.”

Putin’s administration has taken one step after another to control the media, restrict what can be published, and punish people who violate the government’s understanding of what media is, says Tom Kent, a specialist on disinformation, Russian media and journalistic ethics at Columbia University’s School of International and Public Affairs.

“Information is one of those things that if there’s a crack in the wall, then the whole wall is pointless,” Kent said. “He’s got to make sure that nothing is getting in and that’s extraordinarily difficult because there’s just so many sources of information and so many places to go on the internet … it’s a huge task.”