
FBI Shuts Down Russian AI-Enhanced Bot Farm 

The US Justice Department has announced the disruption of a Russian-government-backed bot farm that used AI tools in a massive propaganda campaign. The operators used AI to create fake social media accounts, many of which claimed to belong to US citizens, in order to promote messages supporting Russian government objectives.

The seizure of two website domain names and the shutdown of 968 social media accounts marks one of the largest disruptions of a state-sponsored disinformation campaign in the digital era.

“Today’s actions represent a first in disrupting a Russian-sponsored Generative AI-enhanced social media bot farm,” FBI Director Christopher Wray said in a statement from the Department of Justice. “Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government. The FBI is committed to working with our partners and deploying joint, sequenced operations to strategically disrupt our most dangerous adversaries and their use of cutting-edge technology for nefarious purposes.”

According to court documents, the creation of this social media bot farm can be traced back to a Russian national (identified as Individual A) who served as the deputy editor-in-chief of the state-run Russian news organization RT.

RT’s leadership had been working since at least 2022 to create alternate channels for disseminating news beyond RT’s regular television newscasts. Individual A oversaw the development of software that could establish and run a social media bot farm. Under the plan, the bot farm would generate fake online personas for social media accounts so that RT, or any other bot farm operator, could disseminate information widely.

The court documents go on to explain that in early 2023, with the approval and financial support of the Presidential Administration of Russia, a Russian FSB officer created and led a private intelligence organization. This organization had many members, one of whom was Individual A from RT.

Working as a coordinated group, this private intelligence organization used this bot farm and the Meliorator AI software to distribute pro-Russian narratives in the guise of social media posts from “real” Americans.

Meliorator AI Helps Spread Disinformation

While this malicious campaign required substantial human participation, the FBI was also quick to point out the central role AI played. Specifically, a joint cybersecurity advisory from the Canadian Centre for Cyber Security (CCCS), the Netherlands General Intelligence and Security Service (AIVD), the Netherlands Military Intelligence and Security Service (MIVD), and the Netherlands Police detailed how the AI-enhanced Meliorator software supported this disinformation campaign.

The report states that Meliorator was used by Russian actors to accomplish the following:

  • Create authentic-appearing social media personas en masse
  • Deploy content similar to typical social media users
  • Mirror disinformation of other bot personas
  • Perpetuate the use of pre-existing false narratives to amplify malign foreign influence
  • Formulate messages, including the topic and framing, based on the specific archetype of the bot

While the FBI’s report is US-centric, this Russian-backed bot farm was not solely targeting Americans. The farm also disseminated disinformation to and about a number of countries, including the US, Poland, Germany, the Netherlands, Spain, Ukraine, and Israel.

The Meliorator software package consists of two main components. The first is Brigadir, which serves as Meliorator’s primary end-user interface and functions as the administrator panel. Brigadir is the graphical user interface for the Taras application and includes tabs for “souls,” the false identities that form the basis for the bots, and for “thoughts,” the automated scenarios and actions that can be carried out through the bots, such as scheduling content to be shared on social media at a later time.

Behind Brigadir is Taras, which serves as the back end of the Meliorator software package and contains the .json files used to control the personas spreading disinformation on social media.
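
The joint advisory does not publish Meliorator’s actual file formats, so the snippet below is only a rough, hypothetical sketch of the data model the report describes: a “soul” record defining a fabricated persona (including the archetype said to shape its messaging) and a “thought” record scheduling an automated action such as a future post. Every field name and value here is invented for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical illustration only: the advisory describes "souls" (false
# identities) and "thoughts" (automated scenarios/actions) stored as .json
# files in Taras, but does not disclose the real schema. All names and
# values below are invented.

# A "soul": the fabricated persona a bot operates under.
soul = {
    "handle": "example_user",               # invented handle
    "claimed_location": "Minneapolis, MN",  # persona's claimed home
    "archetype": "political-commentator",   # drives topic and framing
    "bio": "Father, veteran, sports fan.",
    "language": "en-US",
}

# A "thought": an automated scenario for that persona, such as sharing
# a piece of content at a later time.
thought = {
    "soul_handle": soul["handle"],
    "action": "post",
    "scheduled_for": datetime(2024, 1, 15, 14, 30, tzinfo=timezone.utc).isoformat(),
    "content_ref": "video-clip-001",        # placeholder for shared media
}

# Serializing the records shows roughly what a control file of this kind
# might contain.
print(json.dumps({"souls": [soul], "thoughts": [thought]}, indent=2))
```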

The report states that Meliorator currently works only on X (formerly Twitter). However, analysis of the software package suggests that it could be expanded to other social media networks.

Below are some examples of posts produced by the bot farm, as published by the Department of Justice in its statement:

  • A purported U.S. constituent replied to a candidate for federal office’s social media posts regarding the conflict in Ukraine with a video of President Putin justifying Russia’s actions in Ukraine;

  • A purported resident of Minneapolis, Minnesota, posted a video of President Putin discussing his belief that certain geographic areas of Poland, Ukraine, and Lithuania were “gifts” to those countries from the Russian forces that liberated them from Nazi control during World War II;

  • A purported U.S. resident of a city identified only as “Gresham,” posted a video claiming that the number of foreign fighters embedded with Ukrainian forces was significantly lower than public estimates;

  • The same purported individual posted a video of President Putin claiming that the war in Ukraine is not a territorial conflict or a matter of geopolitical balance, but rather the “principles on which the New World Order will be based.”

To make these fake social media accounts seem real, the bot farm relied on private email servers that used the two website domain names seized by the FBI. Specifically, these emails were created using the domain names “mlrtr.com” and “otanmail.com,” obtained from a U.S.-based provider.

While disinformation campaigns like this are immoral in their own right, the FSB’s use of US-based domain names to register the bots violates the International Emergency Economic Powers Act. What’s more, the payments for the infrastructure behind this bot farm violate federal money laundering laws.

The Department of Justice’s investigation into this case is currently ongoing. However, this success on the FBI’s part should also be viewed as a warning: AI tools are rapidly becoming more versatile and sophisticated, and disinformation campaigns such as this will only become more common.

AIwire