On July 31, Facebook announced the removal of around 30 pages and accounts on its platform for “coordinated inauthentic behavior” surrounding politics in the United States.

In an announcement describing the actions taken on its platform, Facebook explained: "We’re still in the very early stages of our investigation and don’t have all the facts — including who may be behind this. But we are sharing what we know today given the connection between these bad actors and protests that are planned in Washington next week. We will update this post with more details when we have them, or if the facts we have change," DFRLab reports.

It’s clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russia-based Internet Research Agency (IRA) did in the past.

Facebook concluded, based on internal data, that the accounts were “inauthentic” and did not represent the individuals and groups they claimed to.

According to the initial findings, the pattern of behavior by the accounts and pages in question makes one thing abundantly clear: they sought to promote division and set Americans against one another.

Their approach, tactics, language, and content were, in some instances, very similar to those of accounts run by the Russian “troll farm,” the Internet Research Agency, between 2014 and 2017.

Similarities included language patterns indicating non-native English and consistent mistranslations, as well as an overwhelming focus on polarizing issues at the top of any given news cycle, with content that remained emotive rather than fact-based.

The set of accounts appeared, however, to use much stronger operational security. They maintained a focus on building an online audience and then translating it into real-world events, such as protests. Further, this specific set of accounts focused exclusively on engaging and influencing the left end of the American political spectrum.

Of note, the events coordinated by, or with help from, inauthentic accounts did have a very real, organic, and engaged online community; however, the inauthentic activity appeared designed to catalyze the most incendiary impulses of political sentiment.

Given Facebook’s conclusion that these accounts were inauthentic, they appear to have constituted an attempt by an external actor (possibly, though not certainly, in the Russian-speaking world) to infiltrate left-wing American communities. One account in particular, @resisterz, attempted to mobilize its audience for a confrontation with the far right.

Such online activity poses a danger of both disinformation, which experts define as the deliberate spread of false information, and misinformation, the unintentional spread of false information. The Russian operation from 2014 through 2017 showed how easily disinformation actors could seed their falsehoods into genuine American communities on the right and the left; Americans thus became unwitting amplifiers of Russian information operations.

The accounts that Facebook suspended appear to have been primed to take the same approach and, more explosively, to trigger standoffs between genuine Americans, raising the risk that false stories could lead to real-life violence.

Their behavior differed in significant ways from the original Russian operation. Most left fewer clues to their identities and appear to have taken pains not to post too much authored content. Their impact was, in general, lower than that of earlier efforts such as the Russian troll account “Black Matters,” which amassed 300,000 followers.

Information operations, like other asymmetric threats, are adaptive. These inauthentic accounts, whoever ran them, appear to have learned the lessons of 2016 and 2017 and taken more steps to cover their traces. This was not enough to stop Facebook from finding them, but it does reveal the challenge facing open-source researchers and everyday users.

Their exposure underscores the ongoing threat facing American social-media users on both sides of the political spectrum. It would be dangerous to fall into the disinformation trap, but ruinous to believe or claim that every user who holds opposing views is part of a Russian information operation.

Above all, the exposure of these accounts reinforces the need for evidence-based analysis, clear open-source criteria for identifying influence accounts, and heightened awareness as the mid-term elections approach.