Russian disinformation on Facebook targeted Ukraine well before 2016 U.S. election - WP
In the spring of 2015, Ukrainian President Petro Poroshenko was desperate for Mark Zuckerberg’s help. His government had been urging Facebook to stop the Kremlin’s spreading of misinformation on the social network to foment distrust in his new administration and to promote support of Russia’s invasion and occupation of parts of Ukraine.
To get Zuckerberg’s attention, the president posted a question for a town hall meeting at Facebook’s Silicon Valley headquarters, where a moderator read it aloud, The Washington Post reported.
“Mark, will you establish a Facebook office in Ukraine?” the moderator said, chuckling, according to a video of the assembly. The room of young employees rippled with laughter. But the government’s suggestion was serious: It believed that a Kyiv office, staffed with people familiar with Ukraine’s political situation, could help solve Facebook’s high-level ignorance about Russian information warfare.
“You know, over time it’s something that we might consider,” the chief executive responded. “So thank you for — the Ukrainian president — for writing in. I don’t think we’ve gotten that one before.”
In the three years since then, officials here say the company has failed to address most of their concerns about Russian online interference that predated similar interference in the 2016 U.S. presidential election. The tactics identified by officials, such as coordinated activity to overwhelm Facebook’s system and the use of impostor accounts, are the same as in the 2016 contest — and continue to challenge Facebook ahead of next month’s midterm elections.
“I was explicitly saying that there are troll factories, that their posts and reposts promoted posts and news that are fake,” Dmytro Shymkiv, then deputy minister of the presidential administration, said he told Facebook executives in June 2015. “They are promoted on your platform. By very often fake accounts. Have a look.”
Ukraine’s warnings show how the social media giant has been blind to the misuse of Facebook, in particular in places where it is hugely popular but has no on-the-ground presence. There is still no Facebook office in Ukraine.
Facebook officials defend their response to Ukrainian officials. They said Shymkiv did not raise the issue of Russian misinformation and other tactics in the meeting but that he talked instead about the company’s standards for removing content. They also said what they were alerted to in Ukraine was not a preview of what happened in the United States during the 2016 election.
In Ukraine, Russian information warfare was in full swing on Facebook and a Russian social media network during the revolution in 2014, government officials say. There was a daily flood of fake news condemning the revolution and trying to legitimize the invasion by claiming Ukraine was an Islamic State safe haven, a hotbed for Chechen terrorists and a country led by Nazis.
“We tried to monitor everything, but it was a tsunami,” recalled Dmytro Zolotukhin, then working for the new Ukrainian government’s Information Analysis Center of the National Security and Defense Council, which investigated online disinformation. “Thousands of reports of fake news on fake pages came in.” With the help of hackers and other cyber experts, he says he traced some of these accounts back to the Kremlin, which was also amplifying the false claims on dozens of fake online publications.
After the revolution in 2014, and again in 2017, Facebook abruptly banned dozens of accounts owned by pro-democracy leaders. Zolotukhin and others concluded that Russian bots were probably combing past comments and posts for banned terms and sending the names and account URLs of the owners to Facebook with complaints.
Another problem was that someone, believed by Ukrainian officials to be Russia, created impostor Facebook accounts of real government ministries and politicians, including Poroshenko. The impostor accounts posted false and inflammatory information meant to make the government look bad, said Zolotukhin, now the deputy minister of information policy. He and others begged Facebook through its public portal to add verification checks next to the real accounts and remove the fakes. But usually no action was taken.
“I asked for six months for my verification,” said Artem Bidenko, state secretary of the Information Ministry, who said someone had created a fake account using his name.
Shymkiv and others were meeting to figure out how to get Facebook’s attention when they learned of the May 2015 town hall meeting with the Facebook CEO.
One town hall question, which drew a record 45,000 likes, asked whether the Ukrainian accounts were the victims of “mass fake abuse reports.” Zuckerberg replied that he personally had looked into it. “There were a few posts that tripped our rule against hate speech,” he said. He did not say whether Facebook had checked the authenticity or origin of the ban requests.
A month later, Facebook sent Gabriella Cseh, its head of public policy for Central and Eastern Europe based in Prague, to meet with Shymkiv, Bidenko and others in Kyiv.
Shymkiv said he told Cseh that the government believed Russia was using Facebook accounts with fake names to post fictitious, inflammatory news reports and to engage in online discussions to stir up political divisions.
Facebook needed to send a team to investigate, he said. Ukraine’s stability as a new democracy was at stake.
Bidenko said Cseh agreed he could email her the names of civic leaders who believed their accounts had been wrongfully banned.
According to Shymkiv, Cseh promised to review the cases, which Facebook says it did. Then she handed him a copy of its Community Standards policy, available online.
This appeals process worked well for about two years, Bidenko said.
But Cseh has been silent, Bidenko said, since an email she sent him on April 13, 2018, two days after Zuckerberg testified on Capitol Hill and public scrutiny of Facebook intensified. He figures she and the company became too busy with other problems to respond. But to his astonishment, she also unfriended him.
“I was like, what!? Why is Gabriella unfriending me?” he said. “Maybe I became a nuisance.”
Facebook declined to make Cseh available for an interview and did not respond to a question about why she unfriended Bidenko. In a statement, the company said, “Gabi has previously made it clear to Mr. Bidenko that she might not respond to every single one of his messages, but that doesn’t mean she isn’t escalating the issues he flags to the appropriate internal teams.”
In August, Zolotukhin met with Facebook officials and said he reiterated those concerns. He sent them a list of pages that still needed verification checks, and the company added them soon thereafter.
Bidenko, Zolotukhin, hackers and journalists are eager to open their laptops and scroll through what they say is fabricated news that sometimes includes gruesome videos. “Phosphorus burns everything: Ukrainian militia is using illegal weapons,” said a repost of a YouTube video from 2017. “Executioners were harvesting internal organs for sale,” read a post from a Russian website.
More than 2,000 Ukrainians have been killed and an active war continues, making Russia’s continued clandestine attacks via Facebook an urgent national security matter.
Facebook recently posted a job for a public policy manager for Ukraine — based in Warsaw.
“Facebook is trying to stay on the sidelines” of the war between Ukraine and Russia, Zolotukhin said. “But now it is not about saying you’re for democracy. It’s about fighting for democracy.”