If there was one thing lawmakers and social media companies agreed on in 2017, it was that no one—not companies or the government or the public—was well prepared for how foreign adversaries might use social media networks to influence the American public during an election year. Seven years later, that consensus may have been the high-water mark in efforts to actually combat that problem.
On Wednesday, Facebook’s parent company Meta will cease to support CrowdTangle, a data tool that allows researchers, journalists, and other observers to uncover disinformation and misinformation trends on the social network. Experts from a variety of organizations warn that the move, coupled with other decisions among social media companies to roll back data monitoring and trust and safety teams, will make it much harder to fight lies spread by hostile powers.
Meta announced last month that it would replace CrowdTangle with the “Meta Content Library,” a less powerful tool that will not be made available to media companies.
That will help China, Russia, and other autocratic countries that seek to sow political division in the United States, said Nathan Doctor, senior digital methods manager at the Institute for Strategic Dialogue.
“With these sorts of foreign influence campaigns, probably the biggest thing about them is to keep tabs on them. Otherwise, as we see, they can start to flourish. So as data access…dries up a bit, it becomes a lot more difficult in some cases to identify this kind of stuff and then, you know, reactively deal with it,” Doctor said on Monday during a Center for American Progress online event.
Meta’s decision followed similar moves at other social-media companies. After Elon Musk took over Twitter—now X—in 2022, he disbanded the team that watched for foreign disinformation. Snapchat and Discord have since trimmed their trust and safety teams by 20 to 30 percent, said Priyanjana Bengani, a computational journalism fellow at the Tow Center for Digital Journalism at Columbia University.
But Meta is far larger than X. Facebook has 3 billion monthly active users and Instagram, also owned by Meta, has 2 billion, while X has about 600 million. Shutting down CrowdTangle will deal a major blow to journalists and others trying to understand how disinformation spreads, Davey Alba, a technology reporter for Bloomberg, said on Monday.
And that’s particularly alarming right now, as the United States prepares for a presidential election that foreign actors are looking to sway with disinformation.
“It will make our work more difficult to do ahead of the U.S. election to have this tool be shut down,” Alba said.
U.S. lawmakers, too, are increasingly concerned. In July, 17 of them sent a letter urging Meta to reconsider its decision—in vain.
That shows that social media companies don’t feel much accountability to policymakers, the press, or the public, Brandi Geurkink, the executive director of the Coalition for Independent Technology Research, said Monday.
“In arguably the largest global election year ever, the fact that a company can…signal their intention to make such a decision and then have such a groundswell of opposition from civil society all around the globe, from lawmakers in the United States and Europe, from journalists, you name it, and continue to go ahead with this decision and not really respond to any of the criticism—that’s what I think is the bigger worrying piece,” she said.
It’s a far cry from the conversations that social media companies and lawmakers were having seven years ago after the revelation of a Russian campaign to influence the 2016 U.S. presidential election. In October 2017, officials from top social media companies appeared before Congress to offer mea culpas.
“This is the national-security challenge of the 21st century,” Sen. Lindsey Graham, R-S.C., said at the time.
Facebook’s General Counsel Colin Stretch said he shared that concern.
“In hindsight, we should have had a broader lens. There were signals we missed,” Stretch said in his testimony.
Geurkink said Meta’s CrowdTangle decision belies its promises to fight state-sponsored influence campaigns and other disinformation.
“I think what I do see is dishonesty with regard to the way that they are publicly messaging it,” she said. “In the last election cycle, they were training political parties, training NGOs on how to use this tool to do really good election monitoring work around the election. So it seems very much a bait-and-switch.”
Bengani said there will certainly be a lot more signals missed now.
“I think we’re almost at the point where it’s worse than it was in 2016 right now,” she said.
Back then, observers could track social-media trends with the Twitter and Reddit APIs—interfaces that let researchers access the networks’ data—and other companies were funding efforts to moderate online conversation and to monitor for foreign influence.
Countries might enact laws to compel transparency from tech companies, but that’s likely to foster a patchwork of practices and perceptions around the globe, she said. The end result: it will become harder for anyone to know what’s real and what isn’t online. She called it a “weird, fragmented transparency ecosystem, which just makes tracking trends or research across geographies incredibly hard.”