The concentration of engagement among a small number of users has been a recurring topic for researchers within Facebook for at least four years, as it became increasingly clear that the phenomenon favours disinformation and rewards very few profiles (usually the largest ones).
Still, despite repeated alerts from its own analysts, internal data leaked in the Facebook Papers clearly show that the concentration of reshares remains, to this day, very high on the platform, undermining efforts to fight fake news and harmful content.
Data from a publication on the company's internal network in April 2021 show that, between February 22nd and March 21st, only 1.3% of the 2.82 billion users active during the period (around 37 million profiles) accounted for half of the reshared posts on the platform.

Compare that with the 978 million profiles (34.6% of the total) that had at least one original post reshared during the same period. In other words, a very small number of people capture a disproportionately large share of engagement on Facebook (whose parent company is now called Meta).
It matters because...
- This concentration has implications for the platform's integrity: other documents show that content which goes viral has a greater chance of being harmful
- It exposes a contradiction between the business model and the platform's democratizing ideals, which aimed to give everyone a voice
"This data point is relevant to a lot of ongoing discussions. I'm particularly interested in how we might better democratize virality on Facebook, by rewarding broad participation", wrote the researcher who shared the data in the internal employee network.
This is not news for Facebook.
In addition to the many internal documents expressing concern about the concentration of reshares over the past few years, a public study sponsored by the company in April 2017 had already touched on the topic.
"While likes and comments provide feedback, it is resharing that has the potential to spread information to millions of users in a matter of few hours or days", wrote the authors.
Still, the platform seems to have a hard time acknowledging this in the open. In a statement sent by email to Núcleo, Facebook said only that:
"The concentration of content sharing among users is not new behaviour on social media. However, we do not want the sharing of content that is potentially harmful and low quality. We take a series of measures to mitigate the viral circulation of this content by adjusting the distribution signals on Feed".
The statement did not directly address any of the questions sent by Núcleo:
- Does Facebook consider the concentration of a large share of reshares among a small number of users to be a problem?
- Can this concentration of reshares among a small number of users be considered artificial engagement (that is, coordinated campaigns, engagement farms, etc.)?
- What are Facebook's policies to broaden other people's participation in the debate and to avoid having it concentrated among a few?
- Can reshare concentration in posts with disinformation increase the reach of harmful content? What does the platform do when a post with disinformation receives many reshares from a few users?
THIS IS THE DESIGN
Nothing about this happens in a vacuum. Facebook's own computational infrastructure operates under a logic of maximum influence that rewards (usually with reach) those who concentrate the production of content, as researcher Lori Regattieri explained to Núcleo.
She is a PhD candidate in Communication and Culture at the Federal University of Rio de Janeiro (UFRJ), in Brazil, and has been following the work of Facebook's data scientists and researchers since well before the Facebook Papers.
"Those who have more resources to mobilize and concentrate influence are more rewarded with visibility by the platform itself" - Lori Regattieri, UFRJ researcher
"This is the reward circuit Facebook offers to those who engage with the conditions of the platform's own infrastructure in ways that bring returns to Facebook".
It is, after all, a vicious circle: Facebook appreciates and rewards those who act to maximize influence, and the concentration resulting from this maximization generates, in turn, returns for the company.
Regattieri studies in depth Facebook's and Twitter's digital infrastructures and how these platforms monitor, process and channel data to serve the demands of their business models – while at the same time creating the conditions for a high concentration of influence. The companies are constantly confronted with the social cost of this process.
"Facebook is trying to understand how content can go viral without tiring people, by generating a kind of collective satisfaction", Regattieri told Núcleo.
A practical effect of this concentration is what researchers call "homophilic bubbles", which can lead to radicalization because they keep people enclosed with others who think very similarly. For the researcher, it was predictable that this would be a consequence of this infrastructure.
"This is a flaw of Facebook's infrastructure that, as opposed to democratizing virality, keeps people inside of their bubbles", explained Regattieri.
In an email, a Facebook spokesperson said:
"We have been making product changes to contain disinformation. Since May 2021, we have shown users a pop-up on Facebook alerting them when they try to 'like' a Page that frequently shares information flagged as false or misleading by Meta's fact-checking partner agencies. We have also started to penalize individual profiles that frequently distribute disinformation, reducing the reach of all publications made by those people in their News Feeds".
IN THE DEPTHS OF RESHARE
An analysis done by Núcleo on the Facebook Papers leaks found dozens of documents that deal with the concentration of shares on the platform.
One of these studies analyzes specifically the role of deep reshares (when a post is shared far beyond the contact network of a profile) in disinformation.
"Our data reveal that misinformation in general relies much more on deep reshares for distribution than major publishers do", wrote a Facebook researcher in an analysis posted on the company's internal network in April 2019, alerting that the data are more trustworthy in the United States, but "holds up for other countries".
"We've found that when a user sees a deep reshare of a link or a photo, they are 4 times more likely to be seeing misinformation compared to when they see links or photos on News Feed in general" - Facebook researcher in 2019 Analysis
As an example, the author mentions a viral video that falsely alleged that Nancy Pelosi, one of the main leaders of the United States' Democratic Party, was drunk. In total, 45% of the views came from people who did not follow the original profile that posted the video (a comedian).
Another analysis, also from around April 2019, concludes that on Facebook only 1.6% of all users active in a month contributed half of the reshared posts -- consistent with the 2021 data mentioned above, from two years later.
"Older users reshare more, especially from misinformation pages", wrote the author. "Women reshare more in general, but men reshare more from misinformation pages".
And yet another document, from November 2019, shows concern around this topic, more specifically the concentration of civic voice. In its first paragraph, the analysis states that, in Brazil, 3% of users created 33% of the "civic content" -- usually publications concerning politics.
Another document from 2019 warns that posts shared within groups have a higher probability of generating user feedback with integrity concerns. In addition to the low-quality content, this research also identified repetition problems (which further inflate the reach of certain posts).
"Maybe most alarmingly, we see a stat sig [statistically significant] increase in both Newsfeed and Groups Mall repetitiveness since a lot of identical posts get shared in many different groups. This could make the platform feel spammy", wrote the authors.
HOW WE DID THIS
The information in this story is part of the documents revealed to the U.S. Securities and Exchange Commission (SEC) and given to the United States Congress in redacted form by Frances Haugen's legal team. The redacted versions received by U.S. Congress were reviewed by a consortium of news outlets, which became known as the Facebook Papers. Núcleo Jornalismo had access to the documents.
These documents were first reported on by the Wall Street Journal, which named the series the Facebook Files.