Meta Decides To Remove Fact-Checkers From Facebook, Leading To Concern Over Radical Posts

Experts are expressing concerns that Meta’s decision to remove professional fact-checkers from Facebook could worsen what has been referred to as “boomer radicalisation” in the UK.

Even before the so-called “far-right riots” in England last summer, as described by Keir Starmer, alarm bells were sounding over the increasing vulnerability of older individuals to misinformation and radicalisation.

Research had already indicated that older people might be more susceptible to these risks compared to younger “digital natives.”

An analysis of defendants involved in the 2011 unrest, conducted by The Guardian, showed that suspects were generally older than those charged in previous disturbances. The study revealed that up to 35% of the individuals were aged 40 or older.

However, following Mark Zuckerberg’s announcement last week that Meta would replace professional fact-checkers with a crowdsourced system, and would also recommend more politically charged content, experts have raised new concerns regarding the potential risks of radicalisation on Facebook. The platform remains highly popular among older users.

Prof. Sara Wilford of De Montfort University, who leads the Smidge project (Social Media Narratives: Addressing Extremism in Middle Age), expressed her concerns about Meta’s shift.

“It’s clearly a retrograde step that comes with all sorts of risks,” she said. “While X’s crowdsourced ‘community notes’ approach might work, Facebook operates in closed groups or silos, making it more difficult for middle-aged users to discern truth from extremist content.”

The anti-extremism group Hope not Hate also voiced concerns that Zuckerberg’s decision could pave the way for far-right figures and groups, such as Tommy Robinson and Britain First, to regain access to Facebook.

Britain First had previously thrived on the platform before being banned, amassing 2 million likes, surpassing both Labour (1 million) and the Conservatives (650,000) at the time.

Although young men continue to make up the majority of perpetrators of far-right violence, discussions around boomer radicalisation were already taking place before Meta's announcement.

For example, Darren Osborne, who was 48 when he carried out a deadly terrorist attack at a mosque in Finsbury Park, was described by the judge as having been “rapidly radicalised” online.

Similarly, in 2022, Andrew Leak, aged 66, firebombed a Dover migrant centre in a right-wing attack before killing himself, leaving behind an online history filled with racist content.

Hope not Hate also pointed out that, while other platforms like Telegram were used to incite extreme hate and plan actions, Facebook was often utilized by the far-right to create hyperlocal, targeted content.

In the last few years, Facebook groups focused on anti-migrant protests have played a significant role in organising attacks on asylum centres. These groups tend to attract older users, reflecting the broader demographic makeup of Facebook's user base.

An Ofcom report from last year revealed that Facebook was the most popular social media platform, especially among older adults. It warned that older users were less likely to recognize fake social media profiles, increasing their vulnerability to misinformation.

Wilford’s research also highlighted that older Facebook users were more susceptible to believing false content without questioning it, primarily due to their trust in content presented like traditional news media.

“We’re talking about a generation that may look back on a life they feel didn’t meet their expectations, whether in their jobs or social conditions,” Wilford added. “But when they engage online, they find validation in echo chambers that reinforce their beliefs.”

In response to the spread of misinformation on Facebook groups focused on everyday topics, some councils have invested in training local community group moderators.

However, political shifts in the UK have transformed how many people experience Facebook, especially as users who originally joined for social reasons—like sharing family photos or neighborhood news—now encounter more extreme political content.

Brexit, Donald Trump’s 2016 election win, and the Covid-19 pandemic were key moments that prompted many users to engage with more extreme right-wing politics via Facebook, according to Dr. Natalie-Anne Hall, a lecturer at Cardiff University and author of Brexit, Facebook, and Transnational Right-Wing Populism.

“Facebook has become a central space for algorithmically driven encounters with harmful ideas. Meta should be doing more, not less, to address this issue,” Hall stated. “Zuckerberg’s comments and the company’s new stance will only fuel the radicalisation of those who already feel victimized and resent progressive views.”

When approached for comment on concerns about misinformation and extremism, Meta referred to a blog post stating that its “complex systems” for content management had “gone too far.”
