Facebook's under-fire algorithms led conservatives to QAnon theories and ...

Facebook's algorithms inundated conservative users with QAnon conspiracies and other far-right content, while flooding liberal users' news feeds with far-left posts and memes like 'Moscow Mitch' that claimed the Senate majority leader was a Russian asset, newly released internal documents reveal.

Researchers created dummy profiles in 2019 for two fictitious female users: Karen, a liberal 41-year-old woman from Illinois, and Carol, a conservative of the same age from North Carolina.

Within days of activating the conservative account, named Carol Smith, the Facebook researcher started seeing posts supporting QAnon and other far-right groups. The liberal account, named Karen Jones, began seeing posts about collusion with Russia. A third account, set up for a user in India, saw graphic content depicting violence against Muslims.

All the material was fed to the dummy profiles through recommended groups, pages, videos and posts.  

Facebook's algorithms flooded users with extremist content and conspiracy theories based on their political beliefs. Liberal users were flooded with far-left posts and memes like 'Moscow Mitch' that claimed the Senate majority leader was a Russian asset  (file photo)

Facebook conducted the experiment to study how the platform recommended content to Americans on opposite ends of the political spectrum. The researchers clicked only on content recommended by Facebook's algorithms, and the accounts quickly found themselves locked in an echo chamber of extremist beliefs and inflammatory misinformation.

Facebook's research backs up whistleblower Frances Haugen's claims that the website's algorithm favored divisive content because it kept users coming back. 

The documents reveal that Facebook was aware of the power its algorithms held in leading users 'down the path to conspiracy theories' at least a year before the January 6 riot at the Capitol, which the tech giant is also accused of not doing enough to prevent.

Meanwhile, the documents reveal that the platform stoked tensions in countries in conflict in a similar way. The third dummy account, created for a Facebook user in India, the social network's biggest market, was served a slew of posts against Muslims and Pakistan amid the border crisis between the two countries.

Facebook's algorithms inundated conservative users with QAnon conspiracies and other far-right content (file photo)

Researchers created dummy profiles in 2019 to study how the platform recommended content to Americans on opposite ends of the political spectrum, according to newly released internal documents

The findings from the three dummy accounts were detailed among a trove of documents shared by whistleblower Frances Haugen, which were disclosed to the U.S. Securities and Exchange Commission and provided to Congress in redacted form by Haugen's legal counsel. 

Facebook's issues in India show that the platform can cause even more damage in countries where it has fewer resources and insufficient expertise in local languages to gauge what constitutes hate speech or misinformation.

India is the company's largest market, but the platform said it had trained its A.I. systems in only five of the country's 22 officially recognized languages, adding that it had human reviewers for some others, the New York Times reported. But the internal Facebook report said that material targeting Muslims 'is never flagged or actioned.'

The researcher running the Facebook dummy profile in India wrote in a report that year, 'I've seen more images of dead people in the past 3 weeks than I've seen in my entire life total.'

She described how bots and fake accounts fanned the flames during the country's 2019 election. She saw a number of graphic posts as violence raged in Kashmir, the site of a long-running territorial dispute between India and Pakistan.

A third dummy profile created for an Indian user saw a slew of posts against Muslims and Pakistan amid the border crisis between the two countries. Above is an Indian fighter jet that was reportedly shot down by Pakistan in 2019

The researcher running the Facebook dummy profile in India wrote in a report that year, 'I've seen more images of dead people in the past 3 weeks than I've seen in my entire life total'

One post circulating in the groups she joined depicted a beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground.

The posts were unprompted, and the user was flooded with propaganda and anti-Muslim hate speech following the retaliatory airstrikes that Indian Prime Minister Narendra Modi, who was campaigning for re-election as a nationalist strongman, unleashed against Pakistan.
