Thursday 18 August 2022 02:10 PM Reddit analysis reveals 16% of users publish posts that are 'toxic'

Reddit can be a fun website where users discuss niche topics with others who share their interests.

However, a new analysis of more than two billion posts and comments on the platform has revealed an alarming amount of hate speech, harassment and cyberbullying.

Computer scientists from Hamad Bin Khalifa University in Qatar found that 16 per cent of users publish posts and 13 per cent publish comments that are considered 'toxic'.

The research was conducted to discover how Redditors' toxicity changes depending on the community in which they participate.

It was found that 82 per cent of those posting comments change their level of toxicity depending on which community, or subreddit, they contribute to.

Additionally, the more communities a user is a part of, the higher the toxicity of their content.

The authors propose that one way of limiting hate speech on the site could be to restrict the number of subreddits each user can post in.

Computer scientists from Hamad Bin Khalifa University, Qatar, found that 16 per cent of Reddit users publish posts and 13 per cent publish comments that are considered 'toxic'

WHAT DID THE STUDY FIND OUT ABOUT REDDIT USERS? 

16.11 per cent of Reddit users publish toxic posts.

13.28 per cent of Reddit users publish toxic comments. 

30.68 per cent of those publishing posts, and 81.67 per cent publishing comments, vary their level of toxicity depending on the subreddit they are posting in.

This indicates they adapt their behaviour to fit each community's norms.

The paper's authors wrote: 'Toxic content often contains insults, threats, and offensive language, which, in turn, contaminate online platforms by preventing users from engaging in discussions or pushing them to leave.

'Several online platforms have implemented prevention mechanisms, but these efforts are not scalable enough to curtail the rapid growth of toxic content on online platforms. 

'These challenges call for developing effective automatic or semiautomatic solutions to detect toxicity from a large stream of content on online platforms.' 

The explosion in popularity of social media platforms has been accompanied by a rise of malicious content such as harassment, profanity and cyberbullying.

This can be motivated by various selfish reasons, such as boosting the perpetrator's popularity, or defending their personal or political beliefs in hostile discussions.

Studies have found that toxic content can influence non-malicious users and make them misbehave, negatively impacting the online community.

In their paper, released today in PeerJ Computer Science, the authors outline how they assessed the toxicity of posts and comments across different subreddits.
