
Social media companies should turn off algorithms that provide “toxic” content


Social media companies should be forced to shut down the technology that makes them millions from advertising, according to an advocacy group.

X, formerly known as Twitter, and Meta, the owner of Facebook and Instagram and other social media sites, have come under criticism in recent weeks for their role in the unrest across the UK.

In addition, a death threat is said to have been made on Instagram against Irish Prime Minister Simon Harris and his family.


Meanwhile, more than 240 local activist organisations have signed a letter calling on the government to “turn off harmful engagement-based recommendation systems by default.”

Recommendation systems, often referred to simply as algorithms, are built by social media companies to show users more of the content they engage with, such as on their Facebook home feed or TikTok's For You page.

Many activists argue that this type of content sharing has helped fuel the unrest in Britain in recent weeks.


Popular or controversial content is more likely to be shared: if the recommendation system judges a post to be popular, it automatically pushes it to more people, without assessing whether that post is, for example, a call to join the unrest.
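The ranking behaviour described above can be illustrated with a minimal sketch. The class, function names, and scoring weights here are hypothetical, chosen only to show the core idea: posts are ordered purely by engagement signals, with no judgement about what the content actually says.

```python
# Minimal sketch of engagement-based ranking (illustrative only;
# real platform systems use far more signals and machine learning).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    # Assumed weighting: shares and comments count more than likes,
    # since they spread a post to new audiences.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement posts surface first -- whether the engagement
    # reflects approval or outrage makes no difference to the ranking.
    return sorted(posts, key=engagement_score, reverse=True)
```

A controversial post that attracts many shares will outrank a quietly liked one, which is precisely the dynamic the activist groups want switched off by default.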

The activist group Hope And Courage has now called on the government to force social media companies to automatically switch off this recommendation system.

A spokesperson said: “To stop the spread of hate, as an immediate emergency measure, platforms must turn off toxic engagement-based recommendation systems by default and respond quickly to harmful content on their services. People should decide what they want to see, not the algorithms of big tech companies.”

A spokesman for Coimisiún na Meán, Ireland's media regulator, said: “As part of our online safety framework, Coimisiún na Meán will address the potential dangers of recommender systems through the implementation of the Digital Services Act.”

“Article 28 of the Digital Services Act requires online platforms to take measures to ensure a high level of privacy and safety for minors. This includes ensuring that their recommendation systems do not harm children.”

He noted that companies such as X and Facebook are required to offer a version of their recommendation systems that is not based on profiling, but said this provision does not oblige them to make it the default.