
Social media platforms will have to block harmful content

Facebook, TikTok and Twitter apps are seen on a smartphone in this illustration taken July 13, 2021. (Photo: REUTERS/Dado Ruvic)

SINGAPORE – Social media platforms will have to comply with instructions from the Infocomm Media Development Authority (IMDA) to take action against harmful content, such as content promoting sexual harm or self-harm, as part of a proposed set of rules to protect users, Minister for Communications and Information Josephine Teo said on Monday, June 20.

Meanwhile, some social media platforms will also have to provide mechanisms for users to report harmful or inappropriate content, such as content involving self-harm or intimate videos, as part of another proposed set of rules to protect users.

The IMDA is engaging tech companies on a “Content Code for Social Media Services” and a “Code of Practice for Online Safety,” Teo said on her Facebook page.

A public consultation exercise will begin next month.

According to media reports, Singapore’s Ministry of Communications and Information (MCI) gave further details on the two proposed codes of practice at a press conference on Monday.

Under the proposal, the actions that the IMDA could order social media platforms to take include disabling access to certain content for users in Singapore, or preventing specific online accounts on these platforms from interacting with, or communicating content to, users in Singapore.

MCI also said the proposals cover only social media platforms that allow the posting of online content designed to attract interaction, and will not include messaging apps.

In her post, Teo said online safety is a growing concern and acknowledged that social media platforms have put measures in place over the years to keep users safe.

“Nevertheless, more can be done given the evolving nature of harm on these platforms and the socio-cultural context of our society,” she said.

In her post, Teo cited a January survey by the Sunlight Alliance for Action (AfA), which found that one in two respondents had experienced online harm, with teenagers and young adults making up the majority of those harmed.

Calls have been made to develop “more support mechanisms and infrastructure for victims of online harm,” she said.

She also noted that many countries have enacted or are in the process of enacting laws to protect users online.

“Singapore’s preferred approach to strengthening our online regulatory approach is to do so in a consultative and collaborative manner,” she said.

“That means learning from the experiences of other countries, engaging tech companies on the latest technological developments and innovations, and understanding the needs of our people.

“These will allow us to develop requirements that are technologically feasible, can be enforced effectively and are fit for purpose,” she added.
