
Social media platforms to remove harmful content and add safeguards under S’pore’s proposed rules

SINGAPORE – Social media platforms such as Facebook, TikTok and Twitter will soon be legally required to implement safety standards and content moderation processes to minimize the risk of users being exposed to harmful online content such as terrorist propaganda, under Singapore’s proposed new set of internet rules.

They will also need to provide additional safeguards for users under 18, including tools to help them or their parents minimize their exposure to inappropriate content such as sexual or violent videos, and unwanted interactions such as bullying and online harassment.

Communications and Information Minister Josephine Teo announced some details of the proposed new rules in a Facebook post on Monday, June 20.

“There is a growing global movement pushing for better online safety, recognizing that harm can come along with the good when people engage on social media,” she said.

“Many countries have enacted or are in the process of enacting laws to protect users from harm online.”

Ms Teo said Singapore’s preferred approach to strengthening its online regulations is to do so in a consultative and collaborative manner.

“That means learning from the experiences of other countries, engaging tech companies on the latest technological developments and innovations, and understanding the needs of our people.

“These will allow us to develop technologically feasible requirements that can be applied effectively and tailored to our purpose.”

At a press conference on Monday, the Ministry of Communications and Information (MCI) said it had been consulting with the tech industry since the beginning of the month and public consultations would begin next month.

The new Code of Practice for Online Safety and the Content Code for Social Media Services aim to codify these standards into law and give the authorities the power to take action against platforms that fail to meet the requirements.

The codes are expected to be introduced under the Broadcasting Act following the consultations.

If the codes are adopted, the Infocomm Media Development Authority (IMDA) will be empowered to direct social media services to disable access to harmful online content for users in Singapore.

Examples of content that could be blocked under the new codes include live videos of mass shootings and viral social media “challenges” that encourage young people to perform dangerous stunts such as holding their breath until they pass out.

The codes will also take into account Singapore’s unique context and sensitive issues such as race and religion.

For example, this could cover incidents similar to a previous case where a man was accused of stoking racial tensions after posting racist tweets using the persona of a Chinese woman under the pseudonym Sharon Liew.

Other examples cited by MCI included a 2020 post by someone using a profile called “NUS Atheist Society” that described the Bible and the Quran in an offensive way, and a 2021 poll asking people to rank local female Muslim religious teachers based on their sexual attractiveness.

The platforms will also be required to produce annual accountability reports which will be published on the IMDA website.

