Singapore to impose content moderation rules on social media platforms
Singapore will soon introduce a new code of practice for online safety and a content code for social media services, requiring social media platforms to put in place community standards and content moderation processes to tackle online harms.
These will focus on three specific areas:
- child safety
- user reports
- platform accountability.
Additional safeguards will be required for users under the age of 18. Platforms will need to ensure that young users and their parents have the means to detect inappropriate content online, so as to minimize young people's exposure to such content.
Failure to comply with these codes will empower the Infocomm Media Development Authority (IMDA) to take regulatory action against non-compliant services, such as ordering them to disable access to harmful content for users in Singapore. Platforms could also be required to produce annual accountability reports, to be published on the IMDA website.
Public consultations on these codes are expected to begin in July; once finalized, the codes will be issued in accordance with Singapore's Broadcasting Act.
Online safety is a growing concern, and Singapore is not alone in seeking stronger safeguards for its people.

"Over the years, social media services have implemented measures to keep users safe on their platforms. Nonetheless, more can be done given the evolving nature of harm on these platforms and the socio-cultural context of our society," said Minister for Communications and Information Josephine Teo.