
Parents and young people say social media platforms can be more responsive to Singapore’s online safety needs

SINGAPORE: Parents and young people said on Thursday (July 21) that social media platforms could be more responsive to Singapore’s needs, amid growing concerns about online safety.

During a public consultation on online safety on Thursday evening, several participants pointed out that certain services offered by social media platforms could be better designed to help more users report harmful content or provide feedback.

One participant shared that when she tried to report harmful content on a social media platform, she encountered difficulties because it did not fit into any of the categories provided in the reporting tool.

Speaking to reporters on the sidelines of the group discussion with parents and young people on Thursday, Communications and Information Minister Josephine Teo also noted that awareness of user safety tools was uneven among attendees.

“This highlights the need for greater awareness of the safety tools made available to users,” she said.

“Even among those familiar with these tools, there is a strong desire for social media services to be more responsive to their individual reporting needs, as well as to community concerns,” she added.

This was the third panel discussion held as part of a month-long public consultation launched on July 13. About 60 people attended Thursday’s session.

The consultation seeks feedback on two sets of proposals announced by the Ministry of Communications and Information (MCI) in June.

The first proposal is that designated social media services with significant reach or impact put in place system-wide processes to improve online safety for all users, with additional safeguards for users under the age of 18.

This includes implementing community standards and content moderation mechanisms to mitigate users’ exposure to certain harmful content, and proactively detecting and removing child sexual exploitation and abuse material as well as terrorist content.

The second is that the Infocomm Media Development Authority (IMDA) be empowered to direct any social media service to remove specified types of “egregious content”. These include content relating to sexual harm, self-harm and public health, as well as public security and racial or religious disharmony or intolerance.

