
Transparency on algorithms must be a conscious step for social media platforms: Koo co-founder


Taking a transparent and consistent approach to algorithms is a conscious choice that every social media platform must make, and is “the right way forward” for building trust with users, Aprameya Radhakrishna, co-founder and CEO of the microblogging platform Koo, said.

While self-regulation should be the first step, separate rules are needed in cases where platforms do not meet user expectations or violate standards, Radhakrishna said.

“It’s our choice to remain transparent and consistent, and that’s the added value we bring in the world of social media,” Radhakrishna told PTI when asked about the raging debate over algorithmic biases on some social media platforms.

Claims that Facebook’s system and algorithms are fueling hate speech and fake news have raised many concerns about the influence of algorithms and tools in amplifying harmful content and disinformation.

In the wake of whistleblower Frances Haugen’s recent revelations, Facebook has come under fire for allegedly putting profit over public good and not doing enough to shed its “culture of growth at all costs” which propelled its rise to capture 2.91 billion monthly active users globally, including over 400 million in India.

Minister of State for Electronics and IT Rajeev Chandrasekhar has categorically stated that no “algorithm” of any social media platform should violate the basic rights of Indians, and that laws and jurisprudence should constantly evolve to keep pace with the changing nature of the Internet.

In addition, the IT ministry is planning a major awareness campaign next year in the form of a dialogue with the public, consumer forums, universities, industry and others on the fast-changing online space and what more needs to be done to make sure the internet is open, secure and reliable.

Koo’s Radhakrishna said making algorithms transparent is a conscious choice for companies to make, and one possible for all platforms, whether in microblogging or other forms of social media.

“If you want to be opaque about this, then no one will understand why I see a particular type of content, and therefore charges will be made … but as long as it’s transparent and consistent, this is the way to go, and greater trust is established with the user as well as with the individual governments of each country,” Radhakrishna explained.

India, the world’s second-largest telecommunications market and the largest consumer of data, is a key market for internet companies such as Facebook, WhatsApp and Twitter, given its large population, increasing adoption of the internet and smartphones, as well as explosive growth.

Amid growing cases of user harm and unsafe behavior on digital platforms, India implemented new rules on IT intermediaries earlier this year, with the aim of increasing the accountability of large tech companies, including Twitter and Facebook.

The new rules require social media platforms to remove any content flagged by authorities within 36 hours and to put in place a robust complaint redressal mechanism with a grievance officer based in the country. Social media companies are required to remove posts depicting nudity or morphed photos within 24 hours of receiving a complaint.

Large social media companies – those with more than 50 lakh users – are also required to publish a monthly compliance report disclosing details of complaints received and action taken, as well as details of content removed proactively.

Last month, the government published a set of Frequently Asked Questions (FAQs) on the intermediary guidelines aimed at answering questions internet and social media users may have about the scope of the new rules, the major changes they introduce compared to past provisions, how the rules improve the safety of women and children, and the due diligence to be followed by an intermediary, among others.

The long-awaited Standard Operating Procedure (SOP) around the IT intermediary rules, which will contain details of the appropriate agencies that will have the authority to issue takedown notices to platforms, is in the works.

In May of this year, Koo was one of the first social media platforms to report that it had met the compliance requirements of the new guidelines for digital platforms.

Koo has seen massive growth in its user base in recent months, after Union ministers and Indian government departments endorsed the local microblogging platform; its user base has crossed 15 million in the 20 months since its launch in March 2020.

Koo’s popularity in India has soared amid the Indian government’s standoff with Twitter and growing calls to expand the ecosystem of local digital platforms.

Founded by Radhakrishna and Mayank Bidawatka, Koo was launched last year to empower users to express themselves and engage on the platform in Indian languages. It supports multiple languages, including Hindi, Telugu and Bengali.

The multilingual microblogging platform was named one of the next five most popular products in the Asia Pacific (APAC) region by The Product Report 2021 created by Amplitude, a product analysis and digital optimization company based in California.

“Our algorithm is transparent, we are a timeline. If you follow 10 people and those 10 people post something, it will all be shown to you in a particular order on the timeline, from newest to oldest. It’s that simple, so there is no bias,” Radhakrishna said.
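The reverse-chronological feed Radhakrishna describes can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not Koo’s actual code; the `Post` type and `build_timeline` helper are hypothetical names introduced for the example:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def build_timeline(posts, followed):
    """Keep only posts from followed accounts, newest first.

    No ranking or engagement signals are used - ordering depends
    solely on the timestamp, which is what makes the feed
    transparent and predictable for the user.
    """
    feed = [p for p in posts if p.author in followed]
    return sorted(feed, key=lambda p: p.created_at, reverse=True)
```

For example, if you follow accounts "a" and "b", a post from "b" made on 2 January appears above a post from "a" made on 1 January, regardless of likes or other signals.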

Going forward, Koo wants to continue offering transparent choices to its users.

“… Seeing your content feed based on the timeline is one option; viewing your feed based on who you have the most affinity with is another; or there could be a third way in which you can view your content.

“So I think the more social media companies give this choice to the user, and are transparent about what the algorithm is for the users … the better, as it leaves it open to the user,” Radhakrishna said.

(This story was not edited by Devdiscourse staff and is auto-generated from a syndicated feed.)
