Transparency on algorithms must be a conscious step for social media platforms: Koo co-founder

Taking a transparent and consistent approach to algorithms is a conscious choice that every social media platform must make, and is “the right way forward” to building trust with users, said Aprameya Radhakrishna, co-founder and CEO of the microblogging platform Koo.

While self-regulation should be the first step, separate instructions are needed in cases where platforms do not meet user expectations or violate standards, Radhakrishna said.

“It’s our choice to remain transparent and consistent, and that’s the added value we bring in the world of social media,” Radhakrishna told PTI when asked about the raging debate over algorithmic bias on some platforms.

Claims that Facebook’s system and algorithms are fueling hate speech and fake news have raised many concerns about the influence of algorithms and tools in amplifying harmful content and disinformation.

Following recent disclosures by whistleblower Frances Haugen, Facebook has drawn criticism for allegedly putting profit over public good and not doing enough to shed the ‘growth at any cost’ culture that propelled its rise to 2.91 billion monthly active users globally, including over 400 million in India.

Minister of State for Electronics and Information Technology Rajeev Chandrasekhar has categorically stated that no algorithm of any social media platform should violate the basic rights of Indians, and that laws and jurisprudence must constantly evolve to keep pace with the changing nature of the internet.

In addition, the IT Ministry is planning a major awareness campaign next year in the form of a dialogue with the public, consumer forums, universities, industry and others on the rapidly evolving online space and what more needs to be done to ensure that the internet is open, safe and trusted.

Koo’s Radhakrishna said making algorithms transparent is a conscious choice for companies to make, and one that is feasible for all platforms, whether in microblogging or other forms of social media.

“If you want to be opaque about this, then nobody will understand why they see a particular type of content, and therefore charges will be laid… but as long as it is transparent and consistent, this is the right way to go, and greater trust is established with the user as well as with the individual governments of each country,” Radhakrishna explained.

India, the world’s second-largest telecommunications market and largest consumer of data, is a key market for internet companies such as Facebook, WhatsApp and Twitter, given its large population, rising internet and smartphone adoption, and explosive growth.

Amid growing cases of user harm and unsafe behavior on digital platforms, India enforced new rules on IT intermediaries earlier this year, with the aim of increasing the accountability of large tech companies, including Twitter and Facebook.

The new rules require social media platforms to remove any content flagged by authorities within 36 hours and to put in place a robust complaint redressal mechanism with an officer based in the country. Social media companies are required to remove posts depicting nudity or morphed images within 24 hours of receiving a complaint.

Large social media companies – those with more than 50 lakh users – are also required to publish a monthly compliance report disclosing details of complaints received and actions taken, as well as details of content proactively removed.

Last month, the government published a set of Frequently Asked Questions (FAQs) on the intermediary guidelines, aimed at answering questions internet and social media users may have about the scope of the new rules, the major changes they make compared to previous provisions, how the rules improve the safety of women and children, and the due diligence to be carried out by intermediaries, among other matters.

The long-awaited Standard Operating Procedure (SOP) around the IT rules and intermediary guidelines, which will detail the appropriate agencies with authority to issue takedown notices to platforms, is in the works.

In May of this year, Koo was one of the first social media platforms to report that it had met the compliance requirements of the new guidelines for digital platforms.

Koo has seen massive growth in its user base in recent months, after Union ministers and Indian government departments endorsed the homegrown microblogging platform; its user base has grown to over 15 million in the 20 months since its launch in March 2020.

Koo’s popularity in India has surged amid the Indian government’s tussle with Twitter and growing calls to expand the ecosystem of homegrown digital platforms.

Founded by Radhakrishna and Mayank Bidawatka, Koo was launched last year to empower users to express themselves and engage on the platform in Indian languages. It supports multiple languages, including Hindi, Telugu, and Bengali, among others.

The multilingual microblogging platform was named one of the next five most popular products in the Asia-Pacific (APAC) region by The Product Report 2021 created by Amplitude, a product analysis and digital optimization company based in California.

“Our algorithm is transparent; we are a timeline. If you follow 10 people and those 10 people post something, their posts will all be shown to you from newest to oldest. It’s as easy as that, so there is no bias,” Radhakrishna said.
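What Radhakrishna describes is a plain reverse-chronological feed: posts from followed accounts, sorted by timestamp, newest first, with no ranking model involved. A minimal sketch of that idea in Python (the `Post` structure and function name are illustrative assumptions, not Koo's actual code):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts):
    """Newest-first ordering: no ranking model, no personalization."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

# Posts from three followed accounts, arriving out of order:
posts = [
    Post("asha", "first koo", datetime(2021, 11, 1, 9, 0)),
    Post("ravi", "latest koo", datetime(2021, 11, 1, 10, 30)),
    Post("meena", "middle koo", datetime(2021, 11, 1, 10, 0)),
]
print([p.author for p in chronological_feed(posts)])  # ['ravi', 'meena', 'asha']
```

Because the ordering rule is a single, fixed criterion, any user can verify why a given post appears where it does, which is the transparency claim being made.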

Going forward, Koo wants to continue offering transparent choices to its users.

“… seeing your content feed based on timeline is one option. Seeing your feed based on who you have the most affinity for is another, and there could be a third way of seeing your content.

“So I think the more social media companies give the user options, and are transparent about what the algorithm does, in a simple way… that is open to the user,” Radhakrishna said.
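The options described above amount to letting the user explicitly pick the feed-ordering strategy rather than having one imposed opaquely. A hedged illustration of that design in Python (the strategy names and the interaction-count affinity score are assumptions made for this example; Koo has not published its implementation):

```python
from datetime import datetime

def by_time(posts, user):
    # Reverse-chronological: the transparent default.
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

def by_affinity(posts, user):
    # Authors the user interacts with most float to the top.
    return sorted(posts,
                  key=lambda p: user["interactions"].get(p["author"], 0),
                  reverse=True)

# Each ordering is a named, inspectable rule the user chooses between.
FEED_OPTIONS = {"latest": by_time, "affinity": by_affinity}

def build_feed(posts, user, choice="latest"):
    return FEED_OPTIONS[choice](posts, user)

user = {"interactions": {"ravi": 5, "asha": 1}}
posts = [
    {"author": "asha", "created_at": datetime(2021, 11, 1, 9, 0)},
    {"author": "ravi", "created_at": datetime(2021, 11, 1, 8, 0)},
]
print([p["author"] for p in build_feed(posts, user, "affinity")])  # ['ravi', 'asha']
print([p["author"] for p in build_feed(posts, user, "latest")])    # ['asha', 'ravi']
```

The design choice here is that the selection itself is the transparency mechanism: the platform never guesses which ordering the user wants, so there is nothing hidden to explain.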
