
Social media platforms must abandon algorithmic secrecy


Algorithmic advancements have spurred quantum leaps in a myriad of human endeavors – including medical devices, models for climate change, financial trading systems, GPS mapping, and even online dating.

But while these algorithmic forces have a beneficial impact on our daily lives, they are often inaccessible and mysterious to the average citizen.

By piloting our social media platforms, serving us daily news, family updates and friend suggestions, algorithms have held us enthralled for quite some time. Yet most of us do not have the training or skills to understand how these systems affect us, nor the protocols that govern them.

Frédéric Mostert and Alex Urbelis

We have to take others at their word about how these opaque systems work and what they are made of.

The testimony of technology executives summoned before the UK Parliament or US Congress to explain algorithmic data processing or account for data breaches tells us little about how their algorithms actually work.

And the algorithms animating our news feeds on social networks are often protected as trade secrets and do not appear in any publicly accessible register, such as those kept by the US or UK patent offices.

“Patents operate on the basis of sufficient disclosure of an inventor’s scientific innovation for the benefit of society,” says Tanya Aplin, our colleague at King’s College. “Trade secrets, on the other hand, preserve the confidentiality of know-how, formulas and technical developments.”

This balance between total disclosure and secrecy is at the heart of the debate around the use of algorithmic forces.

Critically, these algorithmic systems are subject to no form of peer review. An epistemological puzzle compounds the problem: we do not know who knows how these algorithms work.

Welton Chang, chief technology officer at Human Rights First, an advocacy group, says: “Within the labyrinthine structures of social media companies, it is doubtful that there is any department or team with full visibility into a platform’s secret black box of algorithms.”

Algorithmic and bot-driven content has, in large part, aided and fuelled electoral interference, fomented domestic unrest and facilitated online extremism.

Unchecked and unaccountable, algorithm-driven platforms wield tremendous power over billions of citizens around the world.

Frank Pasquale, writing about secret algorithms in his book The Black Box Society, says: “As good as absolute secrecy may be as a business strategy, it is doubtful that public policy should encourage it.”

The protection of algorithmic “secret sauces” via trade secret law has become de rigueur in recent years. Ironically, the antithesis of this approach – the open source algorithm – may not only be the saving grace of these algorithms, but a possible antidote to secrecy.

By disclosing their formulas for the benefit of society, open source algorithms allow a cross-section of professionals to examine the fundamentals at play.

Security researchers can determine whether our personal data has been put at risk during algorithmic processing. Human rights organizations can help prevent violations of our fundamental freedoms. Academics can dig into these systems for bias.

But until we have a basic understanding of how social media algorithms use our personal data, platforms will be able to resist liability, and regulatory efforts will be too imprecise to have an impact.

“Users have a right to know what inputs are fed both to the algorithms that choose their content and to those used to moderate it,” says Jillian York, author of Silicon Values.

A further step in the right direction is Apple’s App Tracking Transparency feature, even though the underlying algorithms at play in apps remain opaque and inaccessible. It returns a measure of control over personal data to users, who can prevent apps from tracking them across third-party apps and websites.

Full disclosure and transparency, as opposed to secrecy, are the foundations of liberal democracies. With platforms inextricably linked to our political and democratic processes, it is time to abandon secrecy and mystery in favor of transparency.

Users of social media platforms should be able to draw their own conclusions about the place of digital algorithms in their lives.

Open, transparent, fair and accountable algorithmic decision-making processes should form the cornerstone of the operating principles defined for and by platforms and decision-makers.

Frederick Mostert is Professor of Intellectual Property Law Practice at King’s College London and Fellow of the Digital Scholarship Institute.

Alex Urbelis is a partner at Blackstone Law Group LLP and a member of the Technology Advisory Board of Human Rights First.

Letter in response to this article:

Big Tech auditors need access to algorithms / By Rosie Sharpe, Global Witness, Copenhagen, Denmark
