California lawmakers approve bills targeting social media content
SACRAMENTO, Calif. (AP) — California lawmakers on Tuesday sent Governor Gavin Newsom two groundbreaking bills intended to limit the downside of social media, as they faulted Congress for failing to act on the issue.
A first-of-its-kind measure would require social media companies to make public their policies for removing disturbing content and provide details on how and when they remove it.
The second bill would require companies that provide child-friendly online services to follow age-appropriate design code principles aimed at keeping children safe. This includes not profiling a child or using the child’s personal information in a way that could harm the child’s physical or mental health or well-being.
“The online world has created enormous opportunities, but also real and immediate threats to children, vulnerable communities and American democracy as we know it,” said Democratic Assemblymember Jesse Gabriel, author of the first bill.
“We believe California has a special obligation and a special opportunity to lead on these issues,” Gabriel added at a press conference Tuesday. “We are proud of our tech economy and know that many of the companies these bills would regulate are local California businesses. But with the dysfunction in Washington, D.C., we think California needs to step up and lead.”
His measure would require companies to disclose how they regulate their own content under their social media terms of service. It stalled last year over free speech concerns before clearing the Senate on a 33-3 vote and the Assembly, 48-0. The bill states that it is “the intention of the Legislature” that the state’s attorney general or a city attorney bring civil actions against violators.
While the measure enjoyed bipartisan support, Republican Senator Melissa Melendez opposed it, fearing it could be used to punish legitimate but unpopular content, especially since Attorney General Rob Bonta is a progressive Democrat.
“I can’t help but wonder if this is in fact an attempt by the attorney general to harass the citizens of California, especially those with an opposing view, and I don’t think it is appropriate for the state attorney general to get involved in any attempt to censor speech,” she said during the debate Monday night.
But Democratic Senator Thomas Umberg, who championed the bill, said the measure “doesn’t fundamentally censor content … If they don’t have a policy, they have nothing to report. If they have a policy, they must state how they implement that policy.”
Democratic Sen. Scott Wiener said the bill requested by the Anti-Defamation League is particularly important to the Legislature’s Jewish caucus, “given the rampant anti-Semitism on social media.”
Opponents include the California Chamber of Commerce, Computer and Communications Industry Association, Consumer Technology Association, Internet Coalition, Netchoice and TechNet.
A coalition of opponents said companies already make their content moderation policies public, and that the bill goes overboard by requiring them to disclose to the attorney general “sensitive information about how we implement policies, detect activity, train employees and use technology to detect content requiring moderation.”
The second bill, intended specifically to protect children from inappropriate online content, also drew bipartisan support and cleared the Senate 33-0, although seven Republicans did not vote. It cleared the Assembly, 60-0.
The measure “will represent a major positive step towards creating a global standard for the protection of young people online. It’s an aspiration I think we can all agree on,” said Democratic Senator Josh Newman, who carried the bill through the Senate.
It is inspired by a similar measure in the United Kingdom. The Chamber of Commerce and some tech industry associations also oppose it. A coalition including the Entertainment Software Association said the bill includes “a standard that is too inclusive and would capture many more websites and platforms than necessary”.
A third measure making its way through the Legislature would require major social media platforms to publicly disclose statistics, starting in mid-2023, on content that violates their policies but has been recommended or otherwise amplified by the platforms’ algorithms.
Another, more controversial measure died in the Senate Appropriations Committee earlier this month after strong opposition from the influential tech industry. It would have subjected some popular social media platforms, like Instagram and TikTok, to fines for using features they know can harm children.