More State Content Moderation Laws Coming to Social Media Platforms | Perkins Coie

California and New York recently passed laws aimed at changing the way “social media platforms” and “social media networks” (both loosely defined terms) design and report their content moderation practices. In doing so, they become the latest states, after Texas and Florida, to venture into platform regulation.

New California laws require social media platforms to publish detailed terms of service and disclose their content moderation practices in transparency reports (AB 587), create content policies addressing the illegal distribution of controlled substances (AB 1628), and indicate whether they have mechanisms for reporting violent threats (SB 1056). New York law requires social media networks to provide hateful conduct policies and reporting mechanisms (S.4511-A/A.7865, the “New York law”).

Constitutional challenges are expected in both states. In the meantime, service providers that may be subject to these new state laws will find below an overview of their requirements and some practical steps to consider.

Affected Providers: California

California law defines a social media platform as an Internet service or application that meets the following two criteria:

  • The service or application offers social interaction features. These are online services or applications that allow users to (1) create a public profile, (2) create a list of shared connections they have with other users, and (3) create or post content that others can view (such as in chat rooms, on message boards, or on home or landing pages that display content created by other users).
  • A “substantial function” of the service or application is to connect users and allow them to interact socially. The laws do not define “substantial function,” although the definition excludes companies that only allow an organization’s employees and affiliates to communicate internally, as well as platforms that provide only email or direct messaging services.

While the largest and most prominent platforms may draw the most scrutiny, these broad definitions may subject others, including gaming companies and online marketplaces with significant social components, to investigation or enforcement. However, small platforms that do not exceed certain thresholds are excluded from the laws. AB 587 and AB 1628 exclude platforms that generated less than $100 million in gross revenue in the previous calendar year, and SB 1056 excludes platforms with fewer than 1 million discrete monthly users.

Under all three California statutes, “content” is defined as statements, comments, or media that users create, post, share, or interact with. The laws exclude information uploaded to a platform solely for the purpose of cloud storage, file transmission, or file collaboration.

AB 587: Content Moderation Transparency

AB 587 imposes extensive content moderation transparency requirements. Social media platforms must (1) publicly post their terms of service and (2) submit semiannual transparency reports to the California Attorney General, which will be made public on the Attorney General’s website.

Notable Requirements

  • Terms of service. Platforms must create and publicly post terms of service that describe what violates the terms and how users can report violating content. Platforms must also disclose their commitments to respond within set timeframes and provide contact information for questions.
  • Policies and enforcement data for specific content categories. Transparency reports must disclose whether the platform maintains content policies on hate speech or racism, extremism or radicalization, disinformation or misinformation, harassment, and foreign political interference. Reports must also include data on how content was treated under these policies.
  • Detailed descriptions of content moderation practices. Platforms must disclose their content moderation practices, including enforcement methods, how the platform handles user reports, and, if applicable, the languages in which the platform offers product features but not terms of service.

Timing and Enforcement

Platforms must submit their first transparency reports to the California Attorney General by January 1, 2024, covering content from the third quarter of 2023. The second report is due by April 1, 2024, and must contain data from the fourth quarter of 2023. After April 1, 2024, platforms must submit semiannual reports by October 1 and April 1 of each year. These semiannual reports must contain, respectively, information for the first two quarters of the current fiscal year and the last two quarters of the previous fiscal year.

Any platform that fails to post its terms of service, fails to submit reports to the Attorney General, or materially omits or misrepresents information in a report violates the law. “Significant [civil] penalties sufficient to induce compliance with this law” may be imposed in an action brought by the Attorney General or certain city attorneys, although penalties may not exceed $15,000 per violation per day. Courts will consider whether a platform made a reasonable, good-faith attempt to comply with the law when determining penalties.

AB 1628: Illegal Distribution of Controlled Substances

AB 1628 requires platforms to create and publicly post a policy statement that addresses the use of the platform for the illegal distribution of controlled substances.

Notable Requirements

  • Illegal substances content policy. In addition to setting out the platform’s policy on the illegal distribution of controlled substances, the statement must describe the platform’s existing practices for preventing users from posting or sharing such content. Platforms may exclude any information they believe could compromise their ability to identify prohibited user content or activity or otherwise put users at risk.
  • Reporting mechanisms. Platforms must share a link to their mechanism for reporting illegal or harmful content or activity, if one exists.
  • Educational links. Platforms must post a link to government public health resources for mental health and addiction education.
  • Government requests. Platforms must share their policies for responding to law enforcement requests such as subpoenas or warrants.

Timing and Enforcement

This law does not provide a specific enforcement mechanism. It is set to remain in effect from January 1, 2023, until January 1, 2028.

SB 1056: Violent Posts

SB 1056 requires platforms to “clearly and prominently” indicate whether they have mechanisms for reporting “violent posts” targeting both users and nonusers. It also creates a private right of action for individuals targeted by such posts.

Notable Requirements

  • Mechanism for reporting and removing “violent posts.” A violent post is content that contains a true threat against a specific person and is not protected by the First Amendment. An unprotected “true threat” is one made with the intent to place the recipient in fear of actual harm. If a platform has a mechanism for reporting violent posts, it must say so clearly and prominently and provide a link to it. Platforms with such mechanisms must remove reported content within 48 hours or face the legal action described below.
  • Private right of action. A person who is, or reasonably believes themselves to be, the target of a violent post on a social media platform may bring an action in a California court to seek removal of the violent content and any related violent content. Individuals can seek these orders at any time if a platform does not have a reporting mechanism. Platforms with a reporting mechanism have 48 hours to remove the post before a court will rule on the takedown request. Courts may also award reasonable court costs and attorneys’ fees to prevailing plaintiffs.

Timing and Enforcement

SB 1056 takes effect on January 1, 2023. As noted above, the law is enforced through private actions asking courts to order the removal of violent posts.

Affected Providers: New York

The New York law, passed in June 2022, requires social media networks that operate in New York to provide and maintain a mechanism for users to report hateful conduct. The law defines social media networks as for-profit service providers that operate internet platforms designed to allow users to share content with other users or to make it available to the public. Unlike the California laws, the New York law does not limit its application to networks that meet certain revenue or monthly user thresholds.

Notable Requirements

  • Mechanism for reporting hateful conduct. The New York law requires social media networks to provide a mechanism for reporting hateful conduct that is clear and accessible to users through the network’s website and app. Users must receive a direct response from the social media network to each report submitted. The law broadly defines hateful conduct as using a social media network to vilify, humiliate, or incite violence against protected classes.
  • Hateful conduct policy. Social media networks must have a clear and concise policy explaining how the network will respond to and address reports of hateful conduct.

Timing and Enforcement

This law takes effect on December 3, 2022, and will be enforced by the New York Attorney General. Social media networks that knowingly fail to comply with the law may be subject to civil penalties of up to $1,000 per day.

Next steps for online service providers

For online service providers that allow users to share content, here are some steps to consider:

  • Determine if the service is covered by one or more of these new social media laws. For California, this should include consideration of whether the social interaction features of the service constitute a substantial function of the service.
  • If the service is in scope, assess company-specific obligations and risks and develop a compliance plan.

A compliance plan for these laws can benefit from taking into account the overlapping obligations arising under a number of global online safety laws. Companies that may be affected should consult experienced counsel to understand their obligations and how they may intersect with related global obligations.
