A coalition of 10 States, led by Texas, has filed an amicus brief in the 11th Circuit Court of Appeals supporting Florida's law requiring Big Tech to apply its content-moderation practices consistently and to disclose them to affected users.
The Texas law in particular, on which Florida's law was modeled, specifies that
…social media sites in question must…disclose their content management and moderation policies and create a complaint and appeals process. The new law also prohibits email service providers from impeding the transmission of email messages based on content.
So far, so good for the two laws, but not far enough for either.
These platforms' moderation teams also must be required to advise the poster/communicator, in advance of any adverse action, that such action is being contemplated. That advance notice must identify which platform criterion or criteria the moderation team believes is being violated and explain, in concrete, measurable terms, how the team believes the violation is occurring.
For instance, in the case of "might offend some," the notice must specify the group or groups the team believes might be offended and how that offense might occur; if, say, the potential offense is along the lines of non-inclusiveness, the team must specify precisely how the non-inclusion is believed to be occurring.
The team also must suggest alternative phrasings (yes, plural) and, for each alternative, explain how its suggestion conveys the same message as the original.
This advance notice also must provide the name and business contact information of the moderation team lead and of the platform Director or Senior Vice President overseeing the platform's moderation function.
The appeal itself must go to an independent arbitration board agreeable to both the poster/communicator and the platform, with the arbitration conducted at the platform's sole expense.