Panel Subject: Platform Content Moderation and Liability
This page is for the panel team to coordinate and openly discuss the topics covered under the subject "Platform Content Moderation and Liability." It includes the relevant survey results, the Panel Guidelines, and a comments section for team discussion.
The following topics were surveyed under this subject; the table below shows how each performed:
| # | Question | Important (3) | Neutral (2) | Not Important (1) | Weighted Average | Responses |
|---|----------|---------------|-------------|-------------------|------------------|-----------|
| 7.1 | To preserve free expression and an open Internet, should federal law protect a platform against intermediary liability for conduct or content posted by a user? | 54 | 15 | 6 | 2.64 | 75 |
| 7.2 | Should the government increase regulation of online platforms, or allow platforms to determine their own concept of corporate responsibility? | 48 | 25 | 4 | 2.57 | 77 |
| 7.3 | How do we separate regulation of data/information from regulation of content and speech? | 44 | 22 | 9 | 2.47 | 75 |
| 7.5 | Is the US losing its initial advantage in Internet expertise and innovation? | 42 | 20 | 11 | 2.42 | 73 |
| 7.4 | How can Internet platforms protect and promote positive counter-speech? | 31 | 33 | 11 | 2.27 | 75 |
Panel teams should use this process to discuss the panel and come as close as possible to consensus on the following items by April 11.
- Identify a concrete subject for the panel based on discussion and rough consensus. The subject and process must take into account the IGF-USA Principles and the survey topics on which the subject is based.
- Panel teams should also consult the topics suggested under "Other" during the survey, especially comments that relate directly to a topic that performed well on the survey.
- Assign team leader(s) and a representative to interface with the steering committee and provide ongoing, up-to-date information to the secretariat.
- Team Lead(s) - April 25
- Draft Panel Description - May 2
Online platforms delete comments, terminate accounts, and impose transparency requirements, weighing issues of legality, politics, safety, and obscenity when taking these steps. While these actions may appear arbitrary, they are often grounded in complex systems of content moderation. This panel illuminates the challenges of creating and dispassionately enforcing these content moderation systems through an expert discussion, after which the audience will be asked to make their own mock content moderation decisions.