Day 4 | Session 7: Managing online spaces: Content moderation and intermediary liability
Date: Thursday, 23 September 2021 [3:30 am - 5:30 am UTC]
Panel
The private sector, and social media platforms in particular, wields great power over what content gets to stay online and what is taken down. The increasing quantity and diversity of online speech hosted by internet platforms, coupled with the harmful consequences of the spread of misinformation and hate speech, has led governments across the globe to demand more aggressive intervention in filtering the content platforms host. Experts, however, have challenged these moves as having a chilling effect on free speech, possibly amounting to censorship, while also imposing unreasonable expectations on platforms.
This session, which will take the form of a panel discussion, will examine the approaches taken by countries and the private sector in the region to content moderation and intermediary liability, covering regulatory tools used by governments, community standards established by private actors, jurisprudence laid down by courts, and concerns raised by scholars. The core objective of this session is for participants to unpack how individual rights and the public interest are dealt with by platforms under the law.
Key points of discussion:
- What is content moderation?
- What are the kinds of speech subject to content moderation?
- What are the laws that regulate content moderation and provide for intermediary liability?
- What are the key distinctions among the laws of Southeast Asian countries?
- What laws and policies exist on content moderation for platforms?
- What are the prominent judicial pronouncements on content moderation and intermediary liability?
Reference Materials:
- Table 3 - Laws and regulations governing the ICT ecosystem in Southeast Asian countries
- Table 4 - Resources and databases on ICT and jurisprudence
- Table 5 - International human rights law landscape
Suggested readings:
- Association for Progressive Communications, APC policy explainer: Platform Responsibility and Accountability (November 2020)
- Global Network Initiative, Addressing Digital Harms and Protecting Human Rights — GNI Shares Recommendations for Policymakers
- APC, Reorienting rules for rights: A summary of the report on online content regulation by the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
- Manila Principles on Intermediary Liability
- Santa Clara Principles on Transparency and Accountability in Content Moderation
- UN Guiding Principles on Business and Human Rights
- Asia Pacific Survey on Fake News and Intermediary Liability
- Asia Internet Coalition (AIC), Toolkit on Addressing Online Misinformation Through Legislation – POFMA
Additional readings:
- Comparative Analysis of National Approaches of the Liability of the Internet Intermediaries: Malaysia
- Roberts, Sarah T. 2017. “Content Moderation.” In Encyclopedia of Big Data, edited by L. A. Schintler and C. L. McNeely.
- Matias, J. Nathan. 2019. “The Civic Labor of Volunteer Moderators Online.” Social Media + Society 5(2).
- Jeong, Sarah. 2016. “The History of Twitter’s Rules.” Motherboard / Vice, January 14.
- Newton, Casey. 2019. “Bodies in Seats.” The Verge, June 19.
- The Ringer, The BTS Army and the Transformative Power of Fandom as Activism (June 2020)