Social Media Content Regulation Requires G7 Consensus, Think Tank Says
The Information Technology and Innovation Foundation’s (ITIF) report, “How to Address Political Speech on Social Media in the United States,” outlines a three-part plan for overcoming the politicization and polarization that have stymied the debate over how social media platforms should moderate misleading and harmful content. The report said the most important part of this plan is establishing an international forum “to develop a set of voluntary, consensus-based guidelines for social media companies to follow when moderating online political speech.”
Rather than creating “one-size-fits-all recommendations about specific types of content that should be allowed or not allowed on all platforms,” the report said, this multistakeholder forum of G7 nations, which it referred to as the “International Forum on Content Moderation,” would focus on developing “content moderation processes social media platforms can use to address controversial content moderation questions and improve the legitimacy of their content moderation practices.”
“These content moderation processes should respect transparency, accountability and due process and balance goals such as free speech with reducing consumer harm,” the report added.