{"version":"1.0","type":"rich","provider_name":"Acast","provider_url":"https://acast.com","height":250,"width":700,"html":"<iframe src=\"https://embed.acast.com/$/60518a52f69aa815d2dba41c/632b985265cff10013056cfb?\" frameBorder=\"0\" width=\"700\" height=\"250\"></iframe>","title":"Dan Byman on Content Moderation Tools to Stop Extremism","thumbnail_width":200,"thumbnail_height":200,"thumbnail_url":"https://open-images.acast.com/shows/60518a52f69aa815d2dba41c/show-cover.png?height=200","description":"<p>There is enormous debate about how much social media platforms should be doing to moderate extremist content. But that debate often lacks nuance about the many different ways that platforms can moderate, and about the fact that moderation is not an all-or-nothing proposition.</p><p>Daniel Byman is a professor at Georgetown University's School of Foreign Service and <em>Lawfare</em>’s foreign policy editor. He recently published a paper for <em>Lawfare</em>’s ongoing Digital Social Contract Research Paper series in which he lays out the many different ways that platforms can and do moderate content. <em>Lawfare</em> senior editor Alan Rozenshtein spoke with Dan about his research and how it can inform not just more but better moderation.</p>","author_name":"The Lawfare Institute"}