ICT4Peace is delighted to release Episode 7 of Brown Bag, the ICT4Peace podcast series on social media, politics, democracy and society from a Global South perspective, hosted and produced by ICT4Peace Foundation’s Special Advisor, Dr Sanjana Hattotuwa.
In Episode 7, Hattotuwa speaks with Elina Noor, presently Director, Political-Security Affairs and Deputy Director, Washington, D.C., who is also a Senior Advisor to the ICT4Peace Foundation and a member of its International Advisory Board.
Listen to it on SoundCloud, or through the embedded player below.
Brown Bag is also on Spotify, Google Podcasts, Amazon Music and Apple Podcasts.
The production of this podcast series is supported by the Daniel Gablinger Foundation.
Elina starts by reflecting on how much online content has changed the study, generation, and propagation of violent extremism. This feeds into a critique of the capacity and capabilities of leading social media companies to oversee the propagation of harms on their platforms.
Noor then addresses the structural impediments to a grounded understanding of the Global Majority, and highlights the issue of language: knowledge and lived experience not expressed in English are unlikely to be recognised by Global North scholarship, which in turn shapes how social media companies inform themselves about context.
The conversation then focusses on Elon Musk’s acquisition of Twitter and, more broadly, the responsibilities of social media companies beyond a profit imperative, cognisant that it is a model which has led to violent offline harms in addition to enduring online violence. Noor talks about the problems around how countries with a democratic deficit have adopted instruments like Europe’s GDPR, and what this means for users in authoritarian states. Noor also talks about the limitations of ‘free speech’ through a post-colonial lens.
Noor then focusses on challenges related to trust and safety, in the context of significant and growing harms on platforms which have gone beyond the ability of predominantly human moderation to manage.
The conversation turns to black box algorithms, and efforts led by the Christchurch Call to better understand how algorithms present content, and to prevent harms.