Over 2017, more than in any previous year since the launch of ICT4Peace in 2004, the use of technology for violent as well as peaceful ends dominated global news headlines. We live in a complex, confusing, connected world. More than ever before, how we engage with, perceive and value society and polity is framed by what we consume over our mobile devices, over social media and online. Trust, privacy and, indeed, the very essence of democratic and electoral processes are now governed by complex algorithms, in turn increasingly fuelled by proprietary artificial intelligence architectures built to handle an exponentially increasing volume, vector and velocity of content creation that human curation or oversight simply cannot keep up with. All this and more have a profound impact on what we do, how we see each other, and how we choose to govern ourselves, irrespective of whether we are active consumers or producers of this content. Today, news headlines themselves are under unprecedented scrutiny, as misinformation, particularly over social media and the Internet, increasingly becomes indistinguishable, even for experts, from accurate, fair and factual reportage. Tomorrow, it will be even more difficult, if not downright impossible, to distinguish between what is factually accurate and what has been engineered to promote a particular viewpoint or idea.

At the ICT4Peace Foundation, these challenges form the core of what we do at an inter-governmental, international level with key policymakers and influencers, as well as at the grassroots with leading civil society institutions, activists and journalists, across five continents. We are confronted daily with two questions: why do we do what we do, and what difference are we making? Our search for meaningful answers to these two questions directly drives our engagement with stakeholders across the world, and is our motivation to deliver timely, expert, strategic input into processes that, by responding to critical challenges today, are shaping the future of the world.

On the one hand, we have the rapid, sustained maturation of technologies and communities dealing with man-made and natural crises. The UN system, for example, now integrates by default volunteers from around the world over virtual platforms, embraces technical expertise from the private sector and incorporates verified information in the public domain, produced by agentive victims as well as a wide spectrum of citizens, into the assessment of ground conditions and policymaking around suitable responses. This level of coordination and collaboration did not exist a few years ago, and is testimony to how much the working culture has changed, undergirded by the rapid progress and uptake of new technologies. Sharing information is the new norm, not hoarding it. Fostering innovation is now a common pursuit and goal, instead of harbouring proprietary solutions to shared challenges. Human resources around analysis, visualisation, machine learning, artificial intelligence and strategic visioning are now valued over more traditional jobs and job descriptions.

On the other hand, significant challenges persist. The rapid evolution of technology brings with it attendant problems around sustainability. A rights-based perspective on the development and deployment of technology, especially post-disaster, is still embryonic. Indeed, the incorporation of rights and ethics into artificial intelligence and machine learning remains peripheral, leading to increased risk for service providers as well as intended beneficiaries. Outmoded and outdated human resource management frameworks stymie the entry of talented individuals into key posts where they could help governments, the UN, civil society and other stakeholders deal with complex challenges and wicked problems. The Foundation sees an emphasis on, interest in and development of advanced technologies that, however well intentioned, are too often out of step with established best practices for working in complex political emergencies, violent conflict and long-standing humanitarian crises such as those facing refugees or internally displaced persons, leading to harmful, unintended consequences.

How do we realise the potential of frontier technologies like artificial intelligence while ensuring that those most at risk of discrimination, displacement and persecution today do not face even greater threats? This is no simple task, and there is no panacea. At the UN, regional and inter-governmental level, the Foundation over 2017 supported and hosted critical discussions around the need for norms of responsible behaviour and trust- and confidence-building measures in the cybersecurity discourse. These interactions have forged meaningful new alliances with governments, UN actors, leading and start-up technology companies, academic institutions and regional groupings to combat the use of social media platforms for violent ends, and in particular terrorist purposes. In Afghanistan, Myanmar, the Maldives, Nepal and the Balkans, the Foundation has, through public engagements on capacity building, strategic input and on-the-ground assessments, as well as confidential briefings based on expert analysis, helped counter violent extremism online, combat misinformation over social media and strengthen the digital security of activists. We have produced in-depth working papers looking at autonomous agents and artificial intelligence in theatres of operation entirely outside war, as well as war-time uses. Our policy briefings on cybersecurity have been widely quoted and used. We have been featured in international media as experts in this domain, and recognised at the UN as key interlocutors in efforts to address violent content online in a manner that respects rights and privacy. Late in 2017 we embarked on an ambitious effort to capture the UN's engagement with technology in disaster response, mirroring a stocktaking exercise conducted along the same lines in 2008, in order to ascertain what has changed and to recommend what more can be done. This year we also continued the work we launched in the fall of 2016 in the field of Ethics and Technology, looking in particular at issues related to Lethal Autonomous Weapons Systems (LAWS).

For many years, the Foundation has acted as a trusted observatory of sorts: looking farther ahead than most can see, connecting developments in one domain with their impact on other sectors, highlighting the need for and role of ethics and human rights, and providing timely, trusted and strategic input into processes ranging from peacekeeping to sensitive, complex, inter-governmental negotiations on cybersecurity policy. We are at the cutting edge of international diplomacy, deeply involved at the highest levels of policymaking, and at the same time working with, attentive to and trusted by actors at the grassroots, across multiple countries and contexts. This is a rare, and dare I say unique, combination of skills, talent and experience.

I believe this is the value we bring, over and above what is today a burgeoning field of actors involved in the use of technology for peaceful purposes. Our mandate, anchored in Paragraph 36 of the Tunis Commitment of the World Summit on the Information Society, remains as relevant today as it was in 2005. My introduction to the Foundation's first publication in 2005, The Role of ICT in Preventing, Responding to and Recovering from Conflict, ended by noting that technology offered a true hope for peace by promoting a better understanding among people. Despite all the negative, depressing headlines this year, the work of the Foundation, its core staff and its partners, documented on our website, Facebook page and over Twitter, is a vital record of how much we have achieved in advancing the understanding that as much as technology creates challenges, it also offers solutions. Our work, the Foundation, and everything I believe are anchored in one core conviction: that we can strengthen our better angels through technology. To change what we can, and to fight against injustice and violence: this is what we do. To do so innovatively, using technology, is what we help with. Over 2018 and beyond, I invite you to support and join us in this worthy endeavour.

Daniel Stauffacher
President
ICT4Peace Foundation