The ICT4Peace Foundation hosted and moderated a discussion on Big Data at the International Conference of Crisis Mappers (ICCM) 2013, held in Nairobi, Kenya, from 20–22 November at the United Nations Office at Nairobi (UNON).

Moderated by Sanjana Hattotuwa, Special Advisor at the ICT4Peace Foundation, the panel featured the distinguished Jon Gosier (D8A Group), Anahi Ayala Iacucci (Internews) and Emmanuel Letouzé (University of California, Berkeley).

Sanjana, to the mirth of many, started off the panel by quoting Dan Ariely’s definition of Big Data,

Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…

The discussion was based on the following note and issues, penned by the Foundation,

Big Data is the new black. Numerous reports, articles and even entire books are devoted to how Big Data, variously defined, offers new ways to better our lives. Humanitarian aid and relief organisations are themselves publicly thinking about how best to embrace a world awash in information. Whereas just a few years ago, the challenge was to capture and generate information around and on humanitarian disasters and protracted conflict, today it is more about how best to select, verify and then action relief and aid using information in the public domain. Though the potential of Big Data is often flagged, there is still a lack of evidence-based discussion on how Big Data really helps aid and relief, conflict prevention, community resilience, public empathy, timely response and long-term engagement with complex crises. Is Big Data a passing fad? Does Big Data disempower local communities as much as it can democratise data analysis? How can we address challenges of data retention, the right to be forgotten and the ethics of using and archiving rapid assessment data over the longer term? What, if any, are the new responsibilities of humanitarians, including volunteers, to ensure increasingly large and comprehensive datasets, often generated in good faith and freely available, aren’t leveraged to discriminate and harm? How can we ensure that Big Data empowers individuals over institutions and that it helps communities themselves to mitigate, respond to and recover from conflict and disasters? How should we capture best practices and innovative thinking around the generation and use of Big Data? How can we integrate a rights-based perspective, including a gendered critique, in Big Data debates? This panel will explore these issues with a robust examination of Big Data’s role and relevance in addressing some of the most pressing challenges facing communities, governments, civil society, the international system and the aid community today.

In preparation for the panel, the Foundation curated two resources around Big Data – a FlipBoard magazine and a Tiki Toki timeline. These proved immensely valuable to those present at ICCM, with a number of influential thought leaders retweeting them during the panel discussion itself.

After the initial presentations, Sanjana flagged,

  • Dave Eggers’s novel ‘The Circle’, which paints a dystopian future based, in effect, on big data, and how fiction could help in understanding some of the challenges around big data’s generation and use today
  • That in addition to Jon Gosier’s velocity, volume and variety, there was also the problem of veracity, especially during emergencies and sudden onset disasters
  • The need for a rights perspective, as well as ethical guidelines, around big data. Sanjana gave the example of how insurance companies could, years later, use big data first generated around a disaster to help relief and aid in order to increase insurance premiums for certain segments of a population, based on what people had revealed for a very different purpose and need at the time.
  • The need for a gendered approach to big data – that in its generation, use, dissemination, perception and re-appropriation, gender considerations played a vital role in inscribing both the data, and its application
  • The Snowden effect, and how the collection of big data on citizens by intelligence agencies, with the collusion of large corporate entities, impacted the perception of big data in general
  • That uses of big data today could provide good enough pointers to issues and locations that needed more rigorous attention, even if they didn’t directly establish causation. That even if the intended audiences of big data research were apathetic, the availability of information in the public domain would over time transform accountability.

In his presentation, Jon Gosier from D8A Group and Appfrica talked about,

How big data and small data technologies are finding their way into everyday products, and how these technologies might be further improved for humanitarian purposes. He also gave four specific use cases where he has used big data in his own work to solve crisis- and disaster-related problems, and introduced a new product – SiftDeck – that can be used to this end.

Jon’s presentation is embedded below, and can be viewed here.

Anahi Ayala Iacucci, Senior Innovation Advisor at Internews, noted,

How big data for humanitarian purposes can be extremely useful for understanding and analyzing Information Ecosystems, and for supporting communication with disaster-affected communities. Looking at the issue from that perspective, she touched on the challenges and possible risks of big data for humanitarian purposes, namely the “de-humanization” of human beings’ personal stories and the missing context of those stories. She also talked about the lack of big data, focusing on places like Mali, the Central African Republic and Somalia, where data is either missing or risky to collect, and how the rise of big data needs to take the lack of data into consideration too.

Emmanuel Letouzé, PhD Candidate at UC Berkeley, Fellow at the Harvard Humanitarian Initiative and Non-Resident Adviser at the International Peace Institute, talked about,

The main parameters and arguments of the opposing and (largely) stereotypical perspectives of the “technoptimists” and the “skepnologists” when it comes to Big Data’s applications and implications for humanitarian assistance, which provided an opportunity to clarify what he refers to as Big Data. He presented a couple of recent noteworthy academic contributions with potential relevance to the field – especially cell-phone data analysis, noting how there seems to be a (welcome) heavier focus on theoretical and methodological aspects. Thirdly, he highlighted three areas that he saw as critical for future applications of ‘Big Data’ in crisis and humanitarian contexts: (1) the grounding of the field’s development in a number of key principles, (2) the development and diffusion of a corpus of (statistical) methods to deal with sample bias notably, and (3) progress in governance regarding data sharing and analysis frameworks that balance various competing priorities (especially privacy, confidentiality and security; social value; and commercial value), backed by adequate technological tools (regarding anonymization, for instance) and systems.