Scott Edwards, Senior Advisor, Amnesty International

Scott Edwards began by noting the enormous volume of information now available. Every minute another 300 hours of video is uploaded to YouTube, and that is just one of several platforms. Data exhaust is everywhere: shopping data and CCTV footage are just two more examples. Ubiquitous data create operational challenges. Amid the sea of digital data, how do you find the human rights violations? This is a “noise” problem: how do you separate the food and kitten photos so heavily represented on social media from the signals, the photos that will help a human rights researcher find valuable evidence?

How do you sort and secure information shared via social media? Data overload creates challenges for simply organizing and making sense of the material, as well as for securing it and protecting its probative value; evidence that is improperly handled is worthless in a tribunal. There are also issues of scale and scope. Are we looking for one particular airstrike in Gaza, or for patterns and trends that would demonstrate systematic and sustained forms of abuse? These two scenarios impose different data collection requirements.

There are also huge challenges around verification, explained Edwards. It is sometimes difficult to get access on the ground to check whether video evidence is accurate. Verification calls for direct contact between the human rights organization and the person who took and uploaded the video or image, yet it is essential that those sources take precautions and not expose themselves to risk. Furthermore, the expertise required to verify open source data overwhelms most human rights organizations: a researcher needs to know ballistics, geology, pathology, and other related fields. That is not sustainable.
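Edwards’s point about probative value hints at one widely used safeguard: recording a cryptographic fingerprint of each item the moment it is collected, so that any later alteration is detectable. The Python sketch below is purely illustrative and not a description of Amnesty’s actual workflow; the log file name and metadata fields are hypothetical.

    import hashlib
    import json
    from datetime import datetime, timezone
    from pathlib import Path

    def ingest_evidence(path: str, collector: str) -> dict:
        """Record a SHA-256 fingerprint and custody metadata at collection time.

        Any later change to the file changes the hash, so tampering is
        detectable whenever the record is re-checked.
        """
        data = Path(path).read_bytes()
        record = {
            "file": path,
            "sha256": hashlib.sha256(data).hexdigest(),
            "collected_at": datetime.now(timezone.utc).isoformat(),
            "collector": collector,  # who first handled the item
        }
        # Append-only custody log (hypothetical name); in practice the log
        # itself must be protected as carefully as the evidence.
        with open("custody_log.jsonl", "a") as log:
            log.write(json.dumps(record) + "\n")
        return record

    def verify_evidence(record: dict) -> bool:
        """Re-hash the file and compare it with the hash recorded at ingest."""
        current = hashlib.sha256(Path(record["file"]).read_bytes()).hexdigest()
        return current == record["sha256"]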

The ubiquity of data means there will be more opportunities to triangulate evidence across different platforms: human, geospatial, and open source intelligence. One example would be matching perpetrator video uploaded to social media against reports from a human intelligence network and satellite imagery, and then getting defense ministries to identify the specific units and even commanders involved.
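To make the triangulation idea concrete, the sketch below cross-references records from different sources by time and place; the record format, distance threshold, and time window are all assumptions for illustration, not anything Edwards specified.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from math import radians, sin, cos, asin, sqrt

    @dataclass
    class Record:
        source: str        # e.g. "social_video", "humint", "satellite"
        timestamp: datetime
        lat: float
        lon: float

    def km_between(a: Record, b: Record) -> float:
        """Great-circle distance in kilometres (haversine formula)."""
        dlat = radians(b.lat - a.lat)
        dlon = radians(b.lon - a.lon)
        h = (sin(dlat / 2) ** 2
             + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2)
        return 2 * 6371 * asin(sqrt(h))

    def corroborations(records, max_km=5.0, max_gap=timedelta(hours=6)):
        """Yield pairs of records from different sources that are close
        in both space and time, as candidates for human review."""
        for i, a in enumerate(records):
            for b in records[i + 1:]:
                if (a.source != b.source
                        and abs(a.timestamp - b.timestamp) <= max_gap
                        and km_between(a, b) <= max_km):
                    yield a, b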

Securing information is a big concern.  Amnesty cannot rely on a social media platform, no matter how well intentioned, to secure video or other data.  Amnesty is working with technologists to address this issue.
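The talk did not describe a specific technical design, but one direction such work with technologists could take is keeping an encrypted local copy so that custody of the material never depends on a third-party platform. Below is a minimal sketch using the third-party Python cryptography package; the file names are hypothetical.

    # Illustrative only: keep an encrypted local copy so custody of the
    # plaintext never depends on a third-party platform. Requires the
    # third-party `cryptography` package; file names are hypothetical.
    from pathlib import Path
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # store the key offline, separately
    cipher = Fernet(key)

    plaintext = Path("witness_video.mp4").read_bytes()
    Path("witness_video.mp4.enc").write_bytes(cipher.encrypt(plaintext))

    # Later, with the key, the original is recoverable byte-for-byte:
    restored = cipher.decrypt(Path("witness_video.mp4.enc").read_bytes())
    assert restored == plaintext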

With respect to analysis, algorithms will save the day, but only to an extent. Social computation is also needed. This involves recruiting large numbers of volunteers over a digital network to analyze small portions of a larger data set, such as a satellite image; a rough sketch of this task-splitting pattern follows below. One example is Decode Darfur, Amnesty’s effort to monitor attacks on the Darfur region of Sudan. Edwards distinguished social computation from crowdsourcing, which for him involves asking the public – the crowd – to volunteer information over a digital platform. Asking for information is taken seriously by Amnesty and is regarded as an act that can carry serious risks, as noted above. Verifying information is always a challenge. The Human Rights Investigations Lab at UC Berkeley is working with Amnesty to train students in open source research methods. The University of Pretoria and the University of Essex are doing the same.
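As a concrete illustration of the social computation pattern Edwards described, the sketch below splits a large satellite image into small tiles, shows each tile to several volunteers, and accepts a label only on a strict majority. The tile size, quorum, and labels are hypothetical, not the actual Decode Darfur design.

    from collections import Counter

    def make_tiles(width, height, tile=256):
        """Split a large image's pixel grid into small, independently
        reviewable tiles (x, y, width, height)."""
        return [(x, y, tile, tile)
                for y in range(0, height, tile)
                for x in range(0, width, tile)]

    def majority_label(votes, quorum=3):
        """Accept a tile's label only when enough volunteers agree."""
        if len(votes) < quorum:
            return None                  # not yet seen by enough reviewers
        label, count = Counter(votes).most_common(1)[0]
        return label if count > len(votes) / 2 else None  # strict majority

    # Each tile is shown to several volunteers; answers are aggregated.
    tiles = make_tiles(width=10_000, height=10_000)
    votes = {tiles[0]: ["destroyed", "destroyed", "intact"]}
    print(majority_label(votes[tiles[0]]))   # -> destroyed

Redundant review of each tile trades volunteer effort for reliability: no single reviewer’s mistake decides a label, which matters when the output may feed into human rights reporting.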