STEVEN LIVINGSTON - Closing Assessment

Two principal ideas shaped the organization and execution of the Carr Center Conference on Technology and Human Rights. The first was Kathryn Sikkink’s observation that those who abuse the rights of others often attempt to hide their actions from outside observers.  Human rights investigators must, as she and Margaret Keck put it, overcome “the deliberate suppression of information that sustains many abuses of power” (Keck and Sikkink, Kindle Locations 77-78).  Information about abuse is sometimes obscured by environmental factors, such as distance and remoteness.  In other cases, investigations by fact-finding teams are too dangerous, or events unfold too rapidly to be reported by conventional means.  And even when direct access by outside investigators can be realized, perpetrators often destroy evidence, shredding documents and obliterating bodies.  How do the technologies discussed at the conference address these conditions?

Geospatial, forensic, and digital network sciences and technologies help overcome the inaccessibility of information.  A growing number of satellites observe events on the ground with powerful sensors.  At the same time, local inhabitants, armed with mobile phones and cameras, record events and their aftermath and share that information over digital networks, sometimes intentionally and sometimes not, with distant investigators.  Perpetrators even post boastful accounts of their deeds on social media.  In some cases, forensic scientists exhume graves, collect antemortem evidence, and take DNA samples from living relatives and from the remains of the dead.  In short, the inaccessibility and suppression of information are today challenged by digital technologies.  Even when fact-finding missions are undertaken, these technologies gather information that corroborates evidence obtained by interviewing witnesses and survivors and by collecting documentary evidence.  Using digital technology, human rights organizations undermine the deliberate suppression of information.

As promising as this sounds, some conference attendees expressed cautions and concerns. 

Although technologies as distinct as remote sensing satellites and DNA sequencing machines can complement and strengthen human rights investigations, they do not do so seamlessly or without tension.  Professionals using different technological platforms approach investigations quite differently.  To put the idea in scholarly terms, geospatial, network, and forensic technologies create highly complementary affordances (Livingston, 2016).  An affordance is a feature of a technology that, by its very design, invites a range of potential uses.  Geospatial affordances invite observation of distant events; network affordances invite decentralized collaborative work on a common goal or problem; and forensic affordances invite the digitization of physical material – such as a bone fragment – that reestablishes identity.  As seamless as all of this is conceptually, in practice technologies come with professional communities of practice.  A community of practice consists of people with common professional training, sometimes common mentors (such as Clyde Snow and Eric Stover), shared professional and ethical standards, associations, personal networks and experiences, and standard procedures.  Collaboration among the technological affordances is mediated by these respective communities of practice.  Rather than a seamless blending of technological potentials, collaboration might look more like a large wedding, with culturally distinct families awkwardly working out new relationships and family dynamics.  It does not always go according to plan.

There were also deeper concerns expressed by some participants.  One pointed to what might be thought of as a culture clash between the new generation of human rights technologists and traditional human rights practices and principles.  Some technologists lack appreciation for the tried and true principles of traditional human rights investigations, principles that take great care with information that could put lives at risk if handled inappropriately.  For some technologists, openness and transparency are ideological commitments.  They are for human rights groups, too, though tempered by an awareness of the broader effects of transparency.  Some noted, for example, that DNA sequencing produces extremely sensitive information: it is not uncommon for investigators to discover evidence at odds with assumed parental lineages, and in some countries such information carries potentially dire consequences.  Likewise, as sensors on satellites and, especially, civilian drones become more powerful, care must be taken with georectified images.

Second, as costs diminish and availability grows, so does the potential for misinterpreting satellite images, as does the opportunity to create disinformation.  Groups without the in-house technical capacity, or the budget to outsource analysis to professionals, might misinterpret images and in the process cloud awareness more than clarify it.  This can be dangerous.


As satellite revisit frequencies improve, approaching persistent surveillance in some locations, human rights organizations will be tempted to become, as one conferee put it, a counterintelligence arm of the abused and oppressed party to a conflict.  But can an NGO become an operational element in a conflict?  At one point in 2012, HHI’s Satellite Sentinel Project realized that it could not act on its awareness of a pending attack on a South Sudanese community by Sudanese forces because it could not justify the possibility of “getting it wrong.”  If HHI were to tell the threatened population to flee, it might very well send people straight into the attack.  Yet doing nothing would also have consequences.  Although NGOs now sometimes have operational capacities in conflict zones, they lack an ethical framework that would permit the role that modern technology invites.

Third, mobile telephony has allowed those directly caught up in a conflict and subject to war crimes and human rights abuses to reach out to the outside world.  Tweets, text messages, Facebook and other social media posts offer a running record of events, even when journalists and human rights investigators are barred.  But can human rights organizations ask locals to investigate abuses, especially when they lack the training such work usually requires?  If a human rights organization asks for a photo or video of an event or an area (such as the location of a mass grave), and the content includes metadata that ties it to the person who collected and sent it, is the organization responsible for that person’s well-being when authorities arrest him or her after intercepting the data transfer?

A fourth concern expressed by participants involved NGO alliances with information technology corporations.  Few NGOs have the technical capacity to manage and analyze big data.  The Carter Center’s Syria event mapping initiative, for example, partners with Palantir, a data analytics company, to analyze the data used to populate the map.  Palantir received its startup funding from In-Q-Tel, the CIA’s venture capital arm, and from Peter Thiel, the controversial Silicon Valley billionaire.  Besides the Carter Center, Palantir has worked with several U.S. government organizations, including the CIA, DHS, NSA, FBI, the Marine Corps, the Air Force, Special Operations Command, and West Point.  In another example, DigitalGlobe, the commercial satellite remote sensing company, recently acquired the Radiant Group, a company with close ties to the intelligence community, including the National Reconnaissance Office.  As human rights organizations turn to technology, they also venture into terrain dominated by corporations that are adjuncts of the intelligence and defense communities.

Finally, Kathryn Sikkink wondered whether some of today’s common pessimism about human rights is the result of measurement biases created by more powerful and widely available technology.  If one thinks of information technology as a net pulled through human experience, what was once a coarse net is now fine.  Fewer events pass through without capture.  This leaves open the possibility that the number of events over time has remained stable, or perhaps even diminished, yet the finer net – the more powerful and available technologies – captures more of them, leaving the impression that abuses are more common.  Second, four decades of human rights work has created a global awareness of, and sensitivity to, abuses that would have been taken for granted in an earlier era.  Awareness and technical capacity combine to create false impressions.

The use of digital technologies by human rights groups has emerged as an essential focus of ongoing scholarly attention.  The Carr Center for Human Rights Policy conference made a significant contribution to furthering our understanding of how these technologies are used.