Technology & Human Rights

Examining how technological advancements affect the future of human rights.

While recognizing the enormous progress societies have made since the adoption of the Universal Declaration of Human Rights in 1948, technological advancements inevitably have profound implications for the human rights framework.

From a practical perspective, technology can help move the human rights agenda forward. For instance, satellite data can monitor the flow of displaced people; artificial intelligence can assist with image recognition to gather data on rights abuses; and forensic technology can reconstruct crime scenes and help hold perpetrators accountable. Yet for all the areas in which emerging technologies advance the human rights agenda, they have equal capacity to undermine those efforts. From authoritarian states monitoring political dissidents through surveillance technologies to the phenomenon of "deepfakes" destabilizing the democratic public sphere, ethical and policy implications must be considered alongside the development of technological innovations.

Technological advancements also introduce new actors to the human rights framework. The movement has historically focused on the role of the state in ensuring rights and justice. Today, technological advancements and the rise of artificial intelligence and machine learning, in particular, necessitate interaction, collaboration, and coordination with leaders from business and technology in addition to government.


Select Publications

Stop Surveillance Humanitarianism

Citation:

Mark Latonero. 7/11/2019. "Stop Surveillance Humanitarianism." The New York Times.

Abstract:

Mark Latonero, Carr Center Technology and Human Rights Fellow and research lead at Data & Society, discusses surveillance humanitarianism for The New York Times.

A standoff between the United Nations World Food Program and Houthi rebels in control of the capital region is threatening the lives of hundreds of thousands of civilians in Yemen.

Alarmed by reports that food is being diverted to support the rebels, the aid program is demanding that Houthi officials allow them to deploy biometric technologies like iris scans and digital fingerprints to monitor suspected fraud during food distribution.

The Houthis have reportedly blocked food delivery, painting the biometric effort as an intelligence operation, and have demanded access to the personal data on beneficiaries of the aid. The impasse led the aid organization to the decision last month to suspend food aid to parts of the starving population — once thought of as a last resort — unless the Houthis allow biometrics.

Read the full article.

Last updated on 02/03/2020

Embedding Ethics in Computer Science Curriculum

Citation:

Kate Vredenburgh. 1/25/2019. "Embedding Ethics in Computer Science Curriculum." The Harvard Gazette.

Abstract:

New article in The Harvard Gazette features work by Carr Center Technology and Human Rights Fellow Kate Vredenburgh.

"A module that Kate Vredenburgh, a philosophy Ph.D. student, created for a course taught by Professor Finale Doshi-Velez asks students to grapple with questions of how machine-learning models can be discriminatory, and how that discrimination can be reduced."


The Quest For Inclusive & Ethical Technology

Citation:

Sabelo Mhlambi. 6/10/2019. "The Quest For Inclusive & Ethical Technology." Interview by Bonnie North, WUWM Milwaukee NPR.

Abstract:

New interview with Technology and Human Rights Fellow Sabelo Mhlambi.

"Most of us think of technology as a neutral force. Objects or processes are designed and implemented to solve problems and there are no biases, implied or overt, at work. But Sabelo Mhlambi says, not so fast. The computer scientist and researcher says technology cannot be neutral. What gets made, who makes it and uses it, and why is dependent upon our societies — and all societies are biased.

"Technology will only replicate who we are," he explains. "Our social interactions will still occur online anyway. So, there’s nothing magical about technology where it somehow brings neutrality or brings equality or equity."

Listen to the full interview: https://www.wuwm.com/post/quest-inclusive-ethical-technology


“Global civil society and transnational advocacy networks have played an important role in social movements and struggles for social change. Looking ahead, these movements need to coalesce around the impact of technology on society, in particular harnessing the promise, challenging the perils, and looking at maintaining public and private spheres that respect creativity, autonomy, diversity, and freedom of thought and expression.”

- Sushma Raman