During the second session of the Harvard National Model United Nations (HNMUN) conference, delegates in the Special Summit on Sustainable Development heard from Sushma Raman, Executive Director of the Carr Center for Human Rights Policy.
The Carr Center for Human Rights Policy serves as the hub of the Harvard Kennedy School’s research, teaching, and training in the human rights domain. The center embraces a dual mission: to educate students and the next generation of leaders from around the world in human rights policy and practice; and to convene and provide policy-relevant knowledge to international organizations, governments, policymakers, and businesses.
The dominant vision of artificial intelligence imagines a future of large-scale autonomous systems outperforming humans in an increasing range of fields. This "actually existing AI" vision misconstrues intelligence as autonomous rather than social and relational. It is both unproductive and dangerous, optimizing for artificial metrics of human replication rather than for systemic augmentation, and tending to concentrate power, resources, and decision-making in an engineering elite. Alternative visions based on participating in and augmenting human creativity and cooperation have a long history and underlie many celebrated digital technologies, such as personal computers and the internet. Researchers and funders should redirect their focus from centralized autonomous general intelligence to a plurality of established and emerging approaches that extend these cooperative and augmentative traditions, as seen in successes such as Taiwan's digital democracy project and collective intelligence platforms like Wikipedia.
Cities have emerged as test beds for digital innovation. Data-collecting devices, such as sensors and cameras, have enabled fine-grained monitoring of public services including urban transit, energy distribution, and waste management, yielding tremendous potential for improvements in efficiency and sustainability. At the same time, there is rising public awareness that without clear guidelines or sufficient safeguards, data collection and use in both public and private spaces can negatively affect a broad spectrum of human rights and freedoms. To move forward productively with intelligent-community projects and design them to meet their full potential in serving the public interest, a consideration of rights and risks is essential.
Abstract: Just as rights are not static, neither is harm. The humanitarian system has long been critiqued as colonial and patriarchal. As these systems increasingly intersect with Western, capitalist technology systems in the race to build "for good" technology, how do governance systems ethically anticipate harm, not just now but into the future? Can humanitarian governance systems design mitigation or subversion mechanisms so that technology design and intervention do not lock people into future harm, future inequity, or future indebtedness? Instead of framing digital governance in terms of control, weaving in foresight and decolonial approaches might liberate our digital futures as spaces of safety and humanity for all, and through this, birth new forms of digital humanism.
“The Carr Center is building a bridge between ideas on human rights and the practice on the ground. Right now we are at a critical juncture. The pace of technological change and the rise of authoritarian governments are both examples of serious challenges to the flourishing of individual rights. It’s crucial that Harvard and the Kennedy School continue to be a major influence in keeping human rights ideals alive. The Carr Center is a focal point for this important task.”
- Mathias Risse