    The Future Impact of Artificial Intelligence on Humans and Human Rights
    Steven Livingston and Mathias Risse. 6/7/2019. “The Future Impact of Artificial Intelligence on Humans and Human Rights.” Ethics and International Affairs, 33 (2), pp. 141–158. See full text. Abstract:
    What are the implications of artificial intelligence (AI) on human rights in the next three decades?

    Precise answers to this question are made difficult by the rapid rate of innovation in AI research and by the effects of human practices on the adoption of new technologies. Precise answers are also challenged by imprecise usage of the term “AI.” Several distinct types of research fall under this general term. We begin by clarifying what we mean by AI. Most of our attention then focuses on the implications of artificial general intelligence (AGI), the prospect that an algorithm or group of algorithms will achieve something like superintelligence. While acknowledging that the feasibility of superintelligence is contested, we consider the moral and ethical implications of such a development. What do machines owe humans, and what do humans owe superintelligent machines?


    Critical Skill for Nonprofits in the Digital Age: Technical Intuition
    Alix Dunn. 5/7/2019. “Critical Skill for Nonprofits in the Digital Age: Technical Intuition.” Stanford Social Innovation Review. Listen to the Interview. Abstract:

    Not everyone needs to become a tech expert, but all activists and nonprofit leaders must develop skills to inquire about, decide on, and demand technological change. Tech Fellow Alix Dunn talks with the Stanford Social Innovation Review's podcast.

    In a world where the pace of organizational learning is often slower than the pace of technological change, activists and nonprofit leaders must develop their “technical intuition.” Not everyone needs to become a tech expert, explains Alix Dunn, of the consulting firm Computer Says Maybe, but this ongoing process of imagining, inquiring about, deciding on, and demanding technological change is critical.

    In this recording from the Stanford Social Innovation Review's 2019 Data on Purpose conference, Dunn walks through her guidelines to help anyone develop these skills.

    We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.
    Alexa Koenig and Sherif Elsayed-Ali. 1/5/2019. “We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.” World Economic Forum. See full text. Abstract:
    New article co-authored by Carr Center Technology and Human Rights Fellow Sherif Elsayed-Ali.

    "We know that the technologies of the Fourth Industrial Revolution are drastically changing our world. This change is happening at a faster rate and greater scale than at any point in human history – and with that change come significant challenges to the ability of our public institutions and governments to adequately respond.

    From the plough to vaccines to computers, technological innovations have generally made human societies more productive. Over time, people have figured out how to mitigate their negative aspects. For example, electrical applications are much safer to use now than in the early days of electrification. Though we came close to disaster, since the Second World War the international political system has managed to contain the threat of mass destruction from nuclear weapons.

    However, the accelerating pace of change and the power of new technologies mean that negative unintended consequences will only become more frequent and more dangerous. What can we do today to help ensure that new technologies make life better, not worse?"

    https://www.weforum.org/agenda/2019/01/how-to-plan-for-technology-future-koenig-elsayed-ali/

    The Quest For Inclusive & Ethical Technology
    Sabelo Mhlambi. 6/10/2019. “The Quest For Inclusive & Ethical Technology.” Interview by Bonnie North, WUWM Milwaukee NPR. See full text. Abstract:
    New interview with Technology and Human Rights Fellow Sabelo Mhlambi.

    "Most of us think of technology as a neutral force. Objects or processes are designed and implemented to solve problems and there are no biases, implied or overt, at work. But Sabelo Mhlambi says, not so fast. The computer scientist and researcher says technology cannot be neutral. What gets made, who makes it and uses it, and why is dependent upon our societies — and all societies are biased.

    "Technology will only replicate who we are," he explains. "Our social interactions will still occur online anyway. So, there’s nothing magical about technology where it somehow brings neutrality or brings equality or equity."

    https://www.wuwm.com/post/quest-inclusive-ethical-technology

    Embedding Ethics in Computer Science Curriculum
    Kate Vredenburgh. 1/25/2019. “Embedding Ethics in Computer Science Curriculum.” The Harvard Gazette. See full text. Abstract:
    New article in The Harvard Gazette features work by Carr Center Technology and Human Rights Fellow Kate Vredenburgh.

    "A module that Kate Vredenburgh, a philosophy Ph.D. student, created for a course taught by Professor Finale Doshi-Velez asks students to grapple with questions of how machine-learning models can be discriminatory, and how that discrimination can be reduced."

    Reclaiming Stonewall: Welcome to the Celebration—and the Struggle
    Timothy Patrick McCarthy. 6/24/2019. “Reclaiming Stonewall: Welcome to the Celebration—and the Struggle.” The Nation. See full text. Abstract:

    As we reckon with the 50th anniversary of Stonewall, it is essential that we ask, “What still needs to be done?”

    "Fifty years ago, in the early morning hours of June 28, 1969, a motley multitude of queer folks fought back. The stage was the Stonewall Inn, a popular Mafia-owned gay bar on Christopher Street in New York City’s West Village. The spectacle was a police raid, which had become an increasingly routine fact of queer life during the 1960s. It was summer, people were hot, and the nation was pulsing with protest."


    Stop Surveillance Humanitarianism
    Mark Latonero. 7/11/2019. “Stop Surveillance Humanitarianism.” The New York Times. See full text. Abstract:
    Mark Latonero, Carr Center Technology and Human Rights Fellow and research lead at Data & Society, discusses surveillance humanitarianism for The New York Times.

    A standoff between the United Nations World Food Program and Houthi rebels in control of the capital region is threatening the lives of hundreds of thousands of civilians in Yemen.

    Alarmed by reports that food is being diverted to support the rebels, the aid program is demanding that Houthi officials allow them to deploy biometric technologies like iris scans and digital fingerprints to monitor suspected fraud during food distribution.

    The Houthis have reportedly blocked food delivery, painting the biometric effort as an intelligence operation, and have demanded access to the personal data of aid beneficiaries. The impasse led the aid organization to decide last month to suspend food aid to parts of the starving population, a step once thought of as a last resort, unless the Houthis allow biometrics.


    On the Future of Human Rights. CCPD 2019-008.
    Sherif Elsayed-Ali. 7/12/2019. On the Future of Human Rights. CCPD 2019-008. Carr Center for Human Rights Policy. See full text. Abstract:

    The human rights framework has had many successes in the 70 years since the Universal Declaration of Human Rights was adopted, but is it still relevant to today's challenges?

    In the last few years, human rights practitioners have raised the alarm about what seem like sustained attacks on human rights by some governments. But there is a bigger threat to the future of human rights: that people could come to see them as less relevant to their lives. My aim is to provide a constructive critique of human rights practice and messaging, together with three main proposals: (1) putting climate change at the top of the human rights agenda; (2) significantly increasing the amount of work on Economic, Social, and Cultural (ESC) rights undertaken by human rights advocacy and campaigning organizations; and (3) adopting a systems-analysis and solutions-based approach to human rights.

    Trump wants to “detect mass shooters before they strike.” It won’t work.
    Desmond Patton. 8/7/2019. “Trump wants to ‘detect mass shooters before they strike.’ It won’t work.” Vox.com. See full text. Abstract:
    New article on Vox highlights the work of Desmond Patton, Technology and Human Rights Fellow.

    Desmond Patton, a Technology and AI fellow at the Carr Center, emphasized that current AI tools tend to identify the language of African American and Latinx people as gang-involved or otherwise threatening, but consistently miss the posts of white mass murderers.

    "I think technology is a tool, not the tool," said Patton. "Often we use it as an escape so as to not address critical solutions that need to come through policy. We have to pair tech with gun reform. Any effort that suggests we need to do them separately, I don’t think that would be a successful effort at all.”


    The Physics of Dissent and the Effects of Movement Momentum
    Erica Chenoweth and Margherita Belgioioso. 8/5/2019. “The Physics of Dissent and the Effects of Movement Momentum.” Nature Human Behaviour. See full text. Abstract:
    How do ‘people power’ movements succeed when modest proportions of the population participate?

    Here we propose that the effects of social movements increase as they gain momentum. We approximate a simple law drawn from physics: momentum equals mass times velocity (p = mv). We propose that the momentum of dissent is a product of participation (mass) and the number of protest events in a week (velocity). We test this simple physical proposition against panel data on the potential effects of movement momentum on irregular leader exit in African countries between 1990 and 2014, using a variety of estimation techniques. Our findings show that social movements potentially compensate for relatively modest popular support by concentrating their activities in time, thus increasing their disruptive capacity. Notably, these findings also provide a straightforward way for dissidents to easily quantify their coercive potential by assessing their participation rates and increased concentration of their activities over time.
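    As a rough illustrative sketch (not from the article itself), the momentum analogy can be computed directly: momentum of dissent equals participation (mass) times protest events per week (velocity). The figures below are hypothetical, chosen only to show how a smaller movement that concentrates its activity in time can match a larger but less active one.

    ```python
    # Illustrative sketch of the p = m * v analogy for dissent, where
    # m = participation (mass) and v = protest events per week (velocity).
    # All numbers are hypothetical, not drawn from the study's data.

    def movement_momentum(participants: int, events_per_week: int) -> int:
        """Momentum of dissent: participation times weekly event count."""
        return participants * events_per_week

    # A large movement holding one event per week...
    large_sporadic = movement_momentum(participants=100_000, events_per_week=1)
    # ...versus a smaller movement concentrating five events into the same week.
    small_concentrated = movement_momentum(participants=20_000, events_per_week=5)

    print(large_sporadic, small_concentrated)  # both evaluate to 100000
    ```

    On this simple reading, the two movements have equal momentum, which is the article's point that concentration in time can compensate for modest participation.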

