    The Future Impact of Artificial Intelligence on Humans and Human Rights
    Steven Livingston and Mathias Risse. 6/7/2019. “The Future Impact of Artificial Intelligence on Humans and Human Rights.” Ethics & International Affairs, 33 (2), pp. 141-158.
    What are the implications of artificial intelligence (AI) on human rights in the next three decades?

    Precise answers to this question are made difficult by the rapid rate of innovation in AI research and by the effects of human practices on the adoption of new technologies. Precise answers are also challenged by imprecise usages of the term “AI.” There are several types of research that all fall under this general term. We begin by clarifying what we mean by AI. Most of our attention is then focused on the implications of artificial general intelligence (AGI), the prospect that an algorithm or group of algorithms will achieve something like superintelligence. While acknowledging that the feasibility of superintelligence is contested, we consider the moral and ethical implications of such a potential development. What do machines owe humans, and what do humans owe superintelligent machines?

    Read the full article here

    We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.
    Alexa Koenig and Sherif Elsayed-Ali. 1/5/2019. “We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.” World Economic Forum.
    New article co-authored by Carr Center Technology and Human Rights Fellow Sherif Elsayed-Ali.

    "We know that the technologies of the Fourth Industrial Revolution are drastically changing our world. This change is happening at a faster rate and greater scale than at any point in human history – and with that change come significant challenges to the ability of our public institutions and governments to adequately respond.

    From the plough to vaccines to computers, technological innovations have generally made human societies more productive. Over time, people have figured out how to mitigate their negative aspects. For example, electrical applications are much safer to use now than in the early days of electrification. Though we came close to disaster, since the Second World War the international political system has managed to contain the threat of nuclear weapons.

    However, the accelerating pace of change and the power of new technologies mean that negative unintended consequences will only become more frequent and more dangerous. What can we do today to help ensure that new technologies make life better, not worse?"

    https://www.weforum.org/agenda/2019/01/how-to-plan-for-technology-future-koenig-elsayed-ali/

    The Quest For Inclusive & Ethical Technology
    Sabelo Mhlambi. 6/10/2019. “The Quest For Inclusive & Ethical Technology.” Interview by Bonnie North. WUWM Milwaukee NPR.
    New interview with Technology and Human Rights Fellow Sabelo Mhlambi.

    "Most of us think of technology as a neutral force. Objects or processes are designed and implemented to solve problems and there are no biases, implied or overt, at work. But Sabelo Mhlambi says, not so fast. The computer scientist and researcher says technology cannot be neutral. What gets made, who makes it and uses it, and why is dependent upon our societies — and all societies are biased.

    "Technology will only replicate who we are," he explains. "Our social interactions will still occur online anyway. So, there’s nothing magical about technology where it somehow brings neutrality or brings equality or equity."

    https://www.wuwm.com/post/quest-inclusive-ethical-technology

    Stop Surveillance Humanitarianism
    Mark Latonero. 7/11/2019. “Stop Surveillance Humanitarianism.” The New York Times.
    Mark Latonero – Carr Center Technology and Human Rights Fellow and research lead at Data & Society – discusses surveillance humanitarianism for The New York Times.

    A standoff between the United Nations World Food Program and Houthi rebels in control of the capital region is threatening the lives of hundreds of thousands of civilians in Yemen.

    Alarmed by reports that food is being diverted to support the rebels, the aid program is demanding that Houthi officials allow them to deploy biometric technologies like iris scans and digital fingerprints to monitor suspected fraud during food distribution.

    The Houthis have reportedly blocked food delivery, painting the biometric effort as an intelligence operation, and have demanded access to the personal data on beneficiaries of the aid. The impasse led the aid organization to the decision last month to suspend food aid to parts of the starving population — once thought of as a last resort — unless the Houthis allow biometrics.

    Read the full article.

    Technological Revolution, Democratic Recession and Climate Change: The Limits of Law in a Changing World
    Luís Roberto Barroso. 9/9/2019. Technological Revolution, Democratic Recession and Climate Change: The Limits of Law in a Changing World. Carr Center Discussion Paper Series, 2019-009. Cambridge: Carr Center for Human Rights Policy.
    Law is a universal institution that has pretensions of being ubiquitous and complete. However, in a complex, plural, and volatile world, its limits and possibilities are shaken by the speed, depth, and extent of ongoing transformations, the ethical dilemmas those transformations raise, and the difficulty of forming consensus in the political universe.

    This article reflects on how the Law has attempted to deal with some of the main afflictions of our time, facing demands that include the needs to (i) keep the technological revolution on an ethical and humanistic track, (ii) prevent democracy from being perverted by populist and authoritarian adventures, and (iii) prevent solutions to climate change from coming only when it is too late. At a time when even the near future has become unpredictable, Law cannot provide a priori solutions to multiplying problems and anxieties. Instead, we must set clear goals for the future of humanity, basing them on the essential and perennial values that have followed us since antiquity.

    Trump wants to “detect mass shooters before they strike.” It won’t work.
    Sigal Samuel. 8/7/2019. “Trump wants to ‘detect mass shooters before they strike.’ It won’t work.” Vox.

    New article on Vox highlights the work of Desmond Patton, Technology and Human Rights Fellow.

    Patton emphasized that current AI tools tend to flag the language of African American and Latinx people as gang-involved or otherwise threatening, while consistently missing the posts of white mass murderers.

    "I think technology is a tool, not the tool," said Patton. "Often we use it as an escape so as to not address critical solutions that need to come through policy. We have to pair tech with gun reform. Any effort that suggests we need to do them separately, I don’t think that would be a successful effort at all.”

    Read the full article here.

    Is Your Phone Tainted by the Misery of the 35,000 Children in Congo's Mines?
    Siddharth Kara. 10/12/2018. “Is Your Phone Tainted by the Misery of the 35,000 Children in Congo's Mines?” The Guardian.
    In his recent article in The Guardian, Senior Fellow Siddharth Kara discusses the human rights violations connected to the cobalt industry.

    My field research shows that children as young as six are among those risking their lives amid toxic dust to mine cobalt for the world’s big electronics firms. – Siddharth Kara, Senior Fellow, Carr Center

    "Until recently, I knew cobalt only as a colour. Falling somewhere between the ocean and the sky, cobalt blue has been prized by artists from the Ming dynasty in China to the masters of French Impressionism. But there is another kind of cobalt, an industrial form that is not cherished for its complexion on a palette, but for its ubiquity across modern life.

    This cobalt is found in every lithium-ion rechargeable battery on the planet – from smartphones to tablets to laptops to electric vehicles. It is also used to fashion superalloys to manufacture jet engines, gas turbines and magnetic steel. You cannot send an email, check social media, drive an electric car or fly home for the holidays without using this cobalt. As I learned on a recent research trip to the Democratic Republic of the Congo, this cobalt is not awash in cerulean hues. Instead, it is smeared in misery and blood."

    Elodie is 15. Her two-month-old son is wrapped tightly in a frayed cloth around her back. He inhales potentially lethal mineral dust every time he takes a breath. Toxicity assaults at every turn; earth and water are contaminated with industrial runoff, and the air is brown with noxious haze. Elodie is on her own here, orphaned by cobalt mines that took both her parents. She spends the entire day bent over, digging with a small shovel to gather enough cobalt-containing heterogenite stone to rinse at nearby Lake Malo to fill one sack. It will take her an entire day to do so, after which Chinese traders will pay her about $0.65 (50p). Hopeless though it may be, it is her and her child’s only means of survival.

    Read the full article in The Guardian.

    Disinformation Campaigns Target Tech-Enabled Citizen Journalists
    Steven Livingston. 3/2/2017. “Disinformation Campaigns Target Tech-Enabled Citizen Journalists.” Brookings.
    New blog post by Carr Center Senior Fellow Steven Livingston published on Brookings. 

    "Governments hoping to evade responsibility for war crimes and rights abuses are having a much tougher time of it these days. Denying entry to nettlesome investigators is still standard, while many places are simply too dangerous to investigate. But even where investigators cannot go, digital technologies can sometimes overcome barriers to investigation. A recent Harvard Kennedy School report published by the Carr Center for Human Rights Policy underscores how various digital technologies undermine attempts to hide abuses and war crimes. Commercial high-resolution remote sensing satellites, some capable of distinguishing objects on the ground as small as 30 cm across, allow human rights groups to document military force deployments, mass graves, forced population displacements, and damage to physical infrastructure."


    Read the full blog at Brookings.

    Climate Change Induced Displacement: Leveraging Transnational Advocacy Networks to Address Operational Gaps
    Steven Livingston and Joseph Guay. 2/21/2017. “Climate Change Induced Displacement: Leveraging Transnational Advocacy Networks to Address Operational Gaps.” UNHCR.
    An article on climate change–induced displacement by Carr Center Senior Fellow Steven Livingston and Joseph Guay.

    According to the latest Intergovernmental Panel on Climate Change (IPCC) report, “Few aspects of the human endeavor…are isolated from possible impacts in a changing climate. The interconnectedness of the Earth system makes it impossible to draw a confined boundary around climate change impact, adaptations, and vulnerability.”1 This includes human population displacements, which amounted to a staggering 51.2 million refugees, asylum-seekers, and internally displaced people (IDPs) in 2013.2

    Unfortunately, as the frequency, duration, and intensity of extreme events affecting populations are on the rise, the humanitarian aid community is stretched thin in the face of multiple complex emergencies and protracted challenges around the world.

    Read the full post.

    Can Facebook’s Oversight Board Win People’s Trust?
    Mark Latonero. 1/29/2020. “Can Facebook’s Oversight Board Win People’s Trust?” Harvard Business Review.

    Technology & Human Rights Fellow Mark Latonero breaks down the larger implications of Facebook's global Oversight Board for content moderation.

    Facebook is a step away from creating its global Oversight Board for content moderation. The bylaws for the board, released on Jan. 28, lay out the blueprint for an unprecedented experiment in corporate self-governance for the tech sector. While there’s good reason to be skeptical of whether Facebook itself can fix problems like hate speech and disinformation on the platform, we should pay closer attention to how the board proposes to make decisions.
