Search results

    Is Your Phone Tainted by the Misery of the 35,000 Children in Congo's Mines?
    Siddharth Kara. 10/12/2018. “Is Your Phone Tainted by the Misery of the 35,000 Children in Congo's Mines?” The Guardian. Publisher's Version.
    In his recent article in The Guardian, Senior Fellow Siddharth Kara discusses the human rights violations connected to the cobalt industry.

    "My field research shows that children as young as six are among those risking their lives amid toxic dust to mine cobalt for the world’s big electronics firms." - Siddharth Kara, Senior Fellow, Carr Center

    "Until recently, I knew cobalt only as a colour. Falling somewhere between the ocean and the sky, cobalt blue has been prized by artists from the Ming dynasty in China to the masters of French Impressionism. But there is another kind of cobalt, an industrial form that is not cherished for its complexion on a palette, but for its ubiquity across modern life.

    This cobalt is found in every lithium-ion rechargeable battery on the planet – from smartphones to tablets to laptops to electric vehicles. It is also used to fashion superalloys to manufacture jet engines, gas turbines and magnetic steel. You cannot send an email, check social media, drive an electric car or fly home for the holidays without using this cobalt. As I learned on a recent research trip to the Democratic Republic of the Congo, this cobalt is not awash in cerulean hues. Instead, it is smeared in misery and blood."

    Elodie is 15. Her two-month-old son is wrapped tightly in a frayed cloth around her back. He inhales potentially lethal mineral dust every time he takes a breath. Toxicity assaults at every turn; earth and water are contaminated with industrial runoff, and the air is brown with noxious haze. Elodie is on her own here, orphaned by cobalt mines that took both her parents. She spends the entire day bent over, digging with a small shovel to gather enough cobalt-containing heterogenite stone to rinse at nearby Lake Malo to fill one sack. It will take her an entire day to do so, after which Chinese traders will pay her about $0.65 (50p). Hopeless though it may be, it is her and her child’s only means of survival.

    Read the full article in The Guardian.

    Critical Skill for Nonprofits in the Digital Age: Technical Intuition
    Alix Dunn. 5/7/2019. “Critical Skill for Nonprofits in the Digital Age: Technical Intuition.” Stanford Social Innovation Review. Listen to the interview.

    Not everyone needs to become a tech expert, but all activists and nonprofit leaders must develop skills to inquire about, decide on, and demand technological change. Tech Fellow Alix Dunn talks to Stanford's Social Innovation Podcast. 

    In a world where the pace of organizational learning is often slower than the pace of technological change, activists and nonprofit leaders must develop their “technical intuition.” Not everyone needs to become a tech expert, explains Alix Dunn, of the consulting firm Computer Says Maybe, but this ongoing process of imagining, inquiring about, deciding on, and demanding technological change is critical.

    In this recording from the Stanford Social Innovation Review's 2019 Data on Purpose conference, Dunn walks through her guidelines to help anyone develop these skills.

    The Ethical Use of Personal Data to Build Artificial Intelligence Technologies: A Case Study on Remote Biometric Identity Verification
    Neal Cohen. 4/4/2020. “The Ethical Use of Personal Data to Build Artificial Intelligence Technologies: A Case Study on Remote Biometric Identity Verification.” Carr Center Discussion Paper Series, 2020-004. See full text.
    Artificial Intelligence (AI) technologies have the capacity to do a great deal of good in the world, but whether they do so is not only dependent upon how we use those AI technologies but also how we build those AI technologies in the first place.

    The unfortunate truth is that personal data has become the bricks and mortar used to build many AI technologies, and more must be done to protect and safeguard the humans whose personal data is being used. Through a case study on AI-powered remote biometric identity verification, this paper explores the technical requirements of building AI technologies with high volumes of personal data and their implications for our understanding of existing data protection frameworks. Ultimately, a path forward is proposed for ethically using personal data to build AI technologies.

    Read the paper here. 

    Digital Identity in the Migration & Refugee Context: Italy Case Study
    Mark Latonero, Keith Hiatt, Antonella Napolitano, Giulia Clericetti, and Melanie Penagos. 4/2019. Digital Identity in the Migration & Refugee Context: Italy Case Study. Data & Society. See full text.
    New Report by Carr Center Technology and Human Rights Fellow Mark Latonero.

    "Increasingly, governments, corporations, international organizations, and nongovernmental organizations (NGOs) are seeking to use digital technologies to track the identities of migrants and refugees. This surging interest in digital identity technologies would seem to meet a pressing need: the United Nations Refugee Agency (UNHCR) states that in today’s modern world, lacking proof of identity can limit a person’s access to services and socio-economic participation, including employment opportunities, housing, a mobile phone, and a bank account. But this report argues that the technologies and processes involved in digital identity will not provide easy solutions in the migration and refugee context. Technologies that rely on identity data introduce a new sociotechnical layer that may exacerbate existing biases, discrimination, or power imbalances. How can we weigh the added value of digital identification systems against the potential risks and harms to migrant safety and fundamental human rights? This report provides international organizations, policymakers, civil society, technologists, and funders with a deeper background on what we currently know about digital identity and how migrant identity data is situated in the Italian context."
    We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.
    Alexa Koenig and Sherif Elsayed-Ali. 1/5/2019. “We Can't Future-Proof Technology. But Here are 5 Ways to Forward Plan.” World Economic Forum. See full text.
    New article co-authored by Carr Center Technology and Human Rights Fellow Sherif Elsayed-Ali.

    "We know that the technologies of the Fourth Industrial Revolution are drastically changing our world. This change is happening at a faster rate and greater scale than at any point in human history – and with that change come significant challenges to the ability of our public institutions and governments to adequately respond.

    From the plough to vaccines to computers, technological innovations have generally made human societies more productive. Over time, people have figured out how to mitigate their negative aspects. For example, electrical applications are much safer to use now than in the early days of electrification. Though we came close to disaster, since the Second World War the international political system has managed to contain the threat of nuclear weapons of mass destruction.

    However, the accelerating pace of change and the power of new technologies mean that negative unintended consequences will only become more frequent and more dangerous. What can we do today to help ensure that new technologies make life better, not worse?"

    https://www.weforum.org/agenda/2019/01/how-to-plan-for-technology-future-koenig-elsayed-ali/

    From Rationality to Relationality: Ubuntu as an Ethical and Human Rights Framework for Artificial Intelligence Governance
    Sabelo Mhlambi. 7/8/2020. “From Rationality to Relationality: Ubuntu as an Ethical and Human Rights Framework for Artificial Intelligence Governance.” Carr Center Discussion Paper Series, 2020-009. See full text.

    What is the measure of personhood and what does it mean for machines to exhibit human-like qualities and abilities? Furthermore, what are the human rights, economic, social, and political implications of using machines that are designed to reproduce human behavior and decision making? The question of personhood is one of the most fundamental questions in philosophy and it is at the core of the questions, and the quest, for an artificial or mechanical personhood. 

    The development of artificial intelligence has depended on the traditional Western view of personhood as rationality. However, the traditional view of rationality as the essence of personhood, designating how humans, and now machines, should model and approach the world, has always been marked by contradictions, exclusions, and inequality. It has shaped Western economic structures (capitalism’s free markets built on colonialism’s forced markets), political structures (modernity’s individualism imposed through coloniality), and discriminatory social hierarchies (racism and sexism as institutions embedded in enlightenment-era rationalized social and gender exclusions from full person status and economic, political, and social participation), which in turn shape the data, creation, and function of artificial intelligence. It is therefore unsurprising that the artificial intelligence industry reproduces these dehumanizations. Furthermore, the perceived rationality of machines obscures machine learning’s uncritical imitation of discriminatory patterns within its input data, and minimizes the role systematic inequalities play in harmful artificial intelligence outcomes.

    Read the full paper.

    Conference Report: Technology & Human Rights in the 21st Century
    Steven Livingston and Sushma Raman. 2/21/2017. “Conference Report: Technology & Human Rights in the 21st Century.” Cambridge, MA: Carr Center for Human Rights Policy, Harvard Kennedy School. See full text.
    On November 3 - 4, 2016, the Carr Center for Human Rights Policy at the Harvard Kennedy School hosted a symposium that aimed to:

    1. Strengthen collaboration among stakeholders working on issues at the intersection of human rights and technology and

    2. Deepen our understanding of the nature of collaboration among different technical and scientific communities working in human rights.

    The symposium brought together practitioners and academics from different industries, academic disciplines and professional practices. Discussion centered on three clusters of scientific and technical capacities and the communities of practice associated with each of them. These clusters are:

    • Geospatial Technology: The use of commercial remote sensing satellites, geographical information systems (GIS), unmanned aerial vehicles (UAVs), and global positioning system (GPS) satellites and receivers to track events on earth.
       
    • Digital Networks: The use of digital platforms to link individuals in different locations working towards a common goal, such as monitoring digital evidence of human rights violations around the world. It often involves crowdsourcing the collection of data over digital networks or social computation – the analysis of data by volunteers using digital networks.
       
    • Forensic Science: The collection, preservation, examination, and analysis of evidence of abuses and crimes for documentation, reconstruction, and understanding for public and court use. Prominent evidential materials in this area include digital and multimedia evidence as well as corporal and other biological evidence. When considering the use of digital technologies, we might say that forensic science involves the recoding of material objects into binary code. This domain includes massively parallel DNA sequencing technologies as well as document scanning and data management technologies.

    In their landmark 1998 book, Activists Beyond Borders, Kathryn Sikkink and Margaret Keck wrote that “by overcoming the deliberate suppression of information that sustains many abuses of power, human rights groups bring pressure to bear on those who perpetuate abuses” (Keck and Sikkink, 1998, Kindle Locations 77-78). The Carr Center’s symposium on technology and human rights explored the ways modern human rights organizations use science and technology to overcome the deliberate suppression of information.

    Speakers discussed the latest advances in each of the key technologies represented at the symposium and used today by human rights organizations.

    Steven Livingston and Sushma Raman co-organized the event. Livingston is Senior Fellow at the Carr Center and Professor of Media and Public Affairs and Professor of International Affairs at the George Washington University; Raman is the Executive Director of the Carr Center at the Harvard Kennedy School of Government.

    Full online version here.


    Study Group: Data Trusts | An Ethical Pathway to Protect the Human Rights of People Living with Criminal Convictions Impacted by Background Screening?

    February 14, 2020

    The Carr Center for Human Rights Policy invites you to join a study group on the urgent need to establish a human rights framework in criminal justice reform, which addresses mass incarceration in America. Read more.
