Algorithmic Justice League launches new campaign to prevent the facial recognition software industry from selling or buying tech that can be weaponized
BOSTON, Dec. 11, 2018 /PRNewswire/ -- English police stopped a black man in London last July after facial recognition software misidentified him. Police demanded the man's ID, emptied his pockets and searched him in front of a crowd of onlookers. The man hadn't committed a crime and wasn't suspected of any.
Get ready for that public humiliation or worse to happen here. Police departments, airports, schools, businesses and other entities across the United States and around the world are using facial analysis software to identify suspects, track movement and activities, take attendance and advertise products.
But Joy Buolamwini has a plan to keep people from being stopped and frisked, or killed, because of faulty software. The founder of the Algorithmic Justice League and an AI researcher at MIT, Buolamwini aims to prevent facial analysis technology from causing collateral damage.
"Computer vision uses machine-learning techniques to do facial analysis," says Buolamwini, named to the recent Bloomberg 50 list for her 2018 accomplishments. "You create a training set with examples of faces. However, if the training sets aren't diverse, any face that deviates too much from the established norm will be harder to detect, identify, or classify for attributes like age. With the errors, biases and lack of oversight, companies should have more accountability."
Accountability means that artificial intelligence (AI) vendors and clients commit not to allow the technology to be used for lethal targeting or other abuse, and continually monitor AI systems for racial, gender, and other harmful bias. In her New York Times op-ed on the dangers of facial analysis technology and during Federal Trade Commission hearings on AI, Buolamwini called for federal regulations.
Now, she is urging public and private organizations including NEC, IBM, Microsoft, Google, Facebook, Amazon, Megvii, and Axon to sign the Safe Face Pledge. Three producers of facial analysis software, Robbie.AI, Yoti, and Simprints, already have confirmed that they will sign the Safe Face Pledge. The pledge specifically requires them to:
- Show Value for Human Life, Dignity, and Rights
  - Do not contribute to applications that risk human life
  - Do not facilitate secret and discriminatory government surveillance
  - Mitigate law enforcement abuse
  - Ensure your rules are being followed
- Address Harmful Bias
  - Implement internal bias evaluation processes and support independent evaluation
  - Submit models on the market for benchmark evaluation where available
- Facilitate Transparency
  - Increase public awareness of facial analysis technology use
  - Enable external analysis of facial analysis technology on the market
- Embed Safe Face Pledge into Business Practices
  - Modify legal documents to reflect value for human life, dignity, and rights
  - Engage with stakeholders
  - Provide details of Safe Face Pledge implementation
"Audits of facial analysis systems show the technology is better at reading male faces than female faces, and more accurately classifies lighter faces than darker faces," says Buolamwini. "My research at MIT which audited IBM, Microsoft, and Megvii showed error rates as high as 35 percent for classifying dark skinned women."
In July, the ACLU tested Amazon's facial analysis software, Rekognition, using photos of every member of the House and Senate. The software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime. The false matches included Republicans and Democrats of all ages, but were disproportionately people of color. In addition to members of Congress, Rekognition has even been shown to misclassify Oprah Winfrey.
Amazon currently is selling Rekognition to police departments.
On board with the project is the Center on Privacy & Technology at Georgetown Law, a think tank that researches government use of facial recognition technology and its disparate impact on racial and ethnic minorities. "We study police use of face recognition, and all too often we find that this technology is being used with little or no accountability, oversight, and transparency. In many instances the vendors themselves are the best situated to know who is using automated facial analysis tools, for what purposes—and to anticipate and prevent uses that are harmful or irresponsible," says Laura Moy, executive director of the Center. "We're pleased that with this pledge, vendors are publicly recognizing that they have an opportunity—and a responsibility—to do the right thing here." The Center's 2016 report, The Perpetual Line-Up, outlines how agencies across the country use the technology and offers policy recommendations, including model state and federal legislation.
Other civil liberties groups and advocates supporting Buolamwini's effort include Data4BlackLives, a group of activists, organizers, and mathematicians committed to using data science to create concrete and measurable change in the lives of Black people; Noel Sharkey, principal spokesperson for the Campaign to Stop Killer Robots, a coalition of NGOs working to ban fully autonomous weapons and thereby retain human control over the use of force; and PolicyLink, a national research and action institute advancing racial and economic equity.
"Research shows facial analysis technology is susceptible to bias and even if accurate can be used in ways that breach civil liberties. Without bans on harmful use cases, regulation, and public oversight, this technology can be readily weaponized, employed in secret government surveillance, and abused in law enforcement," warns Buolamwini.
Visit www.safefacepledge.org to read more about the project, and www.ajlunited.org to learn more about the Algorithmic Justice League's research into the social impact of artificial intelligence.
SOURCE Algorithmic Justice League