Advances in technology provide all kinds of benefits, but also introduce risks — especially to already marginalized populations. AI for the People’s Mutale Nkonde, disability rights lawyer Haben Girma, and author of Algorithms of Oppression Safiya Umoja Noble have studied and documented these risks for years in their work. They joined us at TC Sessions: Justice 2021 to talk about the deep origins and repercussions of bias in tech, and where to start when it comes to fixing them.
On bias in tech versus bias in people
When it comes to identifying bias in tech, there are two ways of coming at it: the tech itself and the people who put it to work. A facial recognition system may be racist in itself (such as working poorly with dark skin) or used in furtherance of racist policies (like stop and frisk).
Nkonde: There is the problem of technologies which are inherently racist, or sexist, or ableist, as Haben so beautifully pointed out. But there is another part… an imagination for technologies that could actually serve all people. And if the scientists who are creating those technologies don’t have experience outside of their own experiences, and we’re sitting in a moment where Google AI has got rid of [Margaret] Mitchell and Timnit Gebru, both of whom were technologists from, researchers from, minoritized communities who are thinking about new and different ways that tools could be designed… then you may not see them coming to products. I’d say that the two are definitely married. (Timestamp: 3:00)
- Twitter and Zoom’s algorithmic bias issues
- Google fires top AI ethics researcher Margaret Mitchell
- What’s next for Dr. Timnit Gebru
On the danger in ‘banal’ technologies
Bias does not only exist in controversial tech like facial recognition. Search engines, algorithmic news feeds, and other things we tend to take for granted can also contain harmful biases or contribute to them.
Noble: My concerns were with what we might think of as just banal technologies, things that we really don’t give a second thought to, and that also present themselves as widely neutral and valuable. Of course this is where I became interested in looking at Google search, because Google’s own kind of declaration that they were interested in organizing all the world’s knowledge was, I think, a pretty big claim. I’m coming out of the field of Library and Information Science and thinking about, I don’t know, thousands of years of librarians, for example, around the world, who have been indeed organizing the world’s knowledge, and what it means to have an advertising company, quite frankly, data mine our knowledge, but also commingle it with things like disinformation, propaganda, patently false information and ideas, and really flatten our ability to understand knowledge and good information. (Timestamp: 5:13)
- Google’s latest user-hostile design change makes ads and search results look identical
- Google threatens to close its search engine in Australia as it lobbies against digital news code
On how excluding groups harms them twice over
Haben Girma, who is deafblind, has advocated for accessibility with the skills she learned at Harvard Law. But the lack of accessibility goes deeper than uncaptioned images and other small oversights.
Girma: So most of the technology that’s built was not imagined for disabled people, which is frustrating… and also absolutely ridiculous. Tech has so much potential to exist in visual forms, in auditory forms, in tactile forms, and even smell and taste. It’s up to the designers to create tools that everyone can use. (Timestamp: 0:56)
A disturbing viral trend on TikTok recently questioned the story of deafblind icon Helen Keller. Doubt that she existed as described, or did the things she is credited with, spread widely on the platform. And because TikTok is not designed for accessibility, others like Keller are not only the subject of false claims but are excluded from the conversation and effectively erased from consideration.
Girma: Deafblind people have used technology for quite a while, and were early users of technology, including being designers and engineers. We are on many of the social media platforms; there are blind and deafblind people on Twitter. TikTok was not designed with accessibility in mind.
When you have a space where there are few disabled people, ableism grows. People on TikTok have questioned the existence of Helen Keller, because the people on the platform can’t imagine how a deafblind person would write a book or travel around the world, things that Helen Keller is well documented to have done. And there’s also lots of information on how blind and deafblind people are doing these things today, writing books today, using technology today. So when you have these spaces where there are no disabled people, or very few disabled people, ableism and negative biases grow more rapidly. And that’s incredibly harmful, because the people there are missing out on talented, diverse voices. (Timestamp: 12:16)
- White House, dark mode: Biden admin refreshes presidency’s website, vows accessibility
- Evinced raises $17M to speed up accessibility testing for the web
On tech deployed against black communities
The flip side of racism within tech is ordinary tech being used by racist institutions. When law enforcement employs “objective” technology like license plate readers or biometric checks, it brings its own systemic biases and troubling objectives.
Nkonde: One of the things that really came out of that was this whole host of technologies that, when used by security forces or police, reinforce these discriminatory impacts on black communities. So that could be the way license plate readers were used by ICE to identify cars, and when they pulled people over, they would do these additional biometric checks, whether it was fingerprinting or iris readers, and then use that to criminalize these people onto the road to deportation. (Timestamp: 17:16)
And when the two forms of bias are combined, certain groups are put at a serious disadvantage:
Nkonde: We’re seeing how all of these technologies on their own are impacting black lives, but imagine when all of those technologies are together. Imagine when, here in New York, I walk to the subway to take a train because I have to go to work, and my face is captured by a CCTV camera that could wrongly put me at the scene of a crime because it does not recognize my humanity, because black faces are not recognized by those systems. That’s a very old idea that really takes us back to this idea that black people aren’t human, they’re in fact three-fifths of a human, which was at the founding of this country, right? But we’re reproducing that idea through technology. (Timestamp: 19:00)
- For Seattle’s cop-free protest zone, tech is both a revolutionary asset and disastrous liability
- Racial disparity in Chicago cops’ use of force laid bare in new data
On the business consequences of failing to address bias and diversity
While companies should be trying to do the right thing, it may help speed things up if there’s a financial incentive as well. And increasingly, there is real liability in failing to consider these problems. For instance, if your company produces an AI solution that’s found to be seriously biased, you may not only lose business but find yourself the subject of civil and government lawsuits.
Noble: I think that, first of all, there’s a tremendous amount of risk in not taking up these issues. I’ve heard that the risk management profile for a company like Facebook, for example, in terms of harm, what they can’t solve with software and AI, that they quite frankly use human beings to sort through, is probably estimated at around $2 billion, right?
If you’re talking about a $2 billion risk, I think then this is a decision that exceeds the design desires of software engineers. (Timestamp: 24:25)
Not just bias but also unintended consequences need to be considered, such as how an app or service may be abused in ways its creators might not have thought of.
Noble: I think you have to think far beyond, you know, like, what you can do versus what you should do, or what’s ethical and responsible to do, and I think these conversations now can no longer be avoided. This is a place where founders, venture capitalists, everything, every VC in the Valley on Sand Hill Road should have a person who is responsible for thinking about the adverse effects of the products that they might invest in. (Timestamp: 25:43)
- DE&I at Facebook, Prop 22 and gig worker earnings
- Facebook fined again in Italy for misleading users over what it does with their data
On getting people in the room before, not after the crisis
The tendency to “ship it and fix it” rather than include accessibility from the ground up is increasingly being questioned by both advocates and developers. Turns out it’s better for everyone, and cheaper in the long run, to do it right the first time.
Girma: The answer to most of these questions is to have the people involved. ‘Nothing about us without us’ is the saying in the Disability Justice Movement, so if these VCs and companies are thinking about investing in a solution that they think will be good for the world, ask disability justice advocates, get us involved. (Timestamp: 29:25)
We need the VCs to also connect with Disability Justice advocates, and really find someone who has knowledge and background in accessibility and tech. Same thing for any company: whether their technology already exists or is in the process of being built, they should be consulting on accessibility. It’s easier to make something accessible if you design for accessibility, rather than trying to make it accessible afterwards. It’s like having an elevator in a physical building. You don’t build the structure and then think about adding an elevator; you think about adding an elevator before you design it. (Timestamp: 30:55)
- Microsoft offers new accessibility testing service for PC and Xbox games
- Fable aims to make disability-inclusive design as simple as a service
Read the full transcript here.