
About This Episode
In this episode, our guest, Dr. Safiya U. Noble, author of Algorithms of Oppression: How Search Engines Reinforce Racism, discusses the impact of algorithmic injustice. She unpacks the hidden biases built into the platforms we use daily. This conversation sheds light on why reclaiming the power to tell our own stories is essential in the digital age.
About Dr. Safiya U. Noble
Dr. Safiya U. Noble is the David O. Sears Presidential Endowed Chair of Social Sciences and Professor of Gender Studies, African American Studies, and Information Studies at the University of California, Los Angeles (UCLA). She is the Director of the Center on Resilience & Digital Justice and Co-Director of the Minderoo Initiative on Tech & Power at the UCLA Center for Critical Internet Inquiry (C2i2). She also serves as a Director of the UCLA DataX Initiative, leading campus-wide work in critical data studies. Professor Noble is the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press), an examination of racist and sexist algorithmic harm in commercial search engines that has been widely reviewed in scholarly and popular publications. In 2021, she was named a MacArthur Fellow for her groundbreaking work on algorithmic discrimination.
In her words…
“[Technology of this kind doesn’t] look at the history of discrimination against African-Americans, or people of color, or poor people, or women. That’s not a mitigating factor in the way in which the systems are programmed. So this is one of the huge opportunities when we think about what technology could be in our society.”
“Imagine if we had more pro-social, pro-human rights, pro-civil rights kinds of technology design that accounted for histories of discrimination and oppression. Then we wouldn’t actually have the prevalence of technological redlining that we do and at the rate that we have it now.”
“We have to think about other ways of living and doing and being that can resist these totalizing kinds of systems that are about technological redlining or they are algorithms of oppression. And that might even mean that we have a whole pivot. I can see and I can believe that in my own lifetime, we might have a radical pivot away from technology and doing things differently because of a sense of future and possibility that is really dependent upon it.”
“We need deeper investment in pro-social, rights-protecting kinds of technologies and communications systems.”
“We underestimate the power of the local. We need to cultivate a range of skills, a range of networks, and a range of access points to each other.”
Questions Answered on This Episode
- In the context of our theme, Who Shapes the Story?, how do you see algorithms acting as hidden storytellers in society?
- From a social change perspective, what should the person who is trying to create or share a narrative be thinking about? How can they be mindful of the constraints of a particular platform? What do they need to know?
- For the person consuming the media, what can they do to sift fact from fiction, knowing that the algorithm is only feeding them certain things? How can they do their own due diligence within that? What should folks be thinking about?
- How should the platforms be leveraged? How should people be thinking about themselves as part of an ecosystem of storytellers?
- Can you talk a little bit about your definition of technological redlining? I think it’s really important for people to hear that term, to understand it, and hopefully to put some structure around it in a storytelling context.
- If you were sitting in an auditorium of communications leaders, or CEOs of foundations and nonprofits, who don’t want this story to die, what advice would you give them based on everything you know and understand about the platforms? What’s at risk, and what are the opportunities?