BIZTECH: Speaking of AI, businesses of all sizes are in a race to roll out these initiatives. But executed too hastily, those efforts can backfire. What are some of the dangers of not having inclusive voices in the AI development and engineering process?
MANCINI: Think about the fallout from Meta’s “Liv” chatbot, which was presented as a queer Black woman. While engaging with Liv, users discovered that Liv loves fried chicken, collard greens and spilling the tea. There was backlash and speculation that Liv was developed by white men who would say anything to drive engagement.
BIZTECH: You’re referring to reviewers who called Liv “digital blackface”?
MANCINI: Yes. Now, think of how dangerous that becomes in mental health, in banking and in schools. The Washington Post recently reported on wrongful arrests driven by facial recognition technology. A study reported by Forbes found that AI bias made lenders 80% more likely to reject Black applicants.
BIZTECH: What are some ways that IT leaders could reduce AI bias?
MANCINI: Some assume that if you just remove race from the code, we’re back on track, but it’s not quite that simple. Chatbots are trained on an aggregation of data around cultural vernacular, language and keywords. There’s an incredible woman on LinkedIn named Aliyah Jones who tested racial bias. After a period of unemployment with no interview invitations, she created a fake “catfish” profile, and suddenly her interview invitation rate went up to 57%. Why? According to the National Bureau of Economic Research, ethnic names receive 50% fewer callbacks for interviews.
BIZTECH: Is it all AI’s fault, or are we also partly to blame? If AI models are supposed to replicate our experiences, how can we govern that data so it’s more accurate?
MANCINI: These technologies are far from perfect. Right now, these AI tools only work with the information they’re given, which is drawn from historical data. Why are we not doing a better job of being thoughtful about creating spaces where we have a voice and we own our own data?
That’s why women of all backgrounds need to be in the room to make sure the data set is full and multidimensional. Google recently rolled out its Pixel campaign, which recognizes more skin tones, and that can ultimately be the difference between skin cancer being detected or missed on darker skin. If you’re building out the data sets, it’s like a story, and you can see when someone is missing.
BIZTECH: In terms of this identification piece, what steps can IT leaders take to help ensure that Black and brown women see themselves in this emerging technology?
MANCINI: The work has to continue. We need technologies that don’t put us in a box, that create space for us to grow. And everyone needs to have a say in how the technology is developed.
And it doesn’t happen through one rags-to-riches story of a single woman who is given a rare opportunity. It’s gradual and additive. When technologies are built by us for everyone, things change. When you think of what the future could look like with Black and brown women involved in AI development, it’s about creating technology that is safe, inclusive, equitable and joyful for everyone.
BIZTECH: What would you say to skeptics who say AI is too advanced? Or that the system isn’t set up to allow for this level of change?
MANCINI: It’s not too late to do this. I spoke recently at Princeton, and when I asked if there was a Sam Altman in the room, there was an audible gasp. Why not? When I ask people who’s coding their future, the question is: Should it be you?