Don’t Hate the Player

Fixing Bias in AI: What We Can Learn from Texas Women
By Heather Shoemaker, Founder & CEO
Last week I spoke at the inaugural Texas Women’s Leadership Conference in Austin. Unlike most of the 800 women in attendance, I don’t live in Texas. This Wyomingite was in Texas because Jinous Rouhani, a member of the Texas Governor’s Commission for Women, contacted me on LinkedIn after I spoke at South by Southwest (SXSW) on the topic of gender and culture bias in AI. Jinous said she didn’t know many women in AI, but she sensed the topic needed to be addressed at her conference. So the night before my panel, I lay awake in my very nice bed at the AT&T Hotel, Conference & Event Center, thinking about what I would say.
Never mind that I had memorized answers to the scripted panel questions weeks earlier. As the founder and CEO of Language IO, an AI company that provides multilingual customer support technology, I speak quite a bit about AI and bias in AI. Our red-teaming strategy to detect bias in a well-known commercial large language model (LLM) was published in Fortune, and I’ve been quoted in Forbes on the topic.
But this presentation would be different.
Normally, I present at tech conferences: progressive stages such as SXSW and, most recently, one in the Netherlands. I was named one of the top 100 women in AI at an NYC awards ceremony late last year. As a female technologist, I care quite a bit about how AI will affect women and girls. As a white woman, I also recognize that my experience with sexism is very different from that of a woman of color or of groups more marginalized than mine.
Equitable AI isn’t about removing bias in the abstract. It’s about recognizing that bias is experienced differently — and designing systems that honor those differences instead of erasing them. But when it came down to what I should say about it in a conservative southern state, I worried about how the topic would land.
It’s Not the Player. It’s the Game.
We are living through a transition, a moment in which generative AI is revolutionizing how we function as a society. If women in positions of leadership don’t lean in and participate in the design and evolution of this self-learning technology, the bias that has plagued women for centuries in real life (IRL) interactions is going to propagate at scale. If AI is the game, we are the players.
There’s a popular saying, “Don’t hate the player, hate the game,” made famous by Ice-T’s 1999 song. It suggests that individuals should not be criticized for taking advantage of the rules of a flawed system. In the case of AI, the game is most definitely flawed, and I was in Austin to talk about how the players, specifically Texas women, could take advantage of the rules to make it more equitable.
As I rode up the escalator to the conference the next morning, I wore my comfortable boots, a monochrome chocolate-brown pantsuit and 17-year-old-daughter-approved layered necklaces (“Mom, Texas is fancy. You’re not wearing jeans, are you?”). From what I could see, every other woman on that escalator was in three-inch heels and a skirt, her hair perfectly blown out and makeup artistically applied.
Acting on a premonition, I had left my practical black belt bag at home. But the red Coach purse I did bring was sitting in the speakers’ green room surrounded by Birkins. Much like myself on this escalator. The gift bags given to us by the organizers weighed more than the luggage I’d brought with me. All of this fueled my growing concern. Do coiffed women in skirts and heels carrying $30,000 bags care about gender inequities in AI? In society?
As my co-panelists and I headed backstage in preparation for our session “Powering the Future: Women Leading Tech & Emerging Innovation,” I walked next to fellow panelist Amina Al Sherif, Generative AI Lead for Google’s Public Sector. I was more than a little relieved to see that she was wearing comfortable-looking moccasins.
Now if I’m being honest, I regularly wore high heels back when Ice-T dropped “Don’t Hate the Playa,” and I’m in no way saying that every woman who wears heels is oppressed. I personally stopped wearing them because high heels make me feel at a physical disadvantage, especially in front of 800 people I don’t know. But gender bias goes a lot deeper than heels, which are the least of my concerns when it comes to AI.
Generative AI as a Reflection of Society
Generative AI is a reflection of society – or at least of the content society has published on the Internet, because that is the content generative AI is trained on. As a result, AI reflects the gender and cultural bias of the groups that authored that content. The overwhelming majority of the text used to train today’s models is English – much of it American English. When it comes to AI, English-speaking, cisgender, white male points of view are disproportionately represented. But let’s back up to the pre-AI era and acknowledge that for centuries, women and marginalized populations have faced significant bias in in-person interactions.
Whether it’s a medical care decision, the hiring process at a company or employee performance reviews, we know these processes are rife with bias. Multiple studies by prestigious universities have demonstrated that generative AI reflects society’s inherent gender and cultural biases.
A 2024 study published by UNESCO, involving researchers from universities around the globe, found that LLMs reinforce gender-based stereotypes: when asked to write about male and female professions, the models assigned men diverse roles such as teacher, doctor and driver, while assigning women jobs such as prostitute, domestic servant and cook.
A Cornell University-led study published in 2024 found that while cultural values and traditions differ across the globe, the LLMs that power generative AI have a tendency to reflect values from English-speaking and Protestant European countries.
Containing Bias
At Language IO, because we provide AI technology for multilingual customer support, we are in the business of making sure AI output is culturally adapted and does not simply reflect English-speaking, white, Protestant ideals. The companies that use the Language IO platform care about gender inclusivity as well, so that all of their customers – regardless of gender – come away with a positive customer experience. To make this happen, we employ the tools of the AI game that allow us to move the needle toward a more equitable experience, regardless of culture, race or gender.
The beauty of AI is that, unlike biased interactions that occur IRL, patterns of bias in AI output are observable and measurable. It’s all right there in front of us. The downside is that if nothing is done to address the inequities we see, they will propagate faster than they ever could IRL. But generative AI simultaneously provides us with tools to filter and fine-tune this output in ways that can tip the scales in favor of equality. As the UNESCO study points out:
“AI could potentially advance the aims of gender equality and equity worldwide if, for instance, it is harnessed ethically and inclusively, or if it is developed by diverse teams which aim for positive societal impacts, and more generally, if it is designed to mitigate — rather than perpetuate — inequality and gender disparity in its interactions with society.”
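The “observable and measurable” point can be made concrete with a small sketch. The code below uses invented data (hypothetical occupations and pronoun counts, not Language IO’s actual tooling or the UNESCO methodology) to show the basic shape of an audit: collect many model completions about different professions, then tally how often each profession is described with a masculine pronoun.

```python
# Hypothetical completions, standing in for what an LLM might return
# when repeatedly prompted to describe a typical worker in each role.
# In a real audit these pairs would come from many model calls.
completions = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
    ("engineer", "he"), ("engineer", "he"), ("engineer", "she"),
]

def pronoun_skew(samples):
    """For each occupation, return the share of completions that used
    a masculine pronoun. A value far from 0.5 suggests the model
    associates that role with one gender."""
    counts = {}
    for occupation, pronoun in samples:
        total, masc = counts.get(occupation, (0, 0))
        counts[occupation] = (total + 1, masc + (pronoun == "he"))
    return {occ: masc / total for occ, (total, masc) in counts.items()}

print(pronoun_skew(completions))
```

Once a skew like this is quantified, the same numbers become a regression test: rerun the audit after changing prompts or fine-tuning, and check whether the shares move closer to parity.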
So once we four panelists had taken our seats onstage, in front of a sea of Texas women, the moderator eventually asked about the guardrails necessary to ensure that AI is inclusive and trustworthy. In an earlier question, I had been asked to describe what it was like to be one of an extreme minority of women in corporate tech.
When I talked about the fact that none of my male colleagues had to pump breast milk in stinky bathroom stalls multiple times a day as I did, it resonated with the audience. There was laughter and clapping. But when the “guardrails” question came and I talked about the research demonstrating culture and gender bias in AI, and how important it is for female leaders to lean in and use prompts and training mechanisms to correct the biased behavior, the audience went quiet.
I tried to brush off the apparent lack of interest in AI bias. These women weren’t, after all, at a tech conference. This wasn’t SXSW or the Netherlands. It was Texas. But I would be lying if I said I wasn’t disappointed. After the conference we got a group photo and headed back to the green room where I rescued my poor Coach bag from the Birkins and started to pack up.
But then the ladies in high heels approached.
One of them said that after hearing our panel, she was both more scared about AI and reassured that we have the means to fix things. We talked about what we should let our kids have access to. They loved hearing about how my teenage daughter helped me find a fancy-enough outfit for their audience, given that Texas fashion expectations would be a bit more elevated than those in Wyoming.
After I flew home, women from that Austin audience continued to reach out to me on LinkedIn. Angie Smith, a best-selling author in attendance, sent me the most memorable message: “Oh my goodness, Heather, you were amazing!! I had so much fun listening to your perspective on the panel and it was fun to see you and Amina navigate your discussion. Gosh, when you shared about pumping in the bathroom, I resonated with you on that. I would love to have a virtual coffee with you in the next couple of weeks.”
In the spirit of the Ice-T song, I have begun to feel like these women are truly impressive players in the game. I can’t hate them for looking great, donning three-inch heels and working the unique rules of Texas to their own advantage. I came away with great respect for the advice from other panelists on what it means to be a good leader. A leader is only as good as their team. A leader takes time for self-care. A leader recognizes when it’s time to let someone else lead. Sometimes a leader struts onto that stage in high heels and feels powerful. Maybe next time that will be me.