Women and AI: Emily Fiagbedzi Speaks About Women Breaking Into AI Without Code


Non-technical women are not waiting at the gates of the AI industry. According to Emily Fiagbedzi, AI Startup Program Director at MEST Africa, they are precisely what the industry is missing — and the systems shaping their lives cannot afford for them to stay out.

Speaking at the Women and AI event in Accra on March 13, 2026, Fiagbedzi delivered a talk that reframed the question most women in non-technical fields are quietly asking: Is there space for me in AI? Her answer was immediate, and it reoriented the entire premise of the question.

“AI is missing the context, relationships, and domain knowledge that make it work for real people in real places,” she told the audience. “Your expertise in health, law, trade, education, and agriculture is not a gap to fill before you can contribute. It IS the contribution.”

The Problem Is Not Code

Fiagbedzi opened with a provocation: “More code is not the shortage. More humanity is.” She illustrated this with three examples drawn from real AI deployments — each a failure that no technical fix could have prevented.

A maternal health app launched across several communities achieved near-zero adoption — not because the technology failed, but because the design team did not know that in those communities, a woman’s health decisions involve her family first.

A credit algorithm penalized women for gaps in transaction history, reading shared phone usage among family members as financial unreliability rather than a common social practice. And community adoption of new tools, she argued, is built on years of trust and presence — something no engineering sprint can manufacture.

“The people who would have caught that weren’t in the room,” she said of each failure. The refrain was deliberate.

AI is already making decisions that affect you — who gets hired, who gets a loan, who gets healthcare, whose voice gets heard. If you are not in the room where these systems are built, someone else is making those decisions on your behalf.

Emily Fiagbedzi

Where the Bias Lives

Fiagbedzi was direct about the stakes. AI systems are not neutral tools — they carry the biases of the data they were trained on and the blind spots of the teams that built them. She catalogued four domains where this plays out in ways that disproportionately affect women:

How AI Systems Are Already Failing Women

  • Hiring – CV screening tools reflect who got hired historically. Historically, that wasn’t women. The algorithm repeats it.
  • Credit & Lending – Informal traders — many of them women — hold fewer formal financial records. Algorithms read that as risk, not reality.
  • Recognition Tools – Facial recognition has documented error rates significantly higher for women with darker skin tones, deployed in hospitals, banks, and border controls worldwide.
  • AI-Generated Advice – Large language models default to Western norms when answering health, legal, and financial questions. For many women globally, the model’s “normal” is not their normal.

“Every one of these failures has the same root cause,” she said. “The right people weren’t in the room. You are not a nice-to-have. You are the corrective.”

Three Keys That Open the Door

Rather than treating AI as a monolith requiring a computer science degree to enter, Fiagbedzi mapped the full lifecycle of an AI product — from defining the problem, to gathering data, building and testing, deploying within communities, and governing the system — and pointed out where code is actually required.

Building AI is 20% code. 80% everything else.

The implication was clear: the vast majority of the work that makes AI trustworthy, fair, and effective is non-technical. And the roles that perform that work are real, open, and hiring now.

A Lesson They Lived

One of the most candid moments in the talk came when Fiagbedzi described her own organisation’s experience. The team had been intentional about outreach and had directly invited women to apply. Yet when they rebranded their programme as an “AI Startup Program,” applications from women dropped.

“The word ‘AI’ itself carried a signal that no invitation could immediately override,” she said. The lesson she drew from it shifted the responsibility: “The question is not: how do we get women to be bolder? The question is: how do we change what that word signals?”

Reframing What “AI Work” Actually Means

Fiagbedzi closed by naming the dismissals plainly — and replacing them.

The Polite Dismissal → The Real Picture

  • “AI is for engineers” → AI needs domain experts, ethicists, and storytellers.
  • “You need to code to contribute” → You need to understand people, problems, and power.
  • “Tech spaces are for techies” → Tech spaces need lawyers, health workers, educators, and traders.
  • “Women without technical degrees are welcome to use AI products” → Women without technical degrees are building what those products need to work.

Her call to action was practical. For individuals: name the three problems in your sector that AI should solve — that list is your entry point. Find one team that needs you and send one email this week. And prepare the response when someone tells you AI isn’t your lane: “AI will affect everything in my lane. That’s exactly why I need to shape it.”

For the organisations in the room, the asks were sharper: audit your language, hire and fund non-technical roles explicitly and equally, and put women in the room as decision-makers from the start — not as advisors brought in after the decisions are already made.

“The AI systems being built right now will determine who gets hired, who gets loans, who gets healthcare for the next generation. The future of AI will be built by women who know their communities, who ask the right questions, and who refuse to wait for permission. That woman is in this room.”


Learn more about Tech Labari’s Women and AI Event


Joseph-Albert Kuuire is the creator, editor, and journalist at Tech Labari. Email: joseph@techlabari.com Twitter: @jakuuire