Women and AI: Vanessa Kofinti Argues That Ghana’s AI Ambitions Could Leave Women Behind

Vanessa Kofinti, in a detailed presentation at Tech Labari's Women and AI event, revealed that 80% of educated Ghanaians surveyed don't even know their country has a national AI strategy.


Ghana is moving fast on artificial intelligence. The country’s National AI Strategy 2025–2035 positions AI as a core engine of growth, targeting healthcare, agriculture, financial services, and public administration.

Projections suggest AI could contribute up to 500 billion Ghana cedis to GDP by 2035. That’s an ambitious number — and an important one. But ambition, on its own, doesn’t guarantee fairness.

Vanessa Kofinti, a legal officer at FIDO who presented at the Tech Labari Women and AI event on 13th March 2026, asks an uncomfortable question: who exactly is Ghana's AI future being built for?

Most Ghanaians Are Using AI Without Knowing It

In her presentation, Kofinti cited a survey of 102 educated Ghanaians between the ages of 25 and 34, half of whom held postgraduate degrees.

The findings are striking. 54% use generative AI tools like ChatGPT, Gemini, or Claude every single day. Yet 80% had never heard of Ghana’s National AI Strategy.

Their average self-rated AI knowledge score? 2.57 out of 10. Their average awareness of their own privacy rights when using these tools? 2.76 out of 10.

In other words: millions of Ghanaians are already living inside an AI-powered world, but most have little understanding of how it works, who controls it, or what rights they hold within it.

The Gender Data Desert

Here is where the problem gets structural. AI systems learn from data. And across Africa, the data being used to train these systems is deeply incomplete — especially when it comes to women.

Research found that only 17% of African countries report gender-disaggregated data on internet use. Only 11% track mobile phone ownership by gender. And a mere 6% report gender data on computer programming skills or engineering and technology researchers.

The consequences are not hypothetical. When AI systems are trained on data that doesn’t represent women, they produce outcomes that harm women. A health AI tool trained mostly on male clinical data performs worse for female patients.

A credit-scoring algorithm built on urban transaction data will disadvantage rural women traders — the exact use case being explored in Ghana’s fintech sector. A language model trained on English text simply will not work for Ghanaians who communicate primarily in Twi, Ga, or Hausa.

Ghana’s own statistics make this urgency clear.

According to the Ghana Statistical Service (2021), 4.6 million women in Ghana are illiterate, compared to 3.3 million men.

As of 2022/2023, 15% of primary schools had no functioning ICT facilities. The girls who go to those schools are not just behind — they are being actively excluded from systems that are being built, as Kofinti puts it, “in their name and, increasingly, about them.”

The Ethics Question Nobody Is Asking

Much of the global conversation about AI ethics has been led by institutions in Europe and North America. Frameworks like the EU's Ethics Guidelines for Trustworthy AI and UNESCO's Recommendation on the Ethics of Artificial Intelligence are widely cited.

But they were also developed primarily in contexts with strong regulatory enforcement, high public awareness of digital rights, and relatively well-funded institutions to implement them.

Ghana is a different environment. Imported wholesale, the EU's non-binding, "soft law" approach to AI governance may simply produce frameworks that look good on paper but deliver little real-world protection, particularly for women and other vulnerable groups.

The study calls for Ghana to resist copying Western frameworks wholesale and instead build regulations that reflect local realities, local power dynamics, and local enforcement capacity.

There is also the question of data ownership. Africa is rapidly becoming a net exporter of raw data and a net importer of AI tools built on that same data — with ownership concentrated in a handful of US and Chinese platforms.

Kofinti’s presentation showed that 46% of respondents believed individuals should control their own data, yet public understanding of how data economics actually works remains very low.

What Needs to Change

The study makes several concrete recommendations for policymakers and builders alike.

On the policy side: gender-disaggregated data should be mandated across all national AI statistics. Regular bias audits, covering both gender and linguistic disparities, should be built into the National AI Strategy 2025–2035.

The Data Protection Commission should be upgraded to a full Authority with real enforcement powers — not just advisory teeth.

For developers and product builders: partner with local language and AI research initiatives like Ghana NLP, Wiki in Africa, and the KNUST Responsible AI Lab. Build consent mechanisms and data rights disclosures that work for users with low literacy.

And ask, from the very beginning of the design process: how could this product be used to harm a woman?

For the general public, the ask is simpler but no less important: get curious. Read about your data rights. Ask what happens to your information when you use a free app. Engage with AI literacy programmes. As the research notes, the desire to learn already exists — the ecosystem just needs to meet people where they are.


Joseph-Albert Kuuire is the creator, editor, and journalist at Tech Labari. Email: joseph@techlabari.com Twitter: @jakuuire