I built this 'AI aunt' for women after family tragedy in South Africa


A gruesome killing in her own family inspired South African Leonora Tima to create a digital platform where people, mostly women, can talk about and track abuse.

Leonora's relative was just 19 years old, and nine months pregnant, when she was killed, her body dumped on the side of a highway near Cape Town in 2020.

"I work in the development sector, so I've seen violence," Leonora says. "But what stood out for me was that my family member's violent death was seen as so normal in South African society.

"Her death wasn't published by any news outlet because the sheer volume of these cases in our country is such that it doesn't qualify as news."

The killer was never caught and what Leonora saw as the silent acceptance of a woman's violent death became the catalyst for her app, Gender Rights in Tech (Grit), which features a chatbot called Zuzi.

This is one of the first free AI tools made by African creators to tackle gender-based violence.

"This is an African solution co-designed with African communities," says Leonora.

The aim is to offer support and help gather evidence that could later be used in legal cases against abusers.

The initiative is gaining interest among international women's rights activists, although some caution that chatbots should not replace human support, emphasising that survivors need the empathy, understanding and emotional connection that only a trained professional can provide.

Leonora and her small team visited communities in the townships around her home in Cape Town, speaking to residents about their experiences of abuse and the ways technology fits into their lives.

They asked more than 800 people how they used their phones and social media to talk about violence, and what stopped them from seeking help.

Leonora found that people wanted to talk about their abuse, but "they were wary of traditional routes like the police".

"Some women would post about it on Facebook and even tag their abuser, only to be served with defamation papers," she says.

She felt that existing systems were failing victims twice: first by failing to prevent the violence itself, and then again when victims tried to speak up.

With financial and technical support from Mozilla, the Gates Foundation, and the Patrick McGovern Foundation, Leonora and her team began developing Grit, a mobile app that could help people record, report and get a response to abuse while it was happening.

The app is free to use, though downloading it requires mobile data. Leonora's team says it has 13,000 users and received about 10,000 requests for help in September.

At its core, Grit is built around three key features.

On the home screen is a large, circular help button. When pressed, it automatically starts recording 20 seconds of audio, capturing what's happening around the user. At the same time, it triggers an alert to a private rapid-response call centre - professional response companies are common in South Africa - where a trained operator calls the user.

If the caller needs immediate help, the response team either sends someone to the scene themselves or contacts a local organisation that can go to the victim's aid.

The app was built with the needs of abuse survivors at its core, says Leonora: "We need to earn people's trust. These are communities that are often ignored. We are asking a lot from people when it comes to sharing data."

Image caption: Zuzi has been created with the help of women in communities around Cape Town (Image source: Grit)

When asked whether the help feature has been misused, she admits there have been a few curious presses - people testing to see if it really works - but nothing she'd call abuse of the system.

"People are cautious. They're testing us as much as we're testing the tech," she says.

The second element of Grit is "the vault", which Leonora says is a secure digital space where users can store evidence of abuse, dated and encrypted, for possible use later in legal proceedings.

Photos, screenshots, and voice recordings can all be uploaded and saved privately, protecting crucial evidence from deletion or tampering.

"Sometimes women take photos of injuries or save threatening messages, but those can get lost or deleted," Leonora says. "The vault means that evidence isn't just sitting on a phone that could be taken away or destroyed."

This month, Grit will expand again with the launch of its third feature - Zuzi, an AI-powered chatbot designed to listen, advise, and guide users to local community support.

"We asked people: 'Should it be a woman? Should it be a man? Should it be a robot? Should it sound like a lawyer, a social worker, a journalist, or another authority figure?'" Leonora explains.

People told them that they wanted Zuzi to be "an aunt figure" - someone warm and trustworthy, who they could confide in without fear of judgment.

Image caption: Leonora (R) and her colleagues at Grit say that their work gives women a sense of control over abuse (Image source: Grit)

Although Zuzi was built primarily for women experiencing abuse, during the testing phase it has also been used by men seeking help.

"Some conversations are from perpetrators, men asking Zuzi to teach them how to get help with their anger issues, which they often direct at their partners," Leonora explains. "There are also men who are victims of violence and have used Zuzi to talk more openly about their experience.

"People like talking to AI because they don't feel judged by it," she adds. "It's not a human."

UN Women reports that South Africa experiences some of the world's highest levels of gender-based violence (GBV), with a femicide rate that is five times higher than the global average. Between 2015 and 2020, an average of seven women were killed every day, according to South African police.

Many, including Lisa Vetten, a specialist in gender-based violence in South Africa, agree that technology will inevitably play a role in addressing it.

But she also urges caution around the use of AI in trauma-centred care.

"I call them Large Language Models, not artificial intelligence because they engage in linguistic analysis and prediction - nothing more," she says.

She can see how AI systems may be able to help, but knows of examples where other AI chatbots have given incorrect advice to women.

"I worry when they give women very confident answers to their legal problems," she says. "Chatbots can provide helpful information but they are incapable of dealing with complex, multi-faceted difficulties. Most importantly, they are not a substitute for human counselling. People who have been harmed need to be helped to trust and feel safe with other human beings."

Image caption: Lyric Thompson wants more women to be involved in developing AI (Image source: @judith.Litvine/MEAE)

Grit's approach has drawn international attention.

In October, Leonora and her team presented their app at the Feminist Foreign Policy Conference hosted by the French government in Paris, where global leaders met to discuss how technology and policy can be used to build a more gender-equal world. At the conference, 31 countries signed a pledge to make tackling gender-based violence a key policy priority.

Conversations are buzzing around the use of AI, says Lyric Thompson, the founder and head of the Feminist Foreign Policy Collaborative, "but the moment you try to include gender in the conversation, to raise the dangers of racist, sexist and xenophobic bias being baked in, eyes glaze over and the conversation shifts - likely to a back corridor where there aren't any pesky women around to raise it".

Heather Hurlburt - an associate fellow at Chatham House, specialising in AI and its use in technology - agrees that AI "has enormous potential either to help identify and redress gender discrimination and gender-based violence, or to entrench misogyny and inequity", but adds that which way we go is "very much up to us".

Leonora is clear that the success of AI to tackle gender-based violence depends not just on engineering, but on who gets to design technology in the first place.

A 2018 World Economic Forum report found that only 22% of AI professionals globally were women, a statistic that is still often cited.

"AI as we know it now has been built with historic data that centres the voices of men, and white men in particular," Leonora says.

"The answer is not only about having more women creators. We also need creators who are women of colour, more from the global south, and more from less privileged socio-economic backgrounds."

Only then, Leonora Tima concludes, can technology begin to represent the realities of those who use it.
