How AI is supercharging the digital Gender-Based Violence crisis in Kenya

National
By Jacinta Mutura | Dec 02, 2025

Brenda Oyiko Awinja, a victim of Technology-Facilitated Gender-Based Violence during the interview in Nairobi, on November 30, 2025. [Collins Oduor, Standard]

It began with a single message, an unexpected ping. Then another followed, and another, until the notifications no longer came quietly to her phone but spilled into her WhatsApp groups, professional circles, and finally into the hands of people she loved.

By the time her family was calling in confusion and worry, the online world had already erupted with a deepfake video bearing her face. A face she recognised and a body she did not.

For Damaris Aswa, a human rights defender from Kilifi, that was the moment her life fractured.

The video itself was designed to wound. It showed a woman with facial features similar to hers. The woman was dressed only in a bra and a small pair of underwear, moving suggestively as though performing for an unseen audience.

Aswa says the room did not resemble any space she had ever lived in, worked from, or even visited. She studied the clip closely, but nothing in the room was familiar. It was an artificially generated fake video made to malign her.

“In that video, I was in a room, dancing and showcasing my body. And I can’t even remember doing such a thing. I know people will judge and say, maybe I was with my boyfriend, but I wasn’t,” Aswa narrated.

“I was so shocked I tried to figure out where that was in particular. I was wondering if it was a hotel because my house doesn’t look like that. My parents’ house doesn’t look like it. That was not me, but it had my face,” she said.

Aswa’s ordeal is an example of the ever-growing crisis of Technology-Facilitated Gender-Based Violence (TF-GBV), abuse carried out through digital tools and intended to destroy victims’ reputation, credibility, safety and work.

Damaris Aswa, a victim of Technology-Facilitated Gender-Based Violence during the interview in Nairobi, on November 30, 2025. [Collins Oduor, Standard]

What followed was a traumatic experience of helplessness that Aswa said almost cost her her life. The video reached her colleagues, partners and professional networks, and eventually the people she cares about most: her family.

Calls, messages and emails from people who had seen the video flooded her phone.

For days, Aswa wearily repeated the same phrase, “It is not me,” pleading for her own reality to be believed. She fought desperately to convince anyone who had come across the video that it was a fake generated with Artificial Intelligence (AI).

“I felt like I was crashing. That was the hardest moment of my life,” she recalled.

Every share, like or comment on the video fuelled the lie, and the spiralling violence targeted at her spread faster than her own truth ever could.

“Imagine receiving calls and emails from all your networks and your organisation because of a fake video about you, and just telling people that this is not me. It is my face, yes, but this is not me. Just telling people that it was not me was really hard. I was hurt and depressed,” she recalled.

For Aswa, the deepest blow came from the institutions meant to safeguard her. When she first walked into a police station to report the attack, what she found was victimisation, ridicule and indifference instead of the help she desperately needed.

One officer brushed her off, telling her she was not the only one with “such issues,” suggesting that it was self-inflicted.

“I remember the lady I found there (police station) scolded me, saying: madam you keep sending those things and when it goes viral you come here to bother us.”

Despite pleading her innocence, Aswa was kicked out of the station. 

In another visit to the station to track the progress of the matter, Aswa was allegedly told to ‘give something small’, a veiled demand for a bribe to facilitate her case.

She went online to seek help from organisations. She scrolled through toll-free numbers, including those advertised as offering 24-hour support, desperate for anyone who would listen, but even that failed. At that moment, she sank into loneliness and desperation.

Experts say that isolation is what sets TF-GBV apart from other forms of violence: as the abuse multiplies at the speed of technology, survivors are crowded into silence, pushed offline and eventually driven to emotional collapse.

John Gathii, digital resilience fellow at KICTANeT, during the interview in Nairobi, on November 30, 2025. [Collins Oduor, Standard]

Aswa was almost swallowed by that spiral. Her only lifeline was her family, particularly her parents, who refused to let her disappear.

They insisted she attend a global convention she had been invited to, even when she wanted nothing more than to retreat from the world.

“I wanted to text them and tell them I could not make it to the event. But I remember my parents telling me, you will go, because if you don’t go, then the people who did this will feel that they have won. And you should not let them win,” she recalled.

“My parents would tell me, we understand that the work you are doing is very risky, and you are being targeted for this. And the fact that you are a woman, this technologically facilitated GBV is very high,” Aswa narrated.

Two years later, she still carries the Occurrence Book number under which her case was recorded, proof of justice that never came.

“It was the hardest moment of my life, and anyone who has gone through this will actually agree with me. If you don’t have that support, or a community of people who can create safe space for you, you can easily succumb to worse things, like even committing suicide.

“I was targeted because of the work I do, including anti-corruption, social justice, democracy campaigns. And because I am a woman, and unmarried, it was easy for them to weaponise that. My family came through for me. Otherwise, I would have actually succumbed to suicide,” she reflected.

Her experience is not an isolated case of online violence and bullying. It is a window into the expanding shadow of TF-GBV in Kenya, where deepfakes, doxxing, non-consensual image circulation and coordinated online harassment have become powerful tools for silencing people, particularly women.

As the world marks the 16 Days of Activism to End Gender-Based Violence from November 25 to December 10 under the theme Unite to End Digital Violence against Women and Girls, TF-GBV remains a serious and escalating human rights violation affecting millions globally, particularly women and girls.

The United Nations Population Fund (UNFPA), the UN’s sexual and reproductive health agency, lists various forms of TF-GBV, including doxxing, cyberstalking, cyber mobbing, cyberbullying, image-based abuse, online harassment, image-based sexual abuse, shallow fakes, sextortion, online impersonation and gendered disinformation.

With the emergence and rapid spread of AI tools, TF-GBV has been amplified even further.

“TF-GBV has become prevalent and, it’s mostly targeted at girls and women. We find that AI makes it even easier for the perpetrators to make it even more reachable to more vulnerable women and girls. Because with AI, then you have deepfakes,” said John Gathii, digital resilience fellow at KICTANeT.

“A bad actor can place a face on just an innocent girl and perpetrate to be doing something at the wrong time. Then it becomes an online harassment,” he added.

The UN describes image-based sexual abuse as a form of TF-GBV that involves taking, sharing or threatening to share sexually explicit images without consent. It includes non-consensual sharing of intimate images, commonly referred to as “revenge porn”, upskirting, AI-generated sexual imagery or deepfakes, and cyberflashing.

Sextortion has also been classified as one of the common forms of abuse in the digital era, whereby someone uses digital means to blackmail another person, demanding money, sex or sex acts, or additional explicit images in exchange for not exposing intimate images or private information.

According to UNFPA, image-based abuse refers to the use of imagery, often sexual in nature, to objectify, exploit, humiliate or harass. It includes non-consensual sharing of intimate imagery, also known as non-consensual porn, and child sexual abuse material, in which a child is shown in sexually explicit situations.

Doxxing is described as a form of digital violence in which someone posts another person’s personal and sensitive information, including home and work addresses, telephone numbers, email addresses and family names, without permission.

Brenda Awinja, a TF-GBV survivor from Kakamega County, encountered violence in digital spaces when she was only 18 years old and in college.

She was a dancer at the time and would occasionally go out with friends, have fun and post pictures online.

The perpetrator was a man who kept pressing her for an intimate relationship, advances she always declined, to his chagrin.

“I was young and exploring the world. So I could post pictures of me in a bikini out there. One time he proposed for us to be in a relationship and I declined. And the aftermath was not okay. He started sending me nude pictures of him masturbating,” Awinja narrated. 
