INSIGHT | INTERNATIONAL DAY TO END IMPUNITY FOR CRIMES AGAINST JOURNALISTS
AI is supercharging abuse against women journalists – but it doesn’t have to be that way
2 November 2025
Online gender-based violence is causing women journalists to self-censor and withdraw from public spaces, writes the BBC World Service’s Gender and Identity Correspondent, and AI is throwing fuel onto the fire. But can AI itself be part of the solution?

The theme for this year’s International Day to End Impunity for Crimes Against Journalists is Chat GBV: Raising Awareness on AI-facilitated Gender-based Violence against Women Journalists.
By Megha Mohan, BBC World Service Gender and Identity Correspondent
When the BBC World Service announced in 2018 that it would create a role dedicated to reporting gender and identity across our language services, and that I would be taking it on, I had no idea what kind of gendered scrutiny would follow.
I expected some criticism, of course, but nothing could have prepared me for how personal it would become. Soon, I was fielding intrusive questions about my gender and sexuality, and facing waves of online trolling every time a story went live. At one point, the harassment grew so severe that a Twitter employee reached out to ask if I needed support.
The trolling was intended to shame and silence me – a woman journalist reporting on deeply sensitive issues in regions where minority voices were rarely amplified. For example, only 5% of articles about conflict and war between 2013 and 2023 focused on women’s experiences, according to Care International.
It was hard when my role was created, but if my role were announced today, I fear the level of harassment would be even more alarming.
Now, abusers have new tools, new weapons at their disposal. Artificial Intelligence (AI), the astonishing technology transforming industries, communication, and creativity, has also changed what it means to be a woman journalist online.
Whether it comes from one person or an organised community, a single malicious prompt can now unleash a storm of deepfake videos or synthetic voices posing as podcasters. It is a form of AI-facilitated gender-based violence.
Around the world, female anchors have seen their faces grafted onto explicit videos that then spread through largely untraceable messaging apps faster than any fact-check could reach. This happened to one journalist I know; the fabricated video was shared so widely that it reached a group her father was in.
In other cases, journalists exposing corruption have been impersonated by AI-generated avatars that claim the reporters’ previously published stories were fabricated.
The AI-generated hate that can follow isn’t a glitch in an artificial system; it’s the system learning from humans – from decades of ineffectual online regulation of gendered abuse, now codified in datasets. As AI tools scrape content from the internet, they absorb the biases and vitriol that have always been directed at women in public life, and then they reproduce it at scale.
The gendered architecture of algorithms
What makes this moment particularly perilous is how invisible the bias can be. Algorithms themselves aren’t sexist; they just amplify what’s already loudest.
A recent Council of Europe report warned that AI systems can “spread and reinforce existing stereotypes, thus perpetuating discrimination and sexism”. It says that AI language models, image generators, and recommendation systems routinely replicate gender bias, sexualise women’s images, and undervalue female voices in datasets.
For women journalists, whose work often challenges social hierarchies, this becomes twice as dangerous.
Take social-media algorithms that determine whose stories appear in your feed. When reporting on women’s rights or LGBT issues, I’ve noticed how certain topics, particularly those challenging patriarchal norms, attract far more abuse.
Some algorithms appear to read that engagement as popularity, not hostility, and boost the posts further. In effect, the system is rewarding harassment.
It’s not just social media. AI moderation tools, designed to filter hate speech, can fail to detect the nuanced, gendered nature of abuse if they are not trained to recognise it. A slur disguised as a “compliment” or a dog whistle can easily slip past detection. The result is an uneven battlefield, where women are left more vulnerable.
The cost

For years, I, like many women journalists, was told to ignore the trolls. “Don’t feed them,” good-faith mentors and friends would say, as if silence would starve the abuse. In reality, what happened was that women disengaged online. I stopped regularly posting on my public social media profiles a while ago, and I am not alone.
A study by the International Women’s Media Foundation found that 70% of women journalists had experienced some form of harassment, threat, or attack online, and around one-third had considered leaving journalism because of it. Marginalised women journalists especially are targeted, leading to self-censorship or withdrawal from online or public spaces.
When women journalists censor themselves, entire areas of public life go underreported, areas we can access only because under-represented communities trust us: domestic violence, reproductive rights, gendered poverty, climate-linked inequality. The silencing of women in journalism becomes the silencing of women everywhere.
The future
Solidarity among women has become one of our most powerful defences. In October 2025, the French government brought groups from across the world together at a conference to discuss feminist foreign policy, and how to actively bring more women to the table in geopolitics. Collective action, not isolation, makes resilience sustainable. Including more women creators in technological innovation was a priority repeated across panels.
AI itself is not the enemy. Like any tool, its moral weight depends on how we wield it. In the same way that AI can be used to mimic voices, it can also be used to trace disinformation, authenticate content, and detect deepfakes before they spread.
Apps are being created to help recognise and tackle gender-based violence. For example, WeLivedit.AI is being developed specifically for journalists who experience online abuse, most of whom are women. It enables women to create personalised filters that mute unwanted messages. Similarly, TRFilter, launched by the Thomson Reuters Foundation, uses machine learning to identify abusive or threatening content directed at an account. However, these AI tools don’t actually prevent or stop abuse; they only manage, detect, or mitigate its impact.
If harnessed properly, though, newsrooms could use AI to better protect their journalists: real-time monitoring of harassment networks, tools that automatically flag deepfake attempts, and algorithms that learn empathy rather than hostility. It is possible.
However, for that to happen, more women must be at the table where AI is designed. A World Economic Forum report from 2018 says that only 22% of AI professionals globally are women – a figure that is still widely cited.
We bring perspectives – shaped by lived experience of bias, harassment, and resilience – that can make these systems safer for everyone. The future of AI ethics should not be discussed without the women it currently endangers.
The right to speak without fear
Ultimately, this is about more than technology. It’s about who gets to speak in public without fear.
AI has given misogyny a new vocabulary, but it can also help us write a different story, one where women journalists aren’t forced offline, where credibility isn’t questioned by algorithms trained on bias, where truth-telling isn’t punished with synthetic violence.
The purpose of journalism is to hold power to account, but that mission becomes harder to uphold when the women doing the work must also defend their own identities from digital distortion.
So the next time you read a woman journalist’s byline, remember what might lie behind it: long nights spent reporting, carrying the stories of survivors and victims of abuse, and days spent confronting harassment aimed at themselves. It’s exhausting work, but it is essential work that can and must be robustly defended.
When women journalists can report freely, the news you read reflects reality. It becomes a little more honest, and in an age shaped by artificial intelligence, that honesty offers something we all crave: a sense of trust, clarity, and hope.
About the author

Megha Mohan is the BBC’s first global gender and identity correspondent. She covers issues concerning women’s rights, LGBT communities, race and ethnicity, for the BBC’s 43 language services worldwide. She has reported on the black market for abortion pills in Honduras, femicide in Russia, and gender roles in North Korea’s army – a story that ranked among Chartbeat’s 100 most-read articles in the world.
Mohan was named in Progress 1000’s list of most influential storytellers, and she consulted on Level Up’s media guidelines on how to sensitively report on domestic abuse. She is also the co-founder of Second Source, a network of women journalists from under-represented backgrounds.