10.03.2026
Letting AI strip women is a new form of gender-based violence
It takes seconds to destroy a woman’s dignity online. Not because she chose to share something intimate. Not because she trusted the wrong person. But because someone decided to type a prompt into an artificial intelligence tool. An ordinary photo. Fully clothed. Taken from social media. In mere seconds, an algorithm strips it, sexualises it, and distributes it to hundreds.
There is no consent. No warning. No way to pull it back once it spreads. I know because I have gone through it, just like millions of other women. In just nine days, the Grok chatbot created and posted 4.4 million images, of which nearly one in two were sexualised images of women, highlighting the pervasive nature of this violation.
One woman recently shared that she felt “dehumanised and reduced to a sexual stereotype” after Grok digitally removed her clothes. She said, “It looked like me, and it felt like me, and it felt as violating as if someone had actually posted a nude picture of me.” Even if the image is fake, the violation is real.
Anyone can be a victim. Women who speak out and hold positions of responsibility, such as mayors, CEOs, and NGO leaders. Women who are visible and express their opinions. All women can be targeted because technology did not create hate; it just learned to automate it.
This is not simply a side effect of new technology. It is sexual exploitation, blackmail, and violence against women. Some call it digital rape, and if that word makes you uncomfortable, that is the point. Human dignity can be destroyed without anyone ever touching a body.
The harm does not stop with the image. Women become silent and withdraw, hesitating before posting, speaking, or running for office. Young girls learn early that being seen can lead to punishment. When fear drives women out of public spaces, our society does not just bend, it cracks. Thousands of women leave social platforms or abandon leadership roles annually due to harassment. This not only silences individual voices but also deprives our community of varied viewpoints and potential leaders.
No, artificial intelligence is not the one to blame. That excuse is easy but not true. Algorithms do not work alone. Platforms make money from this. Grok reportedly made $88 million in the third quarter of 2025 and could make nearly $300 million this year from subscriptions and new models, continuing to feed this cycle of abuse. Abuse spreads because safeguards are weak, responses are slow, and responsibility keeps getting passed around. Europe has digital rules, but if they are not enforced, they are just for show. We have let systems move faster than justice, and women are left to pay the price.
Even xAI’s own policy says people cannot be shown “in a pornographic manner” without consent. Still, these images keep circulating for months. Experts have warned that platforms would stop this abuse if they wanted to. The problem is not that there are no rules. The problem is that they are not properly enforced.
Europe is not powerless unless it decides to be. We have the Digital Services Act, the Artificial Intelligence Act, and other laws, such as the recent directive combating violence against women. We know deepfakes are spreading fast. We know online sexual exploitation is rising, and children are especially at risk. Platforms are not just bystanders; they shape the space and profit from it. When illegal content appears, taking immediate action is not optional: it is our duty.
It is already illegal to create or share non-consensual intimate images, including AI-generated sexual deepfakes. The real question is not whether the law exists, but whether we use it.
Too often, action only happens after public outrage. Content is taken down after the harm is done. But removing an image does not erase humiliation, bring back safety, or take away fear. Prevention must come before the damage, not after, when it is already too late.
International Women’s Day (8 March) forces us to face this reality. If women’s rights end at the digital border, then equality is only conditional. If consent can be negotiated online, then dignity can be too. Women’s rights do not vanish just because abusers use new technology.
We have a choice. We can demand that illegal deepfake material is removed right away and that platforms face real consequences, or we can admit that the digital future still treats women’s bodies as collateral damage. Either Europe leads on human dignity, or it quietly accepts a system that feeds on humiliation.
We cannot accept that kind of future.
Staying neutral is no longer an option. When action is delayed, abusers win. When platforms hesitate, harm grows. When we look away, violence becomes normal. This International Women’s Day, Europe must choose courage over comfort. Let’s act now!
Note to editors
The EPP Group is the largest political group in the European Parliament with 187 Members from all EU Member States