AI at the Service of Abuse: Shame Belongs to the Perpetrator

Artificial intelligence is driving a new wave of violence against women: all over the world, and increasingly in Southeast Europe, women are being targeted with sexualized deepfakes and AI-generated images. This abuse remains largely hidden due to stigma, low awareness, and weak institutional response.

By: Maida Salkanović

Across the world, individuals, companies, and governments are racing to harness artificial intelligence to cut costs, boost efficiency, and drive innovation. Yet the same tools are also being weaponized: powering scams, accelerating the spread of disinformation, and creating ever more convincing fakes. One of the most pervasive misuses of AI is the creation of sexualized deepfake videos and AI-generated images of women.

Research by the deepfake-tracking company Sensity AI shows a consistent pattern: since 2018, between 90 and 95 percent of all deepfakes have been non-consensual pornography, and around 90 percent feature women. This digitized form of violence is not confined to private harassment; it is increasingly deployed against public figures and used to discredit female politicians.

Altered images of Italian female politicians, including Prime Minister Giorgia Meloni, surfaced on a porn site in August 2025. In the United States, a study identified more than 35,000 deepfake images of members of Congress, with women found to be 70 times more likely to be targeted than their male colleagues. While Southeast Europe has not yet seen such high-profile cases, the incidents that have emerged suggest the problem is already present, just less visible.

In several countries across the region, cases of sexualized AI-modified imagery have already come to light through the work of journalists and activists. In Bosnia and Herzegovina, police uncovered AI-generated child abuse material, leading to the arrest of the perpetrator. The perpetrator was part of a worldwide network that subscribed to AI-generated images produced by a Danish citizen. According to a BIRN report, other cases in Bosnia have involved adult women, and incidents of AI-generated sexual imagery have been reported to the police and to a helpline in Banja Luka. In Serbia, journalists uncovered large Telegram groups with tens of thousands of men exchanging nude, digitally altered photographs of women. Meanwhile, in Kosovo, a young woman’s image was manipulated into sexually explicit content and circulated on TikTok, where she was labeled as a “whore.” This resulted in multiple messages from different users asking her for more sexual content. These examples demonstrate that although AI-driven image abuse has not yet entered mainstream public debate, it is already happening in private spaces, and we only hear about it when it is exposed by the police, journalists, or activists.

SEE Check spoke with six organizations dedicated to women’s rights, including in the digital sphere, across Bosnia-Herzegovina, Serbia, Croatia, and Montenegro, to learn about their experiences and whether such cases appear in their work. Only one, Serbia’s Autonomous Women’s Center (AŽC), reported that women had occasionally approached them about AI-generated sexualized content. Even then, such cases remain rare compared to the far more common problem of intimate images shared without consent.

“We have not often had cases of women turning to us because of artificially generated content. Far more frequently they reach out over real intimate material shared without their consent. Still, such cases are on the rise,” said Sanja Pavlović of the Autonomous Women’s Center.

In Croatia, Dajana Pul Bošnjaković from the organization BABE noted that their center has dealt with many forms of abuse, with perpetrators ranging from employers to spouses and cases involving surveillance, hacking, and non-consensual image sharing—yet, so far, no reports of AI-generated content.

In Bosnia, Denija Hidić from the Sarajevo-based CURE Foundation explained that her team had expected such cases to emerge during the last local elections, given that political contexts often provide fertile ground for this type of abuse. “But that did not happen,” she said. The Zagreb-based Ženska soba and Sarajevo Open Center (SOC) also confirmed they have not yet encountered cases involving AI-generated images.

Ana Jaredić from Montenegro’s Women’s Rights Center recalled a case that could potentially involve AI. A woman from a smaller town sought help after a neighbor threatened to publish sexualized images of her. Because she had never been in any kind of relationship with him and did not believe he could have obtained such material, she suspected the images might be AI-generated.

“It was a small community, and in the end she decided to resolve it quietly, through an informal conversation,” Jaredić explained. “She did not want to expose herself by going to the police, because she lacked confidence that the case would remain confidential, and she feared that people would actually believe the images were real.”

Silenced by Stigma

Experts interviewed by SEE Check emphasize that the reasons women rarely report such cases are rooted primarily in cultural and social dynamics.

“We live in an unjust society and an unjust system, where victims are often condemned instead of perpetrators. There is still a widespread prejudice that being a victim is something shameful,” said Bosnian sociologist Vladimir Vasić.

He explained that when women are subjected to any form of violence, they remain deeply concerned about how their surroundings will react. “Perpetrators exploit precisely that vulnerability,” he said. “They count on the fear of stigma when committing crimes that can have devastating personal consequences.”

Sanja Pavlović of the Autonomous Women’s Center noted that the dynamics of this type of violence often leave women feeling responsible for the abuse they endure. “As in many cases of violence they experience, women feel a kind of guilt simply for experiencing it. The abuser convinces her that she somehow contributed to the violence,” she said. “Many women hope that if they do not act, the perpetrator might not share the material, or that they can negotiate with him privately.”

Another obstacle, experts warn, is the lack of awareness and knowledge about AI itself. “Bosnian society is still not AI literate, and that is directly tied to the generally low level of digital literacy,” said Denija Hidić. “People may not even recognize that certain content was generated with artificial intelligence, which means cases go unreported or never reach the public.” She added that in Bosnia, it is likely no one has yet had sufficient interest, or intent, to deliberately weaponize such material, but stressed that the risk is already present.

“Still, we are convinced that such material already exists ‘in the cloud,’ stored and waiting to be used at the moment it best serves someone’s interest,” Hidić warned.

That concern is echoed in Montenegro, where lawyer and human rights advocate Anja Vučinić highlighted the broader lack of awareness. “Our population is poorly informed and largely uneducated about what AI actually is, what artificial intelligence means, what generative AI means. This educational barrier creates a significant problem when it comes to reporting such cases,” she said.

AI and the Law: Lagging Behind

Even though not all Southeast European countries explicitly mention AI in their criminal codes, existing legal frameworks still provide avenues for protection. According to Vučinić, skilled judges and lawyers can rely on international conventions that states have ratified. “It is not that you lack protection just because the law does not contain a specific provision,” she explained. “You do have it, you just need a good lawyer who understands that the constitution allows you to apply international law locally.”

Some countries in the region have already taken steps to update their legislation. When AI technologies first began to emerge, Amina Dizdar from the Sarajevo Open Center (SOC) warned colleagues about their potential harm to women. “If we already have videos of cats in space, it’s only a matter of time before someone applies the same technology to sexual abuse,” she said at the time.

That prediction has since become reality. In response, SOC submitted a proposal for an AI-specific article to be added to the new amendments to the criminal code of the Federation of Bosnia and Herzegovina, the country’s Bosniak- and Croat-majority entity. Their submission was among several from civil society organizations, and the AI clause was ultimately adopted in July 2025. The Serb-majority entity of Republika Srpska had already introduced a clause on AI in August 2023.

Bosnia followed the example of Croatia, which had already incorporated AI into its criminal code. As an EU member, Croatia is also bound by legislation such as the AI Act, although that framework is focused more on regulating services than protecting human rights. Croatia and Montenegro have additionally signed the Council of Europe’s Framework Convention on Artificial Intelligence, which places greater emphasis on the human rights implications of AI.

The very nature of the online environment also creates obstacles. Rebecca White of Amnesty International noted that this digital violence differs from other forms precisely because of the ease with which it crosses borders and persists over time. “You don’t even have to be in the same country,” she explained. “Harmful content like images, videos and online smear campaigns can be shared easily and remain online long after the initial ‘post’ – and potentially forever.”

This lack of boundaries is not abstract. In Montenegro, Ana Jaredić recalled a case where the perpetrator was based abroad, leaving local authorities unable to intervene. The result, she said, was a victim left exposed, with no clear path to justice.

Prevention, Awareness, Accountability

Despite the existence of legal protections, another major barrier lies in how institutions respond. The level of awareness and sensitivity among police, prosecutors, and judges remains uneven. “Another problem that could arise is the unprofessional, and ultimately inhumane, attitude of those to whom women report abuse,” said sociologist Vladimir Vasić. “How will the police, the prosecutor’s office, or any institution respond? Will the victim be met with understanding, or will she face condemnation instead? That, truly, is a disgrace.”

Lawyer Anja Vučinić pointed out that while there are many training programs for judges, prosecutors, and police on this type of violence, the system is still far from sensitized. “Secondary victimization happens frequently, and that discourages other potential victims from coming forward,” she said.

Amina Dizdar of the Sarajevo Open Center stressed that much more can be done on the prevention side. This includes training for police officers and prosecutors, investing in technology that would allow institutions to keep pace with new digital threats, and running awareness campaigns aimed at removing stigma from victims and encouraging them to report crimes without fear of prejudice or judgment. “Without that, we cannot expect to see a greater number of reports,” she said, adding that responsibility also lies with the companies managing the platforms used to distribute such material.

Rebecca White emphasized that tackling this sort of violence requires both resources and long-term vision, but that governments can begin investing immediately if they are serious about addressing the problem. Efforts should go beyond legislation and policy, she argued, to include educational programs that raise awareness and challenge discriminatory attitudes. “It’s not so much about how prepared they are, it’s about whether the will is there to do something about it,” she told SEE Check.

Sociologist Vladimir Vasić echoed this point, noting that societies must evolve alongside technological change. “As technology advances, so must our response to the realities we live in,” he said. “Shame should fall on the perpetrator of the crime, not on the person who suffers it.”

