Massachusetts Daily Collegian

A free and responsible press serving the UMass community since 1890

Sexual harassment in the age of AI

Artificial intelligence and immersive virtual reality spaces are enabling predatory actions

In January, the news was saturated with stories about the spread of sexually explicit images depicting Taylor Swift. These images were generated by artificial intelligence (AI). While these stories brought the phenomenon into the public consciousness on a large scale, sexually explicit AI-generated imagery is neither new nor isolated to the rich and famous. In truth, anyone can be the victim of an attack like the ones aimed at Swift, and the issue is becoming increasingly dire.

Egregious cases of AI-generated sexually explicit images have made the news around the world. In late September 2023, news broke that, in a town in southern Spain, more than 20 girls between the ages of 11 and 17 had been victims of AI-powered sexual harassment. On March 12, 2021, a 14-year-old girl from London took her own life after boys from her school edited her face onto photos of adult pornography stars and used the images to bully her.

Even though cases of AI-assisted sexual harassment are widespread, the budding crisis is not well understood. For one, it is difficult to tell whether the anonymity and ease of AI-generated explicit content have created more harassers. A phenomenon known as online disinhibition may contribute to more people using AI to harass others. Online disinhibition is the psychological state that makes individuals feel more relaxed and willing to engage in behaviors online that they otherwise would not. John Suler, a professor of psychology at Rider University, identifies six factors that produce this state; a sense of detachment from one's actions and a feeling of invulnerability from consequences top his list.

Harassers may also believe that their online actions do not inflict as much harm as they would if performed in person. Because the attack is virtual, the harasser never has to see the victim's visceral reaction. Being shielded from the impact of their actions makes it less likely that harassers will feel guilt or take responsibility for the trauma they inflict.

The belief that virtual attacks inflict little or no emotional harm is mistaken, however. According to research on online harassment by Katherine Cross, the cognitive reaction to being harassed in an immersive virtual reality (IVR) setting, or through the use of AI, is no different from the reaction to being sexually harassed in the physical world: harmful behavior directed at someone on an IVR platform triggers the same nervous system and psychological responses as it would if the situation occurred in physical reality.

Online disinhibition leading to sexual harassment is especially prevalent in IVR spaces. Thirty-six percent of men and 49 percent of women have reported experiencing acts of sexual harassment in IVR. In 2021, Nina Jane Patel opened up about her experience of being sexually harassed in a Meta IVR space. Within 60 seconds of logging on, three to four male avatars surrounded Patel and began verbally and sexually harassing her. When she tried to get away from her harassers, they started taunting her.

Patel recounts the incident as a bizarre nightmare that unfolded so quickly she froze before she could turn on the platform's safety barrier. Her account shows that she responded as if the event had occurred in physical reality.

Sexual harassment in IVR can have substantial negative impacts on its victims, as evidenced by Patel's experience and Cross's research. The emotional effects of being sexually harassed in IVR carry into the victim's personal life: relationships, career and self-perception may all suffer. Because the harm extends well beyond the platform, the pain experienced by victims of sexual harassment in IVR should not be dismissed.

It is imperative that society acknowledge the severity of AI- and IVR-enabled sexual harassment and offer support and resources to victims. It is also the responsibility of programmers to do everything in their power to make their applications safe spaces for everyone. Safety features should be turned on by default, not left for users to enable once they are already in an IVR space. Further, AI image generators should not be able to produce sexually explicit images. It would be naive to think these measures would eliminate the problem entirely, but they would help reduce the number of instances of sexual harassment in the virtual world.

Brigid Baleno can be reached at [email protected].
