An explicit video showing two individuals engaged in penetrative sex has gone viral on Instagram and X.
The authors of this story came across the reel while scrolling through Instagram last week. It shows two young Indian men having a drink. Initially, the video appears to be a typical, casually shot clip, not intended to promote any product or service. As it progresses, however, one of the men, whose face is blurred throughout, loses or partially loses consciousness, and the other, whose face is visible, takes the incapacitated man to another room and engages in penetrative sex with him. The video graphically displays their naked bodies and genitalia during the act, throughout which the ‘victim’ is unconscious or barely conscious. The clip includes multiple cuts, suggesting it has been edited to resemble a typical pornographic video.
Here are a few screenshots from the initial part of the video:
According to Section 63 of the Bharatiya Nyaya Sanhita, sexual intercourse is considered rape if consent is obtained when the person is intoxicated or otherwise incapable of understanding the nature and consequences of the act. It should also be noted at the outset that one cannot confirm whether the clip in question is staged or shows an actual commission of rape. Upon inspection, however, we found that the man whose face was visible in the clip was a creator/adult entertainer with an account on the subscription-based content-sharing platform ‘OnlyFans’.
What is of deep concern from the point of view of platform accountability is that the video, which flouts Instagram’s community guidelines on several counts, stayed live for at least six days, amassing nearly 7 million views and 2.4 million shares. Even more worrying, it was uploaded three times by the same user without being flagged by the automated moderation system. While these uploads have been removed, copies of the video uploaded by other users remain live on Instagram.
The community guidelines of Instagram state, “We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram. This includes photos, videos and some digitally-created content that show sexual intercourse, genitals and close-ups of fully nude buttocks.”
We reported the video using Instagram’s in-app reporting mechanism, and it took the platform six days to remove all three uploads.
At the time of writing, the video has been uploaded for the fourth time by the same account and continues to be live. It has also been uploaded by several other users on Instagram; one such upload has 6.7 million views. A new Instagram account whose timeline contained only this video has amassed close to 6,800 followers.
We reported this particular video on July 25. It was no longer available at the time of publishing, though we received no notification from Instagram about its removal. The same account subsequently shared another post directing users to a different account where the video was still live.
Even spoof videos and skits based on the problematic clip are now doing the rounds on social media, and screenshots from the video have made their way into viral memes.
Besides, several accounts with the suffix ‘Zucc’ in their usernames have been created solely to amplify the video. These accounts are tagged in comments on other posts so that more viewers can find the video. The username is a play on words: in meme culture, when a page is disabled or deleted, it is referred to as being ‘zucced’, an allusion to Meta CEO Mark Zuckerberg, implying that the page was taken down by Meta’s moderation mechanism. Accounts with such usernames are often backup pages for content that is likely to be removed by Meta.
In March 2019, Meta (then Facebook) introduced a technology to automatically detect nude images and videos. It said in an announcement, “When someone’s intimate images are shared without their permission it can be devastating. To protect victims, it’s long been our policy to remove non-consensual intimate images (sometimes referred to as revenge porn) when they’re reported to us — and in recent years we’ve used photo-matching technology to keep them from being re-shared… By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it…”
It is worrying that this particular video was not flagged for nudity on the server side despite being uploaded multiple times.
According to Meta’s blog, the company also uses a technology called SimSearchNet++, an image-matching model trained through self-supervised learning to match variations of an image with high precision and improved recall. This technology essentially helps in applying warning labels to duplicates of false claims and reducing their distribution. If Meta uses a similar technology for content moderation, the repeated uploads of this video demonstrate its ineffectiveness.
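For readers unfamiliar with image matching, the sketch below illustrates, in Python, the general principle behind near-duplicate detection using perceptual hashing. This is a deliberately simplified toy example, not Meta’s SimSearchNet++ or any actual Meta system; the library choice, file names and distance threshold are our own illustrative assumptions.

```python
# Toy sketch of near-duplicate detection via perceptual hashing.
# This is NOT Meta's SimSearchNet++; it only illustrates the general
# principle that re-uploads of near-identical frames are easy to match.
from PIL import Image
import imagehash

# Hypothetical store of hashes of frames from previously removed videos.
removed_frame_hashes = [imagehash.phash(Image.open("removed_frame.png"))]

def looks_like_reupload(frame_path: str, max_distance: int = 8) -> bool:
    """Return True if a frame is perceptually close to a removed one.

    max_distance is the allowed Hamming distance between 64-bit pHashes;
    the value 8 is an illustrative assumption, not a tuned production value.
    """
    candidate = imagehash.phash(Image.open(frame_path))
    return any(candidate - known <= max_distance
               for known in removed_frame_hashes)

print(looks_like_reupload("new_upload_frame.png"))
```

Even an approach this crude would catch exact and lightly edited re-uploads of the same frames, which makes it all the more striking that the same video cleared Meta’s automated checks multiple times.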
Meta has also introduced nudity protection in Instagram DMs, which blurs images identified as containing nudity and encourages people to think twice before sending nude pictures. The feature is designed to protect users from unwanted nudity in their DMs and from scammers. The number of shares of the video in question suggests that nudity protection either does not work on this video or is not designed to work on videos at all. We tested the feature by sending the reel to a demo account: it was not blurred, and no prompt about potential nudity appeared. This suggests that the nudity protection mechanism applied to images does not extend to videos.
Alt News has written to Meta about this. The story will be updated once we receive a response. We also reached out to the OnlyFans creator several times, but they declined to speak to us.
Possibility of Exposure to Minors
We came across this video in Instagram’s Reels section, where the platform’s recommendation algorithm shows users videos from accounts they do not follow. More importantly, these algorithms are black boxes that surface content based on factors which are hard to determine. Instagram allows 13-year-olds to register on the platform and has introduced various parental control features in recent years. It is entirely possible that the video has surfaced, or will surface, on a minor’s feed.
It is to be noted that a 2021 study commissioned by the National Commission for Protection of Child Rights found that many of the 3,491 participating school-going children had accounts on major social networking apps/sites, of which Instagram (used by 45.5 per cent) and Facebook (used by 36.8 per cent) were the most popular.
Alt News spoke to a psychologist to understand the effects of exposure to such videos on a minor. Ananya Sinha, director and chief clinical psychologist at TherapHeal, said, “Exposure to such videos, portraying hardcore pornography containing abuse, rape or sexual manipulation or aggression, has a strong influence on adolescents’ sexually permissive attitudes. They tend to normalize sexual harm and aggression. We must remember, pornography is not only watched out of curiosity or pleasure, but sometimes acts as a source of information. If a video like this pops up on an adolescent’s timeline, they will start thinking that this is how sexual interaction is supposed to be”.
“On the other hand, when one unexpectedly gets exposed to such videos, they may leave a significant psychologically distressing impact on the viewer’s mind. Viewing such sexual violence could be traumatising and the associated fear and anxiety may stay with one for years. This is one reason why social media platforms have community guidelines. The existence of the video on the platform shows the abject failure of these,” she added.
We also spoke to an advocate practising in Delhi. She said, “Instagram not taking the video down for days is illegal and they should have faster turn-around time. They have stricter community guidelines than most other platforms. The reel depicts a non-consensual scene and should have been taken down immediately. Publishing or transmitting obscene material/ sexually explicit act in electronic form is punishable under section 67 of Information Technology Act, 2000.”
Meta Has Deprioritized Content Moderation
In the past, Meta has come under criticism for arbitrarily deleting or disabling entire accounts of erotic art creators and pole dancers. It also disabled the account of a creator for featuring photos of them breastfeeding their child. It is also pertinent to note that Meta, the parent company of Facebook and Instagram, has deprioritized content moderation over the past year. It laid off nearly 200 content moderators in 2023, and more than 100 positions related to trust, integrity and responsibility were reportedly abolished the same year. The moves came at a time when nearly 50 elections, affecting half the planet’s population, were scheduled for 2024.
Several media outlets have highlighted these cutbacks while reporting on issues such as ‘shrimp Jesus’ AI art, stolen AI-generated images on Facebook, Instagram ads selling drugs, stolen credit cards, hacked accounts, counterfeit money and weapons, and videos of minors engaged in sexual activity. Alt News, too, has written extensively on the failure of Meta’s content moderation, particularly in cases involving hate speech and depictions of violence.
Earlier this year, a police complaint filed in West Bengal named Instagram as a co-accused in an offence under Section 12 of the Protection of Children from Sexual Offences (POCSO) Act and Section 67B of the Information Technology (IT) Act, 2000. Responding to media queries, a Meta spokesperson said at the time that the company took action on “content that violates our Community Guidelines or when it violates local law.”