An alarming report shows that Instagram is recommending sexual videos to teen users aged 13 and over

Instagram Reels reportedly began recommending sexual videos to test accounts.

Instagram’s algorithm regularly serves sexually charged videos featuring scantily clad sexual content creators to teen users ages 13 and older, according to the alarming results of a seven-month analysis released Thursday.

The Wall Street Journal and a Northeastern University researcher examined the Mark Zuckerberg-led app’s filters by creating accounts posing as fictional 13-year-olds and scrolling through Instagram’s Reels video feed – which reportedly almost immediately started presenting suggestive content.

The offensive videos initially featured women dancing suggestively or showing off their breasts, the report said.

Instagram Reels reportedly began recommending sexual videos to test accounts posing as children. Ink Drops – stock.adobe.com

The suggestive videos showed women dancing provocatively or showing off their breasts, the report said.

The more the accounts watched these videos while skipping over others, the more graphic the content became – videos featuring online sex workers promising to send nude pictures to viewers appeared in “less than 20 minutes,” according to the Journal.

In a series of tests conducted in June, the Journal said Instagram began showing “video after video about anal sex” to a fictional 13-year-old who had previously watched videos about women on the Reels feed.

In other cases, the recommendation algorithm returned videos of women caressing their bodies, imitating sexual acts or even showing their genitals to the camera, the Journal said.

According to the report, the offensive videos occasionally appeared alongside advertisements for major corporate brands.

Meta dismissed the report’s findings, with spokesman Andy Stone claiming it was “an artificial experiment that does not reflect the reality of how teenagers use Instagram.”

“As part of our long-standing work on youth issues, we have set out to further reduce the amount of sensitive content teens may see on Instagram, and have significantly reduced these numbers in recent months,” Stone added.

The Post has reached out to Meta for comment.

The offensive videos occasionally appeared alongside ads for major corporate brands.

Meta is facing mounting legal problems over its alleged failure to protect young users. Davide Angelini – stock.adobe.com

According to the report, the Journal’s analysis was conducted over a seven-month period from January to June. Laura Edelson, a computer science professor at Northeastern University, also replicated the tests.

The test accounts did not follow any other accounts or like any posts. To gauge how quickly Instagram would serve up inappropriate recommendations, testers scrolled through Reels and watched the sexually charged videos while skipping others.

The recommendation algorithm returned videos of women caressing their bodies, imitating sexual acts or even showing their genitals to the camera, the Journal said.

The Journal said it conducted similar tests on Snapchat and TikTok, neither of which recommended sexually graphic content to the test accounts under similar conditions.

Meanwhile, current and former Meta employees told the outlet that internal tests had already uncovered problems with inappropriate content being served to underage users as far back as 2021.

In a 2022 internal report, Meta reportedly found that teen users viewed three times as many posts containing nudity as adults.

Meta has said it wants to provide teens with “age-appropriate experiences” on its apps. Nattakorn – stock.adobe.com

Meta has repeatedly said it takes steps to ensure teen users have “age-appropriate experiences” with its apps.

The Journal uncovered the inappropriate content even though Meta had introduced stricter content controls in January meant to prevent teen users from being exposed to it.

Under those restrictions, users under the age of 16 are blocked from seeing sexually explicit content in their feeds.

The damning report represents yet another problem for Meta, which is currently facing a sweeping federal lawsuit from dozens of states alleging the company’s apps fueled a mental health crisis among youth.

The Journal found that content from online sex workers appeared in test account feeds in less than 20 minutes. Jacob Lund – stock.adobe.com

Meta is also being sued by the state of New Mexico in a separate lawsuit alleging the company failed to protect underage users from sexual predators active on its apps.

As The Post reported, a filing in that lawsuit revealed that executives at Walmart and Tinder parent company Match Group confronted Meta after learning that their ads were running next to content that sexualized underage users.

In January, Meta CEO Mark Zuckerberg delivered a stunning apology to the families of online child abuse victims at a high-profile hearing on Capitol Hill.

“No one should have to go through what your families have suffered,” Zuckerberg said at the time. “And that’s why we are investing so much, and will continue making industry-wide efforts, to ensure that no one has to endure the suffering that your families have endured.”
