Hundreds of Instagram accounts push graphic real-life violence to millions, CBS News finds
Matthew Stevens didn’t want to watch anyone die.
Late one night in February, the 19-year-old was in bed, scrolling on Instagram. Amid his feed’s usual mix of memes and movie clips, the app recommended a video of two men fighting — a brutal, graphic brawl that left him shaken.
He quickly scrolled past the post, but the Instagram algorithm kept showing him more.
“Every time you scrolled, it just kept going further and further with more graphic and violent fights,” Stevens said. Online, users worldwide reported the same experience: On Feb. 26, their feeds were suddenly filled with video after video showing real human suffering and death.
Instagram’s parent company, Meta, apologized after the incident, which affected Instagram Reels, the company’s short-form video feature similar to TikTok. A spokesperson said they’d “fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended.”
But Feb. 26 was only the tip of the iceberg. Violent and shocking content, often referred to as “gore,” remains easily accessible on Instagram today, a CBS News investigation found.
Instagram Reels hosts a thriving gray-market economy driven by graphic violence, people who post violent content on Instagram told CBS News. Banned advertisers — gambling sites, crypto apps, porn agencies — evade Meta’s ad rules by paying graphic-content pages to embed illicit promotions among the gore.
Between February and April 2025, CBS News identified more than 600 Instagram accounts that post real-world violence packaged into short-form meme videos. These horrific videos are viewed by millions of users — some of whom told CBS News they didn’t want to see them.
Even elementary school students were exposed to violence on Instagram, said Nate Webb, a school counselor in Tooele, Utah. On Feb. 26, he said some fourth graders were joking around about the content they were seeing.
“It was shocking to me that such graphic content disturbs me, and I’m a full-grown adult, that didn’t really faze some of those kids,” he said. “It made me wonder, how much more are you seeing and how much more often are you seeing it?”
The violent imagery can “saturate” a person, overwhelming the brain and even leading to PTSD-like symptoms, said Laura Van Dernoot Lipsky, an expert on trauma who founded the Trauma Stewardship Institute.
“There are no words to describe how profound and damaging and powerful that impact is, even if we’re just bringing it down to neuroscience and a brain perspective,” she said.
It can be especially troubling if someone is shown gore content involuntarily, according to Joanne Lloyd, a researcher at the University of Wolverhampton in England.
“The people that seem to be most distressed by it were the people who had not gone looking for it,” Lloyd said.
CBS News interviewed Instagram users around the world and reviewed dozens of social media posts complaining about graphic content on the platform. Each said the Instagram algorithm showed them horrific violence they didn’t want to see — both before and after the incident in February.
Meta declined to make anyone available for an interview for this story, but said in a statement that it invests heavily in safety and security, with 40,000 staff and more than $30 billion dedicated to those issues over the last decade. The company said it restricts the monetization of violent content and adds warning labels, aiming to shield teens and those who don’t want to see graphic posts, while acknowledging that not all disturbing material meets its threshold for removal.
In addition, Meta said it expanded teen protections in an October policy update, now automatically hiding more graphic content such as dead bodies, medical injuries, and dying animals. Teens will also be blocked from following or interacting with accounts that share age-inappropriate material, including those linked to adult platforms such as OnlyFans.