YouTube said Tuesday that the police body camera video from the school shooting Monday in Nashville, Tennessee, would normally violate its policy against graphic violence but that the platform will leave the video online with certain safeguards.
The Google-owned company said the video was in the public interest because it can educate people about what happened during the shooting.
The company also said it was monitoring its platform for videos, livestreams and comments that would glorify the violence in violation of YouTube’s rules.
“Following the tragic attack in Nashville, Tennessee, some footage released by the Nashville Police Department has been age-restricted with a warning interstitial because of its graphic nature and will remain on YouTube as it is in the public interest,” Jack Malon, a YouTube spokesperson, said in a statement.
“Additionally, to ensure people are connected with high-quality information about this unfolding news event, our systems are prominently surfacing videos from authoritative sources in search and recommendations, including by surfacing on our homepage as well as the Top News shelf above related search results,” he said.
Facebook presented the video in a similarly restricted way Tuesday: with a warning that it contained graphic content, requiring two clicks before the video would play. Meta, Facebook’s parent company, did not immediately respond to a request for comment.
The Metropolitan Nashville Police Department posted about six minutes of footage on the department’s YouTube and Facebook pages Tuesday morning, combining views from two officers’ body cameras. The YouTube video had more than 1 million views as of Tuesday afternoon.
The video shows the moment that Officers Rex Engelbert and Michael Collazo confronted and killed the shooter who killed six people, including three 9-year-olds, at The Covenant School. Part of the shooter’s body is blurred in the moments depicting and following the shooting.
YouTube policy bans content depicting “road accidents, natural disasters, war aftermath, terrorist attack aftermath, street fights, physical attacks, immolation, torture, corpses, protests or riots, robberies, medical procedures, or other such scenarios with the intent to shock or disgust viewers.” The ban also covers “footage of crimes” when it has no educational value for viewers.
Meta has a nearly identical policy, banning particularly graphic content but allowing it with some limitations to help people condemn violence or raise awareness.
YouTube put several hurdles in the way of viewing the footage: Users must say that they’re at least 18 years old, and they must click through an interstitial message noting that the content had been identified as inappropriate for some audiences.
YouTube has used similar restrictions in the past, including in January when Memphis police released body camera footage of the assault on Tyre Nichols.
Videos of mass shootings have been a tricky problem for tech platforms such as YouTube, Facebook and Twitter.
Most platforms have banned the reposting of videos created by shooters themselves, which are produced to encourage and glorify violence.
Those include videos from shootings in Buffalo, New York, and Christchurch, New Zealand, both of which were livestreamed by the shooters themselves.
Tech companies even set up an industry group, the Global Internet Forum to Counter Terrorism, to help coordinate their defenses against extremists.
Body camera footage and security footage of notable incidents, though, has been treated as potentially more valuable by platforms, despite often depicting violence, killings, police brutality and other sensitive material.
In 2020, for example, at the height of Black Lives Matter protests nationwide, YouTube left up videos showing the murder of George Floyd in Minneapolis.