
YouTube is changing the way it moderates violent video game content

YouTube is changing how it moderates violent video game content.

Starting today, YouTube will treat video game violence the same way it treats any scripted content. That means fewer restrictions for violence in gaming, with future game uploads being approved rather than age-restricted.

Google will continue to protect viewers from videos of real-world violence, however.

"We know there’s a difference between real-world violence and scripted or simulated violence – such as what you see in movies, TV shows, or video games," said a YouTube representative today in an announcement of the changes, "so we want to make sure we’re enforcing our violent or graphic content policies consistently."

The company says it may still age-restrict content if violent or gory imagery is the sole focus of the video, such as a video devoted entirely to the most graphically violent part of a game. These changes also have no effect on advertiser guidelines, so particularly gratuitous depictions of video game violence may still face limited or no ads.
