Art in the age of brand safety: creators' struggle for authenticity

A surreal image representing self-censorship.

The other day, a colleague told me about a YouTuber called WilliamSRD who recently informed his audience that he was in a predicament. He’d uploaded a video about a VR game called Wraith: The Oblivion - Afterlife, and while it was well received by his viewers, YouTube unexpectedly slapped it with the ‘age restricted’ label. The rationale for doing so wasn’t entirely clear. The video wasn’t graphic or offensive, but the game’s setting did have some very dark themes. This put the creator in a bind. If he left the video unaltered, it would likely be doomed to languish in the backwaters of YouTube, since the algorithm doesn’t like age-restricted content. But if he scrubbed anything remotely controversial, he’d arguably be disrespecting the source material. It would be like making a video about Jurassic Park without mentioning any dinosaur attacks.

He was ultimately able to placate YouTube with strategic edits that didn’t gloss over the darkness of the source material, but the episode highlights the difficult path creators have to walk: they’re caught between artistic freedom and pragmatism.

Free speech on the web: the complexities of private platforms

To understand this dilemma, we need to look at how freedom of speech and the Internet intersect. The digital spaces that have become an increasingly important part of our lives are largely run by private entities, and as such, they have considerable leeway to police the content that appears on their platforms. While this is often portrayed as a violation of America’s First Amendment, the First Amendment actually protects the behavior: platforms like YouTube are private actors, and their ability to moderate content is an outgrowth of their own First Amendment rights.

This isn’t a new concept. The Supreme Court has long held that broadcasters and newspapers can exercise editorial discretion when deciding what to publish (e.g., Columbia Broadcasting System v. Democratic National Committee and Miami Herald Publishing Co. v. Tornillo). More recently, the 11th Circuit Court of Appeals explicitly applied this principle to social media companies in NetChoice, LLC, et al. v. Attorney General of Florida, et al., though that case is currently pending before the Supreme Court and may be overturned or limited.

Advertisements are vital to platforms such as YouTube

From a platform’s perspective, content moderation matters because it helps retain advertisers. For companies like YouTube, which don’t put their content behind paywalls, ads are a vital source of revenue. But advertisers are a skittish bunch, and they don’t want to be associated with anything that could damage their brands. As my colleague Jason noted, everybody wants to cancel somebody, and advertisers don’t want to find themselves trending for all the wrong reasons.

Advertiser anxieties: the challenge of content moderation

Of course, in their eagerness to keep advertisers happy, platforms can be overly cautious about where they draw the line. In a follow-up to the post I mentioned at the beginning of this article, WilliamSRD said that YouTube’s scrutiny may have been triggered by mentions of a ghost called ‘The Hanged Man,’ presumably because the name touches on the topic of suicide.

If that’s true, it would be an overzealous application of the rules. The relevant section of Google’s advertiser-friendly content guidelines suggests that only graphic or extensive references to suicide should trigger restrictions.

A system run by robots

Part of the problem here is that YouTube relies on algorithms to enforce these guidelines. That’s understandable given the sheer volume of content uploaded every hour, but it likely explains the rigid way the rules are applied. Content creators can appeal YouTube’s assessments, but the process takes time, and even if the restrictions are ultimately lifted, the damage has already been done. For most videos, the bulk of views arrive within a few days of upload; if a video has been sitting in the penalty box during that window, its earning potential is likely to be severely diminished.

This isn’t just a YouTube problem; creators working in almost any online medium can find themselves in similar situations. Tumblr caused a stir in 2018 when it banned ‘adult content,’ much to the dismay of the many marginalized communities that had come to view it as a safe space in which to explore and express sexuality. Like YouTube, Tumblr used automated tools to enforce its ban, and the results were often baffling. Ironically, even Tumblr’s own content ran afoul of its filter.

Unfortunately, the rules aren’t always applied equitably. When PewDiePie was embroiled in controversy over his use of Nazi imagery and anti-Semitic ‘humor’ in 2017, YouTube was slow to react. They actually defended him at first, arguing that he was known for being provocative. Eventually, they clipped his wings…somewhat: they canceled the second season of his web series Scare PewDiePie, deleted some of his most problematic videos, and barred him from their premium revenue-sharing tier. Yet he still has over 111 million subscribers.

Artistic integrity vs. money 

Bigger creators can probably afford to just carry on. They’ll likely have another video out before too long, and they may have diversified income streams that can insulate them from YouTube’s caprices. But not everyone is fortunate enough to be in that position. Creators who focus on long-form content may only be able to produce a new piece every month or so, which makes it harder to shake off penalties. 

This can leave smaller creators stuck. To avoid demonetization, they often have to bend over backwards to stay in the good graces of the algorithms. They may bleep out charged words like ‘murder,’ ‘suicide,’ or ‘kill,’ or censor them by replacing a letter or two with asterisks (e.g., ‘m*rder’). They may even resort to circumlocutions, like substituting ‘unalived’ for ‘killed.’ It’s not a perfect solution by any means, but it may be the least-bad option.

Frustrating as it is, many creators will have little choice but to take that least-bad option. As long as platforms rely on advertising and are allowed to exercise editorial discretion, they’re going to err on the side of caution when moderating content. That in turn means the creators who depend on them will have to watch their step to avoid being penalized. But if awkward euphemisms or a spangling of asterisks can help creators stay true to their artistic vision, it’s ultimately a win for them in the long run.
