What Does a Yellow Money Sign on YouTube Mean? (And How to Fix It)

Written by Thomas Smith

It’s the bane of every YouTuber’s existence.

You complete a great video for your channel or brand—maybe one that took you weeks to develop and hours to shoot and edit.

You upload it to the platform, key in your metadata, and wait for the views—and ad revenue—to flood in.

Right after you hit publish, you see it: the dreaded yellow dollar sign.


A Yellow Dollar Sign Means No Monetization


YouTube’s yellow dollar signs are the evil cousin of the friendly green ones that grace fully monetized videos.

They indicate that Google’s algorithms have flagged your video as potentially inappropriate. As long as your video has a yellow dollar sign instead of a green one, no ads will show on it and it won’t earn any revenue.

Why do the yellow dollar signs exist?

It used to be that any channel with 10,000+ views could enable monetization and earn with ads. Sadly, that led to widespread abuse of the YouTube platform. Creators started uploading content that was at best schlocky and unhelpful and, at worst, downright abusive.

This came to a head in 2018 when a popular channel posted a video showing a dead body in a forest in Japan. Advertisers reasonably wondered why their ads were being shown beside such filth. And they started to leave the platform en masse—an event known among creators as the Adpocalypse.

To win them back, YouTube took drastic measures. It raised the bar for monetization to a much stricter 1,000 subscribers and 4,000 hours of watch time. And it added much more aggressive AI-based screening to weed out inappropriate content. Thus were born the yellow dollar signs, which show that YouTube’s AI has an issue with your video.

Ultimately, it worked. Advertisers flocked back to the site, and CPMs (cost per thousand views) for legit creators went up. My own CPM tripled from around $4 before the Adpocalypse to about $13 today.

Update: As of 2022, it’s now around $16.
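
As a rough illustration of what that shift means in dollars (the view counts here are hypothetical; CPM is what advertisers pay per thousand monetized views, and creators keep only a share of it after YouTube’s cut), the math looks like this:

```python
def estimated_ad_revenue(monetized_views: int, cpm_usd: float) -> float:
    """Rough gross estimate: CPM = dollars per 1,000 monetized views."""
    return monetized_views / 1_000 * cpm_usd

# Illustrative figures only; real earnings depend on how many views
# actually serve ads and on YouTube's revenue split with creators.
print(estimated_ad_revenue(100_000, 4.0))   # pre-Adpocalypse CPM: 400.0
print(estimated_ad_revenue(100_000, 16.0))  # 2022 CPM: 1600.0
```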


How Accurate Is YouTube’s AI?

The challenge, though, is that YouTube’s moderation of new videos relies primarily on AI, and the AI isn’t perfect. Overall, I’d say it’s 95% of the way there.

Depending on what’s been uploaded recently, or on YouTube’s own whims, it can get stricter or more lax, too.

The upshot for creators is that YouTube’s process is unpredictable. You can upload a great video, press publish, and have it flagged by the AI. The video will remain on the platform, but you’ll be in yellow status, and it won’t earn you a dime.

Lots of creators probably stop there. They figure their video must not have met some quality threshold, write it off as a sunk cost, and assume they’ll never see any revenue from it.

Here’s the thing, though. YouTube’s standards for what counts as inappropriate are actually very clearly defined—and they’re very narrow.

Here’s the list:

  • Inappropriate language
  • Violence
  • Adult content
  • Harmful or dangerous acts
  • Hateful content
  • Incendiary and demeaning content
  • Recreational drugs and drug-related content
  • Tobacco-related content
  • Firearms-related content
  • Controversial issues and sensitive events
  • Adult themes in family content

Most of these are fairly self-explanatory. It’s pretty easy to imagine what violence, adult content, hateful content, and the like look like.

Others are looser.

  • Profanity is a gray area. Curse a couple of times in your video, and it might be fine. But curse throughout — or include profanity at the beginning or in your metadata — and you’ll probably be flagged.
  • “Controversial issues” is similarly loose. You can take a stance on an issue, but if your video is highly political and polarizing, don’t expect advertisers to want their products shown beside it.
  • A few others are surprising and important to know about. Recreational drugs and tobacco-related content are two examples. If you’re shooting a how-to video or a cooking video and there’s a box of cigarettes on your workbench or a six-pack in your fridge, your video might get flagged.

False Positives

In many cases, though, your videos might be flagged even though they have nothing to do with any of these topics.

Note that none of these advertiser-friendly guidelines deal directly with video quality. There’s no flag for “boring” or “not terribly well produced.” Your videos should meet minimum quality standards, but poor production values alone generally won’t get you flagged.

So why do random videos get flagged? Let’s look at one in particular: a video I shot about dealing with Amazon’s airbag packaging materials.

I think you’ll agree that there’s nothing hateful or pornographic in there. Yet this video got the dreaded yellow dollar sign treatment and was flagged by YouTube’s AI as inappropriate.

Why? I don’t know for sure, but here’s my guess. The popping of the bags makes a sudden, loud sound. YouTube’s AI may have mistaken it for gunfire and flagged my video for firearms content.

Likewise, my videos about the Dyson V7 Trigger vacuum cleaner get flagged a disproportionate amount. Three of the videos I uploaded for my “nine videos in an hour” experiment got flagged.

Why? The V7 Trigger looks like a gun, and I used the word “trigger” throughout the video. It’s solidly a vacuum cleaner and not a firearm, but YouTube’s AI didn’t necessarily know that.

As you can likely begin to see, the AI overall is pretty jumpy. If it sees anything in your video that concerns it, that’s a yellow dollar sign for you!


How do you fix the yellow dollar sign on YouTube?

That’s the bad news. And here’s the good news.

Initially, there wasn’t much you could do about videos flagged by YouTube’s AI. But after outcry from creators, YouTube stepped up and hired an army of human reviewers to take a closer look at flagged videos.

If your video gets flagged and you feel it was a mistake, there’s now a simple procedure you can follow.

  • Open the video in Creator Studio, and go to Monetization. You’ll see the reason your video was flagged. This is usually vague — something like “inappropriate content.”
  • Click on Request Review. Your video will go off to a human reviewer, who will watch it and overturn the AI’s decision if it made the wrong call. This process usually takes two to three days.
  • If your video was actually fine, monetization will be switched back on, and your yellow dollar sign will become blessedly green.

If your flagged video complies with the advertiser-friendly guidelines I listed above, don’t be shy about pressing the Request Review button. I’ve submitted lots of videos for human review and never had one rejected; every single one has had monetization re-enabled.

Overall, I think this is an example of something YouTube has done really well. And it’s a case study that other platforms can learn from.

YouTube’s aggressive AI probably weeds out 99% of inappropriate videos. If my how-to about a vacuum cleaner can get flagged, I’m sure that genuine firearms-related content doesn’t stand a chance.

Rather than leaving creators at the mercy of this strict process, YouTube has created an easy, clear path to human review. That isn’t always the case elsewhere — on many platforms, if your content gets flagged, there’s no recourse at all. Tons of Instagrammers have woken up to find their accounts disabled for reasons they never learn, for example.

By taking a strict approach with AI, YouTube keeps advertisers happy and CPMs high. But by investing in fast, easy-to-use human reviews, it avoids unfairly penalizing creators who do follow the rules.

So the next time you see a yellow dollar sign on your video, fight back. If you’re complying with the guidelines, request a human review and get yourself back to green!
