
YouTube’s advertisement guidelines restrict content creators

By Katherin Abando | 2017-12-14


YouTube has long been known as a platform that allows creative content to be shared with the rest of the world, but its recent restrictions and flagging of certain videos and creators have raised tensions within online communities.


By deeming certain videos unfit for advertising under its guidelines, YouTube has been removing ads from creators trying to share their commentary on public issues, effectively leaving them without the revenue to continue creating.


These videos aren’t being reviewed properly, and as a result creators risk losing access to YouTube’s massive reach.


In what creators have dubbed the “adpocalypse,” YouTube removes ads from videos that violate the ad-friendly policies the company blankets across all content.


Content on YouTube reaches a broad audience and spans politics, gaming, entertainment and public issues. Removing ads from certain content holds creators back from doing their work and sharing their voices.


YouTube’s policies classify sexually suggestive content, displays of violence involving serious injury or extremism, inappropriate language, promotion of drugs and other substances, and controversial or sensitive events (political conflicts, natural disasters and tragedies) as not ad-friendly for content creators.


After the horrific shooting in Las Vegas, YouTuber Casey Neistat posted a video voicing his opinion on the tragedy and asked viewers to donate to victims of the shooting. Furthermore, Neistat mentioned that the ad revenue made from the video would also be donated.


Moments after the video was uploaded, Neistat was notified that it would not run with advertisements because the content violated YouTube’s guidelines.


YouTube responded to Neistat’s complaint on Twitter, saying, “We love what you’re doing to help, but no matter the intent, our policy is to not run ads on videos about tragedies.”


Neistat is one of the platform’s most popular content creators, but YouTube stood by its policy and demonetized his video even though it focused on charitable contributions to the victims rather than the tragedy itself. The guidelines are too broad to actually distinguish between a controversial topic and a call for charitable donations.


Demonetization might not be a huge concern for big YouTubers like Neistat, but it is for small channels and up-and-coming creators who want to earn money from their content while openly voicing their opinions.


Gaming YouTubers, in particular, are deeply affected by ad removals due to the explicit language and animated violence in shooting games like “Call of Duty” and “Battlefield,” which get featured in their videos.


According to Forbes, Michael “Mtashed” Tash, who has 160,000 subscribers, had 140 videos become ineligible for ads, leading him to consider moving to another platform, such as Twitch or Patreon, that would allow his commentary while playing “Counter-Strike.”


The labor put into filming and editing a video might not be worth it in the end, because the finished product could be flagged and stripped of ads.


For instance, some of Mtashed’s videos have reached 400,000 views. A video with approximately 47,000 views can earn him an estimated $188 at most, according to the social media statistics site Social Blade.
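To put those figures in perspective, here is a rough back-of-the-envelope calculation implied by the Social Blade estimate above. It is a sketch only: the $188 figure is an upper bound, and real YouTube payouts vary with ad formats, audience and how many of a channel’s videos keep their ads.

```python
# Back-of-the-envelope RPM (revenue per 1,000 views) implied by the
# Social Blade upper-bound estimate cited above. Illustrative only:
# real payouts vary by ad format, audience and demonetization status.
views = 47_000
max_earnings_usd = 188

rpm = max_earnings_usd / (views / 1_000)
print(f"Implied RPM: ${rpm:.2f} per 1,000 views")  # $4.00

# At that rate, a 400,000-view video that keeps its ads would gross
# roughly $1,600 at most; a demonetized one earns nothing.
print(f"400,000-view ceiling: ${rpm * 400:.0f}")   # $1600
```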


YouTube is supposed to welcome independent creators outside of mainstream channels to express their creativity and commentary, but these policies are making users question whether a career online is financially viable.


Another concern among YouTubers is that it’s not a human determining whether the videos are appropriate; it’s computer algorithms.


The algorithms seem to process violence in video games the same way they would process the real thing. Because violence comes in many forms, it ends up being defined broadly, and the guidelines become subjective since everyone has their own perspective on what counts as violence.


The algorithms may be consuming vast amounts of information, but they can’t stop for a second to recognize that the violence of war is nothing like what gamers are playing out on their screens.
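To make that suspicion concrete, here is a hypothetical sketch of the kind of naive keyword matching creators describe. YouTube has not published how its classifier works, so the function, term list and examples below are assumptions for illustration, not the platform’s actual system.

```python
# Hypothetical sketch of a naive keyword-based ad-suitability check.
# YouTube's real classifier is not public; this only illustrates how
# matching on words alone conflates game violence with the real thing.
VIOLENCE_TERMS = {"shooting", "war", "kill", "gun", "battlefield"}

def is_ad_friendly(title: str, description: str) -> bool:
    words = set((title + " " + description).lower().split())
    # No context: "shooting" in a "Call of Duty" gameplay video and
    # "shooting" in war footage look identical to a bag-of-words match.
    return not (words & VIOLENCE_TERMS)

# Both get demonetized, even though one is video game commentary.
print(is_ad_friendly("Call of Duty sniper gameplay",
                     "best shooting moments"))         # False
print(is_ad_friendly("War correspondent report",
                     "footage from the front lines"))  # False
```

A classifier that weighed context (channel category, video transcript, whether the “violence” is animated) rather than bare keywords would separate the two cases, which is essentially what critics are asking YouTube to do.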


The same treatment may also apply to health topics, as the algorithm might fail to register what is appropriate and what isn’t. Dr. Aaron Carroll’s channel, which has discussed opioid abuse, treatment for diabetes and the cost of prescriptions, had advertisements removed from 27 videos because the algorithm interpreted Carroll as promoting drug use.


Carroll was objectively stating facts, statistics and ways to seek treatment for people dealing with opioid abuse.


YouTube has the right to make sure its platform’s content adheres to guidelines, but its overbearing policies are unnecessarily hurting its users.


The “adpocalypse” is becoming a war between human creators and the algorithm policing their every word.


By changing its policies and the way it detects inappropriate content, YouTube can keep its creators from switching to less restrictive platforms and instead let them speak their minds freely while still making a profit.