Recently, YouTube users discovered some sinister pieces of content, once again exposing the risks of user-generated video and, more broadly, reminding us that wherever users can post content online, the totality of the human condition, the good along with the evil, will always be reflected.
As reported by The Washington Post, a parent watching a cartoon with her son on YouTube Kids came across a disturbing clip spliced into an otherwise innocuous video about five minutes in. A man walked onto the screen and simulated cutting his own wrist, saying, "Remember, kids, sideways for attention, longways for results," before walking off screen. The video then immediately cut back to the cartoon.
YouTube removed the video after it was reported, but since then users have discovered other cartoons with similar clips spliced in, including explicit instructions to children on how to commit suicide, among other disturbing content.
That same month, Matt Watson, a video blogger, posted a video detailing how pedophiles were identifying and sharing timestamps for certain YouTube videos in which children participate in activities such as playing Twister or doing gymnastics.
He also showed that, once users click on one such video, YouTube's algorithm recommends similar ones. According to Wired, the algorithms do not merely recommend other videos of children playing; they specifically suggest videos popular with other pedophiles.
Many of those videos were also monetized with advertisements. Companies including Disney and Nestlé quickly pulled the plug on their advertising spend with YouTube.
It is not that YouTube wants this terrible content on its platform, but that platform has made it easier than ever for both the most admirable and the most despicable among us to express themselves publicly, with little restriction.
As Ben Thompson argued a year ago, those who wish to fix the Internet's downsides cannot do so by focusing on the harms without accepting the upsides, because that endangers massive future opportunities: "We have to find a middle path, and neither side can do this without acknowledging and internalizing the other's inevitable truth."
The amount of policing YouTube needs to do is extraordinary. It has built a platform that has grown into a behemoth, without effective moderation strategies growing along with it. And the cost is not just lost advertiser dollars or embarrassment for companies when they are associated with these videos: it is deeply problematic and horrifying content reaching children and innocent users.
A reactive moderation approach, based on flagged content and reports from users, is not enough.
We have been contacted in the past by prospective customers who want us to watch only the first few minutes of a video submission, or to check the content every 20-30 seconds and then assume that the rest is safe, but the dollars saved are simply not worth the risk (or the potential reputational damage). For live streaming, the riskiest type of UGC, we begin monitoring video within 5 seconds of the content going live, and our AI solutions check one frame per second. It is hard to get right, and it takes a lot of training. This is a hard truth, but there is no effective middle ground.
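The one-frame-per-second sampling described above can be sketched roughly as follows. This is an illustrative sketch only, not our actual pipeline: the `Frame` type and the `classify` scoring function are hypothetical placeholders standing in for a real video decoder and a trained model.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class Frame:
    timestamp_s: float  # seconds since the stream went live
    pixels: bytes       # raw frame data (placeholder)


def moderate_stream(frames: Iterable[Frame],
                    classify: Callable[[Frame], float],
                    threshold: float = 0.8,
                    sample_interval_s: float = 1.0) -> List[float]:
    """Sample roughly one frame per second from a live stream and
    return the timestamps whose unsafe-content score from the
    classifier meets or exceeds the threshold."""
    flagged: List[float] = []
    next_sample = 0.0
    for frame in frames:
        if frame.timestamp_s < next_sample:
            continue  # skip frames between sampling points
        next_sample = frame.timestamp_s + sample_interval_s
        if classify(frame) >= threshold:
            flagged.append(frame.timestamp_s)
    return flagged


# Example: a 10-second stream at 30 fps, with a dummy classifier
# that scores everything after the 5-second mark as unsafe.
frames = [Frame(i / 30, b"") for i in range(300)]
dummy = lambda f: 1.0 if f.timestamp_s >= 5.0 else 0.0
print(moderate_stream(frames, dummy))  # → [5.0, 6.0, 7.0, 8.0, 9.0]
```

Sampling by timestamp rather than frame count keeps the check rate stable even when the incoming stream's frame rate varies.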
Takeaway: Allowing users to post videos on your platform involves risk. It exposes innocent users, brands, consumers and advertisers to harmful material. That risk needs to be managed as well as possible with UGC content moderation, using a combination of AI and human review.
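One common way to combine AI and human review is score-based triage: the model's confident detections are removed automatically, ambiguous cases are escalated to human moderators, and the rest pass through. The sketch below is a minimal illustration of that idea under assumed threshold values, not a description of any specific vendor's system.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative thresholds; real values would be tuned per platform
# against the model's precision/recall on held-out data.
AUTO_REMOVE = 0.95   # AI is confident the content is unsafe
NEEDS_REVIEW = 0.40  # ambiguous scores go to human moderators


@dataclass
class ModerationQueues:
    removed: List[str] = field(default_factory=list)
    human_review: List[str] = field(default_factory=list)
    approved: List[str] = field(default_factory=list)


def triage(video_id: str, ai_score: float, queues: ModerationQueues) -> str:
    """Route a video by its AI unsafe-content score and record the
    decision in the appropriate queue."""
    if ai_score >= AUTO_REMOVE:
        queues.removed.append(video_id)
        return "removed"
    if ai_score >= NEEDS_REVIEW:
        queues.human_review.append(video_id)
        return "human_review"
    queues.approved.append(video_id)
    return "approved"


queues = ModerationQueues()
print(triage("vid-001", 0.99, queues))  # → removed
print(triage("vid-002", 0.55, queues))  # → human_review
print(triage("vid-003", 0.05, queues))  # → approved
```

The middle band is where human judgment earns its cost: it catches the borderline content that automated classifiers handle worst.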
Over more than 10 years in content moderation, we have learned how to help customers identify the risks that come with allowing user-generated video content, and how to find a moderation plan appropriate to their needs. Think of us as your advisors, and reach out before the volume of content gets too high and spirals out of control (or before Apple pulls your app from the App Store for not having a moderation plan). Because once that genie is out of the bottle, there is no putting it back.