What is the “Momo Challenge”?
To fully understand what’s going on with this, you first have to understand a few other things, namely:
- How the YouTube Kids app works;
- How YouTube makes money;
- How algorithms recommend content in “Recommended Videos”;
- Customizable thumbnails; and,
- The financial incentives for making trash videos.
After we look into how these things work, we can look into how nefarious and malicious videos could get recommended to children. From that point, we will turn to the specific “Momo Challenge” issue.
The YouTube Kids App
YouTube, knowing that many of the top-viewed videos on its website are targeted at children, created an application in 2015 with a simplified user interface. It is focused on young users and primarily showcases child-friendly content. Many of the videos promoted on YT Kids are things like toy unboxing videos, fan-made videos about popular characters from children's shows, or clips from children's shows. (https://www.youtube.com/yt/kids/)
Given that YouTube Kids is used by children (and often toddlers), it is usually the most attractive thumbnails in the recommended videos section that get clicked on.
How YouTube Makes Money
Ads. YouTube serves ads on its website and its videos. It takes a share of the revenue from each ad and gives the rest to the producer. There is little clear information on exactly how much YouTube takes, but estimates put it at 40-60%. (https://support.google.com/youtube/answer/72902?hl=en).
By making money from advertising revenue, YouTube is incentivized to show users more and more videos. Each video has the potential to make YouTube some money, so it is in YouTube's interest to maximize the number of videos that someone watches, which in turn maximizes the number of ads they see.
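To make the split concrete, here is a minimal sketch of the arithmetic. The 45% platform cut is a hypothetical figure picked from the middle of the estimated 40-60% range above; YouTube's actual share is not public.

```python
# Rough sketch of an ad-revenue split. The 45% platform cut is an
# assumption for illustration only; YouTube's real share is not public.
def split_ad_revenue(ad_revenue, platform_cut=0.45):
    """Return (platform_share, creator_share) for a given ad revenue."""
    platform_share = ad_revenue * platform_cut
    creator_share = ad_revenue - platform_share
    return platform_share, creator_share

# For every $100 of ad revenue under this assumption, the platform
# keeps $45 and the creator receives $55.
platform, creator = split_ad_revenue(100)
```

Whatever the exact percentage, the structure is the same: both parties earn more as total views (and therefore ad impressions) grow, so both are rewarded for maximizing watch time.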
How Algorithms Recommend Content in “Recommended Videos”
When you publish a YouTube video, you get to specify the title, tags, category, and description. These are generally referred to as the video’s metadata. This metadata is one part of what YouTube probably uses to recommend videos.
Videos that have a lot of views are also more likely to be recommended.
In addition to metadata, another huge area of data to recommend videos based upon would be:
- The types of videos that the user themselves clicks on; and
- The types of videos that other users watched after they watched that video.
How YouTube’s algorithm works is a closely guarded secret. One can only assume that these bits of data are major factors in which YouTube videos are displayed in the “recommended videos” section.
For the sake of illustration, let’s suppose you want to create a YouTube video that gets a lot of views. If your target audience is children who are into the movie Frozen, you can set the metadata to something related to this. Your video will then be more likely to be recommended.
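The mechanics described above can be sketched as a toy scoring function. Everything here, from the weights to the field names to the formula itself, is an illustrative assumption; as noted, YouTube's real algorithm is a closely guarded secret. The point is only to show how metadata matching, view counts, and co-watch behavior could combine to push a keyword-stuffed video to the top.

```python
import math

# Toy ranking sketch combining the three signals discussed above.
# All weights and field names are assumptions for illustration only.
def recommendation_score(video, user_interests):
    # Metadata match: how many of the user's interest keywords appear
    # in the video's title or tags.
    keywords = set(video["tags"]) | set(video["title"].lower().split())
    metadata_match = len(user_interests & keywords)

    # Popularity: log-damped view count, so one viral video does not
    # drown out every other signal.
    popularity = math.log10(1 + video["views"])

    # Co-watch signal: fraction of viewers of the user's last video
    # who went on to watch this one.
    co_watch = video["co_watch_rate"]

    return 2.0 * metadata_match + popularity + 5.0 * co_watch

videos = [
    {"title": "frozen elsa songs", "tags": ["frozen", "kids"],
     "views": 1_000_000, "co_watch_rate": 0.30},
    {"title": "science facts", "tags": ["education"],
     "views": 5_000_000, "co_watch_rate": 0.02},
]

# For a child whose watch history signals "frozen" and "elsa", the
# keyword-matched video outranks the more-viewed one.
ranked = sorted(videos,
                key=lambda v: recommendation_score(v, {"frozen", "elsa"}),
                reverse=True)
```

Under this sketch, a video that merely copies the right title and tags inherits most of the metadata score of the video it imitates, which is exactly the loophole the next sections describe.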
When you upload a YouTube video, by default the thumbnail will be a screenshot from the video you have uploaded. This can be customized and changed. While this is regulated in some ways by YouTube, you could specify a customized thumbnail that has little to do with the actual content of the video.
The Financial Incentives for Making Trash Videos
Given what I’ve set out above, you can see how to create YouTube videos that will be picked up by the algorithm and how these videos can be profitable.
It is therefore easy to see why individuals churn out videos that are only slightly different from ones that are already being suggested.
Each view on these videos contributes to the video creator’s earnings.
Nefarious and Malicious Videos
Some people are just awful and want to troll.
They will create channels masquerading as children's content, but then edit in threatening or disturbing clips. These include clips related to the "Momo Challenge" and other material that is not appropriate for children, ranging from sexualized to violent content.
These videos are not hacking YouTube. They are gaming and taking advantage of the way that YouTube works.
The comment sections to children’s videos, and sometimes the videos themselves, have included phone numbers and contact details for someone claiming to be “Momo”.
The actual Momo image is a sculpture created by a Japanese artist.
When these numbers are messaged, usually through WhatsApp, they reply with gory and otherwise scary photos, along with escalating challenges and threats. These have been reported to include everything from turning on a stove at night to committing suicide. The recipients of these messages, children, are threatened into staying silent, especially around their parents.
It is worth noting that many of these claims are unproven; however, scary or threatening imagery in videos has been confirmed.
Moral of the Story: What Parents Can Do
Understand that applications like YouTube Kids work based on algorithms. Don’t rely upon them solely for your child’s entertainment.
Consider curating YouTube videos, or other content, and letting your child watch a pre-screened list.
Proactively talk to your child about the internet. Make sure they understand that what they see online is separate from who they are, and that they can tell you about anything they encounter. Check in with them regularly, and consider monitoring everything that your kids do online.
Bryan Crockett is a communications consultant and paralegal student in Oshawa, ON.