Is YouTube’s Ad Moderation Failing to Protect Children from Inappropriate Content?
YouTube's advertisements continue to frustrate users, with many calling for significant changes. Issues range from deceptive ads promoting mobile games that don't match their advertised content, to sexually explicit ads that make viewers uncomfortable. Google claims it actively blocks inappropriate sexual content in its ads and ensures such ads are not shown to minors. However, a recent incident shared on Reddit sheds light on the inconsistency of these claims, particularly concerning the safety of young viewers.
A Reddit post, reported by Android Authority, highlights an incident where a seven-year-old child, while streaming content on YouTube, was shown an ad featuring a porn star who resembled the YouTube streamer Loserfruit. This uncomfortable situation was witnessed by a family member who was present, prompting concerns about YouTube's ad moderation system.
While it's easy to point fingers at Google, context is key. The child was using their mother's phone, a device they do not regularly access, and it remains unclear whether the child was logged into their own YouTube account. YouTube’s policy states that users must be at least thirteen years old to create an account, although this doesn’t mitigate the issue of inappropriate content being shown to anyone, regardless of age. For parents seeking a safer experience, the YouTube Kids app offers a more controlled environment, and recent updates have made it more similar to the regular YouTube app.
Google’s consistent failure to filter out inappropriate ads, from explicit content to misleading game advertisements, has led to growing frustration among users. This frustration has contributed to a sharp increase in the use of ad blockers, as users seek ways to avoid these intrusive and harmful ads. In response, Google has taken measures to combat ad blockers, using tactics like server-side ad injection to force ads on users, despite the fact that these ads are often filled with misleading information, explicit or NSFW content, or even malware.
Although the incident shared on Reddit may seem overblown, especially given that the child was using an adult's phone and wasn’t technically old enough to have a regular YouTube account, the larger issue remains. YouTube’s ads have become increasingly unpopular, with both children and adults accidentally exposed to inappropriate content. Google’s inadequate AI-driven moderation system is a major contributor to this problem, and unless the company takes swift action to improve its ad filtering, it risks alienating more users. As complaints about YouTube's ads continue to grow, more people will direct their anger toward Google, regardless of who is technically responsible for these incidents.