YouTube wants to stop recommending videos that promote conspiracy theories and hoaxes.
Because even the most innocent searches on YouTube can algorithmically lead you to videos claiming, for example, that alien civilizations live among us, the company has decided to take action. The YouTube Kids app is not immune either, as such videos surface there as well.
Google's streaming service says it will stop recommending "borderline" videos that skirt its YouTube Community Guidelines or that "misinform users". Examples of such videos include flat-Earth theories, misinformation about the collapse of the Twin Towers on 9/11, and miracle cures for serious diseases.
An algorithm will decide which videos are excluded from recommendations, striking a balance between freedom of speech and YouTube's responsibility to its users. Some may find this a questionable decision, given that algorithms tasked with making such selections tend to create problems of their own. In any case, the new policy will be rolled out gradually, starting with a small number of videos in the US before expanding globally as the algorithm learns and becomes more refined.
According to YouTube, the change affects just under 1% of videos, but given the enormous number of clips on the platform, the move will still impact millions of videos.
Still, "borderline" videos will continue to appear in search results, and you will see them in your recommendations if you have subscribed to a channel that hosts such content.