On Friday, YouTube announced it would be “retooling” its recommendation algorithm to keep people from being recommended the things they want to see (but that YouTube doesn’t want them to see). In an effort to “prevent promoting conspiracies and false information,”1 YouTube will now be more proactive about quelling what it sees as “misinformation on the world’s largest video platform.”2

In a blog post, the company said it was investigating how it could “reduce the spread” of content that comes close to but does not violate its rules. So, just to catch you up: they want to limit things that aren’t even against their rules, because those things don’t line up with their “truth.”

“The change to the company’s recommendation algorithms is the result of a six-month-long technical effort. It will be small at first — YouTube said it would apply to less than 1 percent of the content of the site — and affects only English-language videos, meaning that much unwanted content will still slip through the cracks.

The company stressed that none of the videos would be deleted from YouTube. They would still be findable for people who search for them or subscribe to conspiracy-focused channels.”3

(So we should view them as champions of free speech? Nah…I’m good.)

YouTube used to be a platform where basically anyone could participate, but recently it has been clamping down and deleting entire channels, some of which took years to build and held enormous amounts of content.

“YouTube’s recommendation feature suggests new videos to users based on the videos they previously watched. The algorithm takes into account ‘watch time’ — or the amount of time people spend watching a video — and the number of views as factors in the decision to suggest a piece of content. If a video is viewed many times to the end, the company’s software may recognize it was a high-quality video and automatically start promoting it to others. Since 2016, the company has also incorporated satisfaction, likes, dislikes, and other metrics into its recommendation systems.

But from a mainstream video, the algorithm often takes a sharp turn to suggest extremist ideas. The Washington Post reported in December that YouTube continues to recommend hateful and conspiratorial videos that fuel racist and anti-Semitic content.”4
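The quoted description boils down to a ranking formula built from engagement signals: watch time, view counts, and like/dislike “satisfaction” metrics. Purely to illustrate the idea, here is a minimal sketch of that kind of scoring; the field names, weights, and formula are illustrative assumptions, not YouTube’s actual system.

```python
import math
from dataclasses import dataclass


@dataclass
class VideoStats:
    """Engagement signals of the kind the article describes (hypothetical fields)."""
    views: int
    avg_watch_fraction: float  # share of the video a typical viewer watches, 0.0-1.0
    likes: int
    dislikes: int


def recommendation_score(v: VideoStats,
                         w_watch: float = 0.6,
                         w_views: float = 0.25,
                         w_satisfaction: float = 0.15) -> float:
    """Toy ranking score: a weighted mix of watch time, popularity, and satisfaction.

    The weights are arbitrary placeholders, not values used by YouTube.
    """
    popularity = math.log1p(v.views)  # damp raw view counts
    satisfaction = (v.likes - v.dislikes) / max(1, v.likes + v.dislikes)
    return (w_watch * v.avg_watch_fraction
            + w_views * popularity / 20.0  # rough normalization of the log scale
            + w_satisfaction * satisfaction)


# Rank a candidate list, highest score first.
candidates = [
    VideoStats(views=1_200_000, avg_watch_fraction=0.82, likes=45_000, dislikes=1_200),
    VideoStats(views=30_000, avg_watch_fraction=0.35, likes=400, dislikes=900),
]
ranked = sorted(candidates, key=recommendation_score, reverse=True)
```

The point of the sketch is only that adjusting weights like these, or bolting on a penalty for “borderline” content, lets a platform quietly demote videos in recommendations without ever deleting them.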

What do you think about YouTube’s desire to limit the type of content people want to see? Do you still use the platform? Many people view and label holistic medicine and the vaccine-choice stance as conspiracy theories, and for that reason we pay special and careful attention to what YouTube is doing.

Thankfully, for now YouTube is only trying to make the things it considers conspiracies harder to find, not deleting them altogether.


SOURCES:

  1. Washington Post
  2. Washington Post
  3. Washington Post
  4. Washington Post