YouTube’s video recommendation algorithm keeps serving recommendations as long as viewers keep watching. In general, videos that answer users’ questions and hold viewers’ attention perform best in YouTube’s rankings. According to the Pew Research Center, 81% of U.S. YouTube users report regularly watching videos recommended by the YouTube algorithm.
Most YouTube users are signed in, and YouTube tailors recommendations heavily to each user’s viewing history, so there is no single feed that every user sees. According to YouTube, a number of different factors influence which videos are chosen and ranked for each person.
YouTube uses several algorithms to rank content on the home page, in search results, and among suggested videos. First, the algorithms rank videos by how well they perform according to multiple recommendation factors; second, they match those videos with potential viewers based on each viewer’s history and on what similar viewers have watched. Like comparable platforms, YouTube relies on complex algorithms to decide where a video appears in its recommendations and lists.
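As a rough illustration of that two-stage flow, the sketch below first gathers candidate videos from viewers with overlapping histories, then ranks them by a performance signal. All user names, video IDs, and scores here are hypothetical, and the real system uses far richer signals and deep neural networks rather than simple co-watch counts:

```python
from collections import Counter

# Hypothetical watch histories: user -> set of watched video IDs.
HISTORIES = {
    "alice": {"v1", "v2", "v3"},
    "bob":   {"v2", "v3", "v4"},
    "carol": {"v3", "v4", "v5"},
}

# Hypothetical per-video performance signals (e.g. normalized watch time).
PERFORMANCE = {"v1": 0.4, "v2": 0.9, "v3": 0.7, "v4": 0.8, "v5": 0.6}

def candidate_videos(user, histories):
    """Stage 1: collect unseen videos watched by viewers with overlapping histories."""
    seen = histories[user]
    counts = Counter()
    for other, videos in histories.items():
        if other == user or not (videos & seen):
            continue  # skip the user themself and viewers with no overlap
        for v in videos - seen:
            counts[v] += 1  # co-watch count acts as a crude similarity signal
    return counts

def recommend(user, histories, performance, k=2):
    """Stage 2: rank candidates by co-watch count weighted by performance."""
    counts = candidate_videos(user, histories)
    ranked = sorted(counts, key=lambda v: counts[v] * performance[v], reverse=True)
    return ranked[:k]

print(recommend("alice", HISTORIES, PERFORMANCE))  # → ['v4', 'v5']
```

Both "v4" and "v5" were watched by viewers similar to "alice"; "v4" ranks first because more similar viewers watched it and it performs better.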
Ultimately, if the system favors conspiracy videos, creators are incentivized to make more of them. In deciding whether to recommend a video to others, YouTube weighs how individual viewers react to it. YouTube has said it is merely reflecting what users want to see, with videos chosen according to their personal profiles and viewing histories. To keep users watching, YouTube uses a recommendation system powered by first-class AI (it is Google, after all).
Publicly, executives say these systems guide more than 70% of the content watched on YouTube, and that they are getting better at it all the time. Members of the team behind the algorithm have also revealed insights into how different factors affect it: in 2016, a team of Google engineers published a whitepaper explaining YouTube’s machine-learning-based recommendation algorithms and how videos can be surfaced to provide a better user experience.
To collect data about the particular recommendations made to YouTube users, information that Google, YouTube’s parent company, does not normally share with outside researchers, Mozilla took a crowdsourcing approach: a browser extension called RegretsReporter lets users self-report YouTube videos they regret watching. In 2012, YouTube announced an update to its search system designed to surface the videos people would actually want to see. Like Netflix, YouTube uses artificial intelligence to identify the best videos for each viewer (or at least for whichever account is currently signed in). YouTube’s chief product officer, Neal Mohan, acknowledged at last year’s CES that over 70% of the videos viewed on YouTube are the result of suggestions from its AI-driven recommendation system.
The problem becomes how to suggest new videos that users will want to watch when those videos are new to the algorithms and have little viewership. If you can convince a new user to keep watching more content after clicking on one of your videos, you increase the chances that your videos will be recommended to them the next time they open YouTube.
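One common way to handle this cold-start problem is to add an exploration bonus to the ranking score, so that rarely shown videos still get some impressions while proven performers keep most of the traffic. The sketch below uses a UCB-style bonus; the candidate data and scoring formula are illustrative assumptions, not YouTube’s actual method:

```python
import math

# Hypothetical candidates: (video_id, impressions, engaged_watches).
CANDIDATES = [
    ("old_hit",   10000, 6000),  # well established, strong engagement
    ("old_dud",   10000, 1000),  # well established, weak engagement
    ("brand_new",    20,    12), # barely shown yet
]

TOTAL_IMPRESSIONS = sum(n for _, n, _ in CANDIDATES)

def score(impressions, engaged, total):
    """Engagement rate plus a UCB-style exploration bonus that is large
    for rarely shown videos and shrinks as impressions accumulate."""
    rate = engaged / impressions
    bonus = math.sqrt(2 * math.log(total) / impressions)
    return rate + bonus

ranked = sorted(CANDIDATES,
                key=lambda c: score(c[1], c[2], TOTAL_IMPRESSIONS),
                reverse=True)
print([vid for vid, _, _ in ranked])  # → ['brand_new', 'old_hit', 'old_dud']
```

Here the new video outranks even the established hit because its uncertainty bonus is large; after enough impressions the bonus fades and only its real engagement rate matters.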