"This past week, the company announced that it would expand that approach, so that a person who had watched a series of conspiracy theory videos would be nudged toward videos from more authoritative news sources. It also said that a January change to its algorithm to reduce the spread of so-called “borderline” videos had resulted in significantly less traffic to those videos.
In interviews, YouTube officials denied that the recommendation algorithm steered users toward more extreme content. The company’s internal testing, they said, has found just the opposite: users who watch one extreme video are, on average, recommended videos that reflect more moderate viewpoints. The officials declined to share this data or to give any specific examples of users who were shown more moderate videos after watching more extreme ones.
The officials stressed, however, that YouTube realized it had a responsibility to combat misinformation and extreme content.
“While we’ve made good progress, our work here is not done, and we will continue making more improvements this year,” a YouTube spokesman, Farshad Shadloo, said in a statement.