Tenchu's Thoughts: What Exactly Are The "Algorithms"?

"Please tell me what you're referring to so I can understand what you're talking about."

Sure thing.
The "algorithms" are what's driving social media these days. It's the primary engine that makes social media so appealing, so popular, and so profitable.
Basically, it's the way YouTube, TikTok, Twitter/X, Reddit, and other popular social media platforms deliver content they think you may be interested in. They do that by watching what you click on, how much time you spend watching a video or looking at a post, how much effort you put into commenting on it, and generally how much you engage with the content being presented. That information then gets fed back into the "algorithms" to refine what shows up on your feed.
The purpose of the "algorithms" is simple: get you to keep clicking, keep watching, and keep engaging with the content. That way, the social media platforms can drive advertisements into your feed in a way that increases the likelihood you'll click on those as well. Voila, the clicks go up, more clicks mean more advertising revenue, and now your engagement has been monetized.
And of course, these social media platforms will share some of that monetization with the creators of said content.
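If it helps to picture the mechanism, here's a minimal sketch in Python of that engagement feedback loop. Everything in it (the signal weights, the "topic affinity" idea, the function and field names) is a made-up illustration of the general technique, not any platform's actual ranking code.

```python
# Hypothetical sketch of an engagement-driven feed ranker.
# Weights, names, and signals are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    topic: str
    base_score: float = 1.0

@dataclass
class UserProfile:
    # Learned affinity per topic, updated from engagement signals.
    topic_affinity: dict = field(default_factory=dict)

def record_engagement(user: UserProfile, post: Post,
                      clicked: bool, watch_seconds: float, commented: bool) -> None:
    """Feed engagement signals back into the user's profile."""
    signal = (1.0 if clicked else 0.0) + watch_seconds / 60.0 + (2.0 if commented else 0.0)
    current = user.topic_affinity.get(post.topic, 0.0)
    # Exponential moving average, so recent behavior dominates.
    user.topic_affinity[post.topic] = 0.8 * current + 0.2 * signal

def rank_feed(user: UserProfile, candidates: list[Post]) -> list[Post]:
    """Rank candidate posts by predicted engagement for this user."""
    return sorted(
        candidates,
        key=lambda p: p.base_score * (1.0 + user.topic_affinity.get(p.topic, 0.0)),
        reverse=True,
    )

# Usage: the more a user engages with one topic, the more of it they see,
# which is exactly the siloing effect discussed below.
user = UserProfile()
feed = [Post("a", "politics-left"), Post("b", "politics-right"), Post("c", "cooking")]
record_engagement(user, feed[0], clicked=True, watch_seconds=120, commented=True)
print([p.post_id for p in rank_feed(user, feed)])
```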

How the "Algorithms" Destroy Honest, Constructive Debate
Here's the problem with the "algorithms." They destroy honest, constructive debate. They cause all of us to get siloed into our own algorithmically-created cliques.
Are you a Democrat? The "algorithms" will steer you toward left-leaning channels like Legal Eagle, Dark Brandon, CNN, and MSNBC.
Are you a MAGA cultist? The "algorithms" will steer you toward Benny Johnson, Ben Shapiro, Jordan Peterson, FOXNews, and yes, Charlie Kirk.
But it's not just the "algorithms" that are causing the big divide. And this is the key: content creators themselves will compromise their own intellectual integrity in order to cater to the "algorithms."
In other words, they will say whatever their viewers want them to say in order to keep the monetization train running. The moment they say anything that upsets their viewers, they will get a torrent of "dislikes" and "unsubscribes," which directly impacts the amount of money they get.
For example, let's say a left-wing channel says that Kamala Harris sucks. She has a history of running terrible campaigns, she has a real penchant for word salads, and she was only VP because she was a DEI hire. Point that out to your audience, which decidedly leans left, and they will go nuts. They will call you a racist, they will call you a Trump supporter, and they will unsubscribe from your channel and tell everyone else to do likewise. That directly impacts your income.
Or let's say you're the main personality of a right-leaning channel, and you say that Donald Trump sucks. He is no conservative, but rather a fascist with declining brain function and an unbelievably narcissistic personality. Point that out to your audience that is undoubtedly MAGA, and they will go nuts. They will call you a libtard, they will call you a "woke" Democrat, and they will unsubscribe from your channel and tell everyone else to do likewise. Once again, that directly impacts your income.
Come to think of it, I can't think of any other moment in American history when there was almost a one-to-one correspondence between how popular your political views are in public forums and how much money you make from them. In the past, such an association would have represented a CLEAR conflict of interest, but these days, we just accept it as the "new normal."
How Does This Relate to Charlie Kirk?
Well, for starters, I noticed that the event where Charlie Kirk was assassinated was part of his "Prove Me Wrong" tour. That tour is nothing but bait for anyone who believes it is an honest and open forum for debate. Charlie Kirk is a master of "gotchas," where he will try to nail someone for saying something that may be inconsistent or self-contradictory. But he himself often veers into territory where he is also being inconsistent and self-contradictory.
Even the final question that he was answering at the event, right before he was shot, was about transgendered shooters. Kirk was obviously advancing a right-wing narrative which claims that transgendered people are more likely to become mass shooters than anyone else. Intellectually speaking, that is an absurd idea to entertain, but from the POV of the "algorithms," it was exactly what his supporters wanted to hear.
The follow-up question was about how many mass shootings are out there. Even the response from Charlie Kirk was logically fallacious and unfairly dismissive when he retorted, "Counting or not counting gang violence?" This is a distraction and a dishonest way to throw off the guy who was asking the question. Why should this distinction matter, the difference between gang violence and non-gang violence? Is gang violence not part of the whole problem with gun violence? Why should we discount gang-related mass murders when we talk about all those "transgendered mass murderers"?
Unfortunately, right after that, Charlie Kirk was shot.
This is why I say that Charlie Kirk lived by the algorithms and died by the algorithms. His final message wasn't "one of love, of Jesus Christ, of his belief in prayer," despite what former Utah Rep. Jason Chaffetz claimed. (That, too, was a DISGUSTING display of dishonesty wrapped in the oh-so-holy image of Jesus Christ.)
Instead, his final message was "I'm right, you're wrong, neener neener." It was harmful, it was intellectually dishonest, and it was in a forum that was designed from the beginning to make challengers look like chumps. Yet that's exactly what puts food on the table for his wife and his two infant children.
When all of his expressed views are examined through the lens of the "algorithms," everything else he says makes more sense, at least from the POV of a self-serving content creator.
That includes his statement on gun violence and how a few extra deaths should be tolerated in order to maintain the 2nd amendment. I'm not sure he fully believed in that, which is why I won't condemn him (that much) for saying it.
That includes his statement on empathy and why he oddly drew a distinction between "empathy" and "sympathy." Again, that doesn't seem to mean anything, except in an "algorithm-friendly" way, so I won't use it as an excuse to not feel any empathy for him and his family. Or sympathy.
Now I'm sure that deep down, Charlie Kirk really was a Christian, a loving father, and a pro-family conservative who had some strongly-held beliefs. But I also believe the "algorithms" were wearing away at his intellectual integrity and his character. They had the potential to wear him down until there was nothing left but a facade, a personality, a soul that sought nothing but "likes" from the "right people."
So what?
You might counter and say, "So what? That's what FOXNews does. That's what MSNBC does. That's even what the NYTimes, the WSJ, and CBS do, don't they?" And you'd be right, to an extent.
But traditional media isn't as directly connected to the "algorithms" as the new media is. It just doesn't fit the same way.
You see, traditional media has to cater to a wider audience. They can't custom-tailor each individual's news feed. The NYTimes, for example, can't just publish articles that one reader prefers while publishing an entirely different set of articles for another reader.
Moreover, in traditional media, if a media outlet skews too far away from objective journalism, they will become too niche and too specialized to maintain a wide viewership.
Not so with social media "algorithms." Here, it pays VERY well to be biased, to be disingenuous, to be self-contradictory and loose with the facts.
Does that make Charlie Kirk a despicable liar? Maybe. But I also view him as the product of our social media age, someone who was tempted by the "algorithms" to twist his own views for the benefit of his followers. And he looked like he had fun doing it.
Like it or not, the "algorithms" will be shaping public debate for years to come. I personally don't like it, but what can I do? There are benefits to it, especially given how they enable pretty much anyone with an opinion to get access to a potentially huge audience.
But we have to recognize how the "algorithms" are also causing massive harm to our political environment. And I think that's what I want to try and focus on, starting from today and going on into the future, especially as AI starts to control these "algorithms."
Hope that explains things a bit, Heywood.
Tenchusatsu