If only Karl Popper had a black belt in Jiu Jitsu, so he could beat Joe Rogan's ass
A rant about critical thinking, annoying but popular podcasts, the Covid Lab Leak Theory and lots of cognitive biases
I don’t know why I sometimes come up with these rants. But I think I just have to get them out of my system. If even one reader has a different perspective, or learned something about critical thinking after this, it has helped. For the record, I don’t really enjoy writing rants and I think I’ll try to minimize them. Here it goes.
Quite a few of my peers (intellectually, financially, etc.) are into Joe Rogan, Lex Fridman, and the All-In Podcast. Broadly, what has been called the Intellectual Dark Web. The All-In Podcast is the least interesting one for me. After listening to about five episodes, it's clear enough that the only interesting thing about it is getting a peek through the lenses of these billionaire fuckbois. They talk bullshit so confidently that in cases where I don't know the subject they sound smart, but every time they discuss something where I DO understand the subject they sound like complete idiots. Someone on Reddit had a funny take on the four stages of listening to the All-In Pod:
Stage one: man these guys are funny and must be smart.
Stage two: You start to notice that things they say don't quite add up, but they might know thing[s] you don't. Also it's good to get other perspectives.
Stage three: you start to realize there's something really off about the pod. Are these guys idiots? They can't be this stupid. They have so much money. Whats goin on?
Stage four: holy shit! These guys are in on the grift!!?! They just want money and power and will say anything to get it.
I came to the conclusion that not only are they annoying, they are underprepared and underinformed on the topics they discuss, and maybe acting in bad faith. I'm not sure about the bad faith, because I like Hanlon's razor: "never attribute to malice that which can be adequately explained by incompetence."
So, let’s move on to Joe Rogan and Lex Fridman.
To me, what Joe Rogan and Lex Fridman have in common is that they approach every guest with an open-mindedness so vast it can almost be described as a void. They are two of the most softball interviewers ever. This is presented as a feature, and I agree that it is their competitive advantage. They really allow their guests to present themselves however they want. This gets them some very famous guests who genuinely let their guard down, which has produced many genuinely interesting snippets. But ultimately I think a good interviewer has to walk the line between building rapport and challenging a guest. With Fridman and Rogan it feels more like they just desperately want to be friends with their guests.
But this is not what really annoys me about their content. What really annoys me is their intellectual style. By presenting orthodox, mainstream views as almost by definition wrong, and the act of questioning consensus as by definition virtuous, they contribute to a wider misinformation problem. Their listeners seem greatly attracted to this out-of-hand rejection of mainstream knowledge posing as intelligent questioning. It got to the point where I actually had to debate (albeit briefly) one of my best friends on whether or not the moon landing was faked.
What they are doing is always the same, and the sequence is as follows:
We start from the premise that whatever is mainstream is probably false
We then search for alternative explanations that sound cool, interesting or entertaining
We then ask ourselves: “what if this was true?” and try to confirm that explanation with evidence
They almost deliberately set themselves up to fall victim to confirmation bias, because with enough effort it is usually possible to make something that's wrong sound at least plausible. And plausible is enough for them to claim probable. For example, it would be quite funny - not to mention very tasty - if eating steak every day were healthy. It's fairly easy to cherry-pick data here and there to weave a plausible and even scientific-sounding narrative that eating steak every day is healthy.
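To make that cherry-picking mechanism concrete, here is a minimal sketch (my own toy simulation, not data from any real study): we generate a fictional population where steak has exactly zero effect on health, and then show how slicing the data enough ways still produces a subgroup where steak "wins."

```python
# Toy simulation: steak has NO true effect on health, yet cherry-picking
# still produces an impressive-looking result. Illustrative sketch only.
import random

random.seed(42)

# 1,000 fictional people: weekly steak count and a health score that is pure
# noise, i.e. completely unrelated to steak.
people = [
    {"steaks_per_week": random.randint(0, 7), "health": random.gauss(50, 10)}
    for _ in range(1000)
]

def average_health(group):
    return sum(p["health"] for p in group) / len(group)

steak_eaters = [p for p in people if p["steaks_per_week"] >= 5]
abstainers = [p for p in people if p["steaks_per_week"] == 0]

# Honest comparison: the averages are nearly identical, because there is no effect.
print("all steak eaters:", round(average_health(steak_eaters), 1))
print("all abstainers:  ", round(average_health(abstainers), 1))

# Cherry-picked comparison: slice the data many ways and report only the slice
# that flatters the hypothesis. Here we fake that by drawing 200 small
# subsamples and keeping the single best-looking gap.
best_gap = max(
    average_health(random.sample(steak_eaters, 30))
    - average_health(random.sample(abstainers, 30))
    for _ in range(200)
)
print("best cherry-picked gap:", round(best_gap, 1), "points in favor of steak")
```

The honest comparison shows no difference; the "best" subgroup shows a gap that would make a great headline. That is all confirmation bias needs.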
Anyway, we are now crossing into the domain of epistemology, the philosophical study of knowledge. It is a great personal interest of mine, and when I have a personal interest I tend to go deep down the rabbit hole. Let's go!
Karl Popper was one of the most influential philosophers of science. His most important contribution was to shift science away from inductive justificationism, i.e. "this is true because we keep finding evidence that fits," towards empirical testing. Another way to frame it is that scientific inquiry should aim to falsify hypotheses rather than confirm them.
Popper's core epistemological theory is called Critical Rationalism: theories are constantly subjected to scrutiny and potential refutation. The approach is about challenging ideas in order to improve them or replace them with better ones. OH THE IRONY! This is exactly what Rogan and his fans think they are doing. But they don't actually spend any time challenging or falsifying anything, ever. The only thing they do is propose alternative ideas that they then try to justify. And you cannot falsify something by justifying an alternative. That is one form of a fallacy known as reversing the burden of proof.
Taking this to the carnivore diet: it makes no sense to try and inductively 'prove' that it's beneficial. That's not science. The question is not: "Someone claims eating nothing but steaks is healthy. How could that be true?" The right question is: "HOW COULD THAT BE FALSE?" Well, for one, there is a mountain of evidence that eating lots of red meat is correlated with a higher incidence of cancer, high cholesterol, and heart disease. The other missing step is applying the same scrutiny to the current mainstream consensus and asking: "how could eating mostly plants be bad for you?" It turns out that it's very hard to falsify the mainstream belief that "eat food, not too much, mostly plants" is the healthiest diet.
Helpful reminder of some critical thinking mental models
Here is a quick primer on some of my favorite razors and fallacies. Let’s start with razors. A useful way to think is by elimination. If you don’t know the answer, you can at least try to narrow it down by crossing off answers that are clearly wrong. This thinking method can be conceptualized as a RAZOR. With a razor, you can shave off unlikely explanations to get closer to the truth. Some good ones are:
Occam’s Razor: the simplest explanation is preferable to one that is more complex. Simple theories are easier to verify. Simple solutions are easier to execute. Most conspiracy theories don’t make it past this razor.
Hanlon's Razor: One more time! Never attribute to malice that which can be adequately explained by incompetence.
Popper's falsifiability principle: For a theory to be considered scientific, it must be falsifiable.
The other fun category of thinking mistakes is usually just called biases. Some of my favorite ones are:
Confirmation bias: People have a tendency to search for and favor information that matches what they believe, or want to believe.
Survivorship bias: The Allies in WW2 were losing lots of bombers. The returning bombers had bullet holes concentrated in particular areas (mostly the wings and fuselage), so the military planned to add extra armor in those frequently hit areas. Until a mathematician (Abraham Wald) pointed out the mistake: these bombers did return, so getting shot in those places apparently does not take a bomber down. The ones that didn't return were hit in exactly the places that were spared on the surviving bombers.
Availability bias: More weight is given to whatever comes to mind easily. Something that happened recently, or that you happened to witness yourself, feels significantly more likely than it really is.
Salience bias: Things that are more striking, interesting or funny are given more weight than they deserve. Elon Musk’s fake ‘Razor’ is an example of this: “the most entertaining outcome is the most likely.” For the record, that is obviously not true.
Base rate neglect: A breathalyzer test always detects a drunk driver, but has a 5% false positive rate. 1 in 1,000 drivers drives drunk. The police stop a random driver, and the breathalyzer shows them to be drunk. How likely is it that they are actually drunk? Given this question, people generally estimate the probability at something like 90%, but the true answer is about 2%. How can that be? If 1,000 random drivers were breathalyzed, there would be 1 true positive (the drunk driver) and 5% × 999 ≈ 50 false positives. So the probability that a random driver who tests positive is truly drunk is roughly 1/51 ≈ 2%. (A quick sanity check of this arithmetic follows right after this list.)
Gambler's fallacy: The tendency to think that future probabilities are affected by past independent events, like believing red is 'due' at the roulette table after a long run of black.
Hindsight bias: The tendency to think “I knew it all along” even when that was not the case. Journaling helps a lot for this.
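For anyone who wants to double-check the base rate neglect numbers above, the whole calculation fits in a few lines. This is just a throwaway sketch using the exact figures from the example (1-in-1,000 prevalence, perfect sensitivity, 5% false positives):

```python
# Sanity check for the breathalyzer example above (illustrative sketch only).
prevalence = 1 / 1000          # 1 in 1,000 drivers is actually drunk
sensitivity = 1.0              # the test always flags a drunk driver
false_positive_rate = 0.05     # 5% of sober drivers get flagged anyway

# Bayes' rule: P(drunk | positive) = P(positive | drunk) * P(drunk) / P(positive)
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_drunk_given_positive = sensitivity * prevalence / p_positive

print(f"P(drunk | positive test) = {p_drunk_given_positive:.1%}")  # prints ~2.0%
```

The intuition-breaking part is the denominator: the roughly fifty false positives from sober drivers completely swamp the single true positive.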
A seemingly large number of my friends believe the Covid lab leak theory. I had not really looked closely into it myself. It seemed at least feasible, having spent some time in labs myself. Besides, dangerous biological agents have escaped from labs before. It has become a highly politicized topic, and - not coincidentally for a post like this - a bit of a litmus test. You can't really be a Joe Rogan or All-In listener without believing Covid leaked from the Wuhan virology lab. When I listened to Trump's interview on the All-In pod, Trump said matter-of-factly: "I said covid came from the Wuhan lab, I was right about that," to the approving nods of the All-In 'besties'.
But after doing my own review I've concluded it is faaaaarrrrrrrr less likely than the mainstream theory that the virus naturally jumped from bats. Although it's not possible to completely rule out a lab leak, I would put the odds at at least 100,000 to 1 in favor of the mainstream zoonotic theory. I briefly tried to write it all up here as a case study, but others have done a far better job than I could:
Have fun!
This is nicely written, Gijs, and really verbalises one of my long-time gripes very well.
As a boy, I would read articles in the paper on how drinking a glass of wine a day extends your life expectancy by 5 years on average. I remember thinking: "and how the fuck are you going to back that up with proper research?" This held true for most articles about nutrition science, with many headlines contradicting each other.
The spread of misinformation is fuelled by the quest for the biggest viewer and readership numbers. This game was taken to the next level of silliness when social media came along and injected it with steroids.
So much so that my trust in the reporting of this science never really recovered, and I find myself instinctively skipping all articles about what you should eat or drink.
So interestingly enough, I do find myself reverting back to intuition. But at least it's my own, and hopefully limited in its bias due to misinformation spread with adequate incompetence.
I guess that will stay that way until I find the time and patience to dive into the research myself.