I recently attended “RatFest 2025,” a small conference organized by the Conjecture Institute bringing together people who are sympathetic to critical rationalism—Karl Popper’s theory of science and knowledge.
It was fun! You can’t talk about whether abstractions are real, the correspondence theory of truth, the Duhem-Quine thesis, and the demarcation between science and metaphysics too often in daily life without severely hampering your dating prospects, so it was refreshing to be surrounded by people who couldn’t get enough of these topics. There were roughly 70 attendees—small enough to feel intimate, large enough for there to always be someone new to meet.
But despite the fun, I’m worried. I’m worried because I’m not sure whether “critrats” (what proponents of critical rationalism like to call themselves) can avoid the hard problem that most communities face as they grow: a problem we might call ideological ossification.
You would be hard pressed to find someone at RatFest who doesn’t believe in the many-worlds interpretation of quantum mechanics, or isn’t a libertarian, or thinks school is good, or takes evolutionary psychology seriously, or thinks AI poses a danger. Most people there would agree with the physicist David Deutsch, himself a student of Popper, about practically everything. Dissidents were there—John Horgan was one such voice—but they were few and far between.
This dynamic is to be expected, at least to some extent. Critrats are a group of people inspired by a particular set of ideas, so naturally their worldviews will be similar. And to be at all effective in the world, a community needs to share certain views. The libertarians won’t get very far if they start recruiting communists; unionists are unlikely to be anarchists; factory farming abolitionists probably won’t start questioning whether animals are sentient.
On the other hand, it’s easy for a community to become an ideological monoculture, in which everyone thinks the same way and certain shared premises become unquestionable and often go unmentioned. We’re all aware of this dynamic now, having been yelled at about echo chambers for several years. There are enough examples from across the political spectrum for you to pick your favorite one: BLM, MAGA, environmentalists, AI doomers, AI accelerationists, anti-natalists, anarcho-capitalists, communists, whatever.
Most communities are, I think, aware that their members tend to think the same way. Some praise this dynamic, convinced that they’re correct and that anyone who disagrees is a blasphemous moron. Political communities tend to be like this, as are most groups fighting for (what they’d describe as) “social justice.” And for some communities—those concerned with things besides politics and science and truth-seeking—this dynamic isn’t a problem. The local bowling community can be as much of a monoculture as it wants, and nobody is the worse for it.
Other communities recognize that groupthink is bad, and they do their best to fight it. In theory, a community can get around this problem by promoting particular epistemic values. They can support internal and external criticism. They can value humility and fallibilism, discourage hero worship, avoid arguments from authority or identity, and invite disagreement.
Of all the communities I can think of, effective altruism (EA) comes the closest to promoting and upholding these kinds of norms. They pay people to criticize them. They hold essay contests asking people to change their minds. They litigate niche disagreements publicly on the EA Forum. The comment section of Scott Alexander’s blog is like the Garden of Eden for calm and rational disagreement.
But despite all these healthy epistemic norms, EA has, from my perspective, effectively lost its mind over the past five years. It has gone from promoting the distribution of bed nets in sub-Saharan Africa to prioritizing shrimp welfare, fretting about death by AI superintelligence, trying to predict and influence the world one billion years from now, and unleashing a modern-day Bernie Madoff onto the world.
Some people will disagree that these are bad things, or that EA is at fault for them. Fair enough; maybe I’m just wrong. But insofar as you agree with me about any of these examples, it should worry you! To repeat: this community puts substantial effort into fostering healthy epistemic norms, more so than any other community I can think of. Plus they’re nice people! They’re thoughtful and smart. If they are susceptible to weird ideological trends like this, what hope is there for anyone else?
I find this genuinely troubling. Of course, I expect any community of people to get things wrong. But to my eye, EA has gone massively wrong. And more worryingly, they’re quickly becoming incorrigible.
Again—maybe you disagree with me about EA. Maybe you think they are perfectly undogmatic. But even then, you must marvel at how much effort is required to strike this balance between openness to criticism and having a community cohesive enough to successfully do things in the world. Even if EA has solved this problem, it’s clearly a hard problem to solve.
Building a community requires cohesion among the members, and cohesion most easily comes from having similar ideas. But having similar ideas pushes you towards ideological conformity. That’s a difficult dynamic to navigate, and I hope the critrats are up to it.
Hi Ben,
It was great seeing you at RatFest! I enjoyed your blog and I understand your concern about community building. Still, I disagree with you on some key points, as I explain below.
Firstly, I think you’re being too lenient towards the EA crowd. For a movement about charitable giving, it has developed a remarkably self-referential culture. Polyamory is unusually common among EA-ers, which should raise eyebrows. How does a philosophy of effective giving consistently attract, or perhaps create, people with the same unconventional relationship style? I’ve also met EA-ers with plainly irrational views, such as the belief that even mentioning Roko’s Basilisk to someone is not just wrong but reprehensible. At one EA conference I attended a couple of years ago, people presented a graph comparing the suffering of cows and crickets, complete with a y-axis labelled “amount of suffering” but no units. It’s the kind of mistake you learn to avoid in first-year science courses. Of course there couldn’t be any units, but then why have the graph at all? Nobody objected, and when I tried to raise the point, I was passed over. It was a simulacrum of science.
Any group of people who attend a conference will, naturally, share certain views and background assumptions. Physicists at a physics conference, or programmers at a tech conference, will agree on a great deal. That kind of cohesion isn’t a sign of irrationality. What matters is the community’s attitude toward disagreement: whether dissent is engaged with or quietly suppressed. In that respect, much of EA has drifted away from the scientific seriousness it claims to embody. It gestures towards rigour but rarely makes real contact with it.
RatFest, by contrast, didn’t feel like that at all. Genuine research scientists were in attendance, discussions were grounded, and obvious errors didn’t pass unchallenged. While many attendees shared similar views on certain topics, I didn’t sense hostility towards dissent or any in-group taboos. So if EA is the standard, RatFest far surpasses it!