You often hear about “ideology” these days.
Even if that word isn’t mentioned, it’s very much what’s being discussed. When President Donald Trump denounces the left, he’s talking about gender ideology or critical race theory or DEI. When the left denounces Trump, they talk about fascism. Wherever you look, ideology is being used to explain or dismiss or justify policies.
Buried in much of this discourse is an unstated assumption that the real ideologues are on the other side. Often, to call someone “ideological” is to imply that they’re fanatical or dogmatic. But is that the best way to think about ideology? Do we really know what we’re talking about when we use the term? And is it possible that we’re all ideological, whether we know it or not?
Leor Zmigrod is a cognitive neuroscientist and the author of The Ideological Brain. Her book makes the case that our political beliefs aren’t just beliefs. They’re also neurological signatures, written into our neurons and reflexes, and over time those signatures change our brains. Zmigrod’s point isn’t that “brain is destiny,” but she is saying that our biology and our beliefs are interconnected in important ways.
I invited Zmigrod onto The Gray Area to talk about the biological roots of belief and whether something as complicated as ideology is reducible to the brain in this way. As always, there’s much more in the full podcast, so listen and follow The Gray Area on Apple Podcasts, Spotify, Pandora, or wherever you find podcasts. New episodes drop every Monday.
This interview has been edited for length and clarity.
What is ideology? How are you defining it?
I think ideology has two components. One is a very fixed doctrine, a set of descriptions about the world that’s very absolutist, that’s very black and white, and that is very resistant to evidence. An ideology will always have a certain kind of causal narrative about the world that describes what the world is like and also how we should act within that world. It gives prescriptions for how we should act, how we should think, how we should interact with other people. But that’s not the end of the story.
To think ideologically is both to have this fixed doctrine and also to have a very fixed identity that influences how you judge everyone. And that fixed identity stems from the fact that every ideology, every doctrine, will have believers and nonbelievers. So when you think ideologically, you’re really embracing those rigid identity categories and deciding to exclusively affiliate with people who believe in your ideology and reject anyone who doesn’t. The degree of ideological extremity can be mapped onto how hostile you are to anyone with differing beliefs, whether you’re willing to potentially harm people in the name of your ideology.
You write, “Not all stories are ideologies and not all forms of collective storytelling are rigid and oppressive.” How do you tell the difference? How do you, for instance, distinguish an ideology from a religion? Is there room for a distinction like that in your framework?
What I think about often is the difference between ideology and culture. Because culture can encompass eccentricities; it can encompass deviation, different kinds of traditions or patterns from the past, but it’s not about legislating what one can or can’t do.
The moment we detect an ideology is the moment when you have very rigid prescriptions about what is permissible and what is not permissible. And when you stop being able to tolerate any deviation, that’s when you’ve moved from culture, which can encompass a lot of deviation and reinterpretations, to ideology.
How do you test for cognitive flexibility versus rigidity?
In order to test someone’s cognitive rigidity or their flexibility, one of the most important things is not just to ask them, because people are terrible at knowing whether they’re rigid or flexible. The most rigid thinkers will tell you they’re fabulously flexible, and the most flexible thinkers will not know it. So that’s why we need to use these unconscious assessments, these cognitive tests and games that tap into your natural capacity to be adaptable or to resist change.
One test to do this is called the Wisconsin Card Sorting Test, which is a card-sorting game where people are presented with a deck of cards that they need to sort. Initially, they don’t know the rule that governs the game, so they try to figure it out. And quickly, they’ll realize that they should match the cards in their deck according to their color. So they’ll start putting a blue card with a blue card, a red card with a red card, and they’ll get affirmation that they’re doing it right.
They start enacting this rule, adopting it, applying it again and again and again. And after a while, unbeknownst to them, the rule of the game changes and suddenly this color rule doesn’t work anymore. That’s the moment of change that I’m most interested in because some people will notice that change and they will adapt. They will then go looking for a different rule, and they’ll quickly figure out that they should actually sort the cards according to the shape of the objects on the card and they’ll follow this new rule. Those are very cognitively flexible individuals.
But there are other people who will notice that change and they will hate it. They will resist that change. They will try to say that it never happened, and they’ll try to apply the old rule, despite getting negative feedback. And those people who really resist the change are the most cognitively rigid people. They don’t like change. They don’t adapt their behavior when the evidence suggests that they should.
So if someone struggles to switch gears in a card-sorting game, that says something about their comfort with change and ambiguity in general. And someone who struggles with change and ambiguity in a card game will probably also have an aversion to something like pluralism in politics because their brain processes that as chaotic. Is that a fair summary of the argument?
Yeah, broadly. People who resist that change, who resist uncertainty, who like things to stay the same, really don’t like it when the rules change. Often those are the most cognitively rigid people, people who don’t like pluralism, who don’t like debate.
But that can really coexist on both sides of the political spectrum. When we’re talking about diversity, that can be a more politicized concept, and you can still find very rigid thinkers being very militant about certain ideas that we might say are progressive. So it’s quite nuanced.
It’s easy to understand why being extremely rigid would be a bad thing. But is it possible to be too flexible? If you’re just totally unmoored and permanently wide open and incapable of settling on anything, that seems bad in a different way, no?
What you’re talking about is a kind of immense persuadability, but that’s not exactly flexibility. There is a distinction there because being flexible is about updating your beliefs in light of credible evidence, not necessarily adopting a belief just because some authority says so. It’s about seeing the evidence and responding to it.
Focusing on rigidity does make a lot of sense, but is there a chance you risk pathologizing conviction? How do you draw the line between principled thinking and dogmatic thinking?
It’s not about pathologizing conviction, but it is about questioning what it means to believe in an idea without being willing to change your mind on it. And I think that there is a very fine line between what we call principles and what we call dogmas.
This gets particularly thorny in the moral domain. No one wants to be dogmatic, but it’s also hard to imagine any kind of moral clarity without something like a fixed commitment to certain principles or values. And what often happens is if we don’t like someone’s values, we’ll call them extremists or dogmatic. But if we like their values, we call them principled.
Yeah, and that’s why I think that a psychological approach to what it means to think ideologically helps us escape from that kind of slippery relativism. Because then it’s not just about, Oh, where is someone relative to us on certain issues on the political spectrum? It’s about thinking, Well, what does it mean to resist evidence?
There is a delicate path there where you can find a way to have a moral compass — maybe not the same absolutist moral clarity that ideologies try to convince you exists, but you can have a morality without having really dogmatic ideologies.
How much of our rigid thinking is just about our fear of uncertainty?
Ideologies are our brains’ way of solving the problem of uncertainty in the world because our brains are these incredible predictive organs. They’re trying to understand the world, looking for shortcuts wherever possible because it’s very complicated and very computationally expensive to figure out everything that’s happening in the world. Ideologies kind of hand that to you on a silver platter and they say, Here are all the rules for life. Here are all the rules for social interaction. Here’s a description of all the causal mechanisms for how the world works. There you go. And you don’t need to do that hard labor of figuring it all out on your own.
That’s why ideologies can be incredibly tempting and seductive for our predictive brains that are trying to resolve uncertainty, that are trying to resolve ambiguities, that are just trying to understand the world in a coherent way. It’s a coping mechanism.
In the book, you argue that every worldview can be practiced extremely and dogmatically. I read that, and I just wondered if it leaves room for making normative judgments about different ideologies. Do you think every ideology is equally susceptible to extremist practices?
I sometimes get strong opposition from people saying, Well, my ideology is about love. It’s about generosity or about looking after others. The idea is that these positive ideologies should be immune from dogmatic and authoritarian ways of thinking. But this research isn’t about comparing ideologies as these big entities represented by many people. I’m asking if there are people within all these ideologies who are extremely rigid. And we do see that every ideology can be taken on militantly.
Not every ideology is equally violent or equally quick to impose rules on others, but every ideology that has this very strong utopian vision of what life and the world should be, or a very dystopian fear of where the world is going, all of those have a capacity to become extreme.
How do you think about causality here? Are some people just biologically prone to dogmatic thinking, or do they get possessed by ideologies that reshape their brain over time?
This is a fascinating question, and I think that causality goes both ways. I think there’s evidence that there are preexisting predispositions that propel some people to join ideological groups. And that when there is a trigger, they will be the first to run to the front of the line in support of the ideological cause.
But at the same time, as you become more extreme, more dogmatic, you are changed. The way you think about the world, the way you think about yourself, changes. You become more ritualistic, more narrow, more rigid in every realm of life. So yes, ideology also changes you.