Not sure. Today, David Brooks calls these forces “anti-Enlightenment”:
In 2004, amazed that Bush could win re-election despite so many catastrophic blunders, I used the same construct in the subtitle for my blog, Sustainable Argument: “Defending Enlightenment Values Against a Right-Wing Cultural Revolution”.
By 2008, I had wised up. The blog was Two Tribes: “1. Just about everyone knows who they are going to vote for. Their reasons are often tribal rather than rational or based on strict adherence to a political or governing philosophy. 2. Elections are decided by the people who are left: people who don’t care, who frequently aren’t qualified to judge, and who often make their choice based on irrelevant or even inane reasons.”
In day-to-day life, this might work to some degree:
Can We Reduce Political Polarization? | WGBH News
“People in the political domain suffer from what’s called ‘the illusion of explanatory depth,’ [meaning] they think they understand things that they just don’t,” Sloman explains. “We believe that this is one of the sources of the contentiousness in our political world. When we think we understand things, it seems to lead us to take strong, intransigent positions. But fortunately what we know is that it’s easy to break people of that feeling.”
As Sloman indicates, people believe they understand political positions — such as the issues that divided the political parties during the election. They think they hold their political views because they understand what each policy entails and how it would work.
But we understand less than we think. Many of the policies we feel strongly about are incredibly complex, and most people don’t actually understand the specifics of how and why they might function in the real world.
It is this underlying uncertainty that can help us combat polarization. When approaching someone with an opposing viewpoint — say foreign relations with China — Sloman says you shouldn’t question their views directly, asking, “Why do you think we should stand up to China?” That approach actually spurs people to become more defensive.
Instead, to combat polarization, studies show you should ask people to explain how the policy they believe in actually works — for example, why standing up to China would help us create jobs.
“What you have to ask them is: ‘Why is it that you believe this policy is going to lead to the things that you want?’ So you have to ask people to unpack the causal mechanism that underlies the policy that you’re talking about,” Sloman explains. “What we find is that when we ask people those questions they have much less to say than they think they do.”
Once people realize that they don’t have a precise understanding of any given policy measure, they moderate their views and come to understand that the connection they believed to be so obvious is, in fact, a lot less clear. They also become open to new ideas — to different explanations about whether talking to China about changing their exchange rate really has much of an impact on creating jobs in New Jersey or Kansas.
Scholars who study this phenomenon say that one of the benefits of the research is that it helps people to understand what they know and don’t know. Once you understand the brain’s tendency to overestimate itself — whether we’re talking about comprehending the intricacies of tax policy or even something as simple as how a pen really works — it can help us to embrace a kind of humility that we didn’t see during the election season.
Anecdotal evidence for that:
Maybe things just have a beginning and an end: