
The Lawyer and the Scientist: Why Critical Thinking Is So Hard

Updated: Jan 2

We like to think of ourselves as rational creatures. We imagine our minds as truth-seeking instruments—mini scientists carefully weighing evidence, updating beliefs, and following facts wherever they lead. This is how we experience our thinking from the inside: measured, reasonable, and fair.

But this is largely an illusion.


From an evolutionary perspective, the human mind did not evolve to discover truth. It evolved to help us survive and reproduce. Truth was useful only insofar as it served those goals—and often, it didn’t. What we evolved instead is something closer to a lawyer’s mind than a scientist’s.


Survival First, Truth Second


Don't get me wrong: for most of human history, being wrong about the physical world was costly and dangerous (think of misjudging whether a fruit or root was poisonous, whether an animal was dangerous, or whether water was safe to drink), but the greater cost usually came, and still comes, from being socially 'wrong'.


Basically, for social creatures like us, survival means in large part:


  • Maintaining your reputation (moving up not down the social hierarchy)

  • Staying within your group (avoiding social exile, which usually meant literal death)

  • Signalling loyalty, virtue, and shared values (again, to stay in the group)

  • Avoiding internal conflict, so that we are more convincing to others, who are very good at detecting lies (we don't like lying, and we are in fact very good at deceiving ourselves into really believing something, because this makes us better at convincing others that we believe it; see Robert Trivers' famous work on self-deception and how it evolved)


This is how we've ended up in a situation where we are very motivated to believe what the 'tribe' believes (in our modern world, the group we think is 'cool' or likely to give us social standing), very motivated to convince ourselves of things we want to be true, and very good at convincing ourselves that we haven't convinced ourselves.



The Lawyer vs the Scientist


A useful way to understand this is that our mind can work like a lawyer or like a scientist. (This builds on ideas from Jonathan Haidt, particularly his elephant and rider analogy in The Righteous Mind.)


The scientist seeks truth, even when it's uncomfortable, actively looks for disconfirming evidence, and changes their mind when the data demands it. The lawyer starts with a conclusion, selects evidence to support it, dismisses or attacks opposing arguments, and experiences doubt as weakness or danger.


Most of us like to believe we are scientists, but we're mainly lawyers.


One of the most sobering findings in psychology is that higher intelligence doesn’t protect against bias. In some cases, it makes things worse because intelligence improves our ability to construct compelling justifications for what we want to believe. This is why highly educated people can hold wildly confident, contradictory beliefs—each sincerely convinced that reason and evidence are on their side. The feeling of “I’ve thought this through carefully” is not reliable evidence that we have.


Virtue, Identity, and In-Group Loyalty


Many of our strongest beliefs are not really about facts at all. They are about identity.

Beliefs function as social signals:

  • This is the kind of person I am

  • This is the group I belong to

  • These are the people I stand against


Changing your mind, then, isn’t just an intellectual act. It can feel like:

  • Betrayal

  • Status loss

  • Moral failure

  • Social risk


From an evolutionary standpoint, this makes sense. A flexible truth-seeker who constantly revised their beliefs may have been accurate but alone.


Scientific thinking runs against almost every instinct we have: it rewards doubt over certainty, prioritises accuracy over belonging, values discomfort over reassurance, and requires us to tolerate being wrong. No wonder it feels effortful, slow, and emotionally unpleasant. The default mode of the human mind is not “What's true?” It's “How do I justify what I already believe?”


If we recognise that our minds are biased by design, motivated by social survival, and prone to self-deception, then we can start to compensate. In fact, I have come up with a kind of critical thinking tool that I suppose I can call Clarke's Law: whenever you are thinking about an issue and weighing up opposing arguments, think about which one you want to be true and then add proportionate weight to the other side. In other words, try to factor in your own bias when weighing up the case for and against something.
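To make the heuristic concrete, here is a minimal sketch of Clarke's Law as arithmetic, written in Python. This is purely my own illustration, not something from the post: the 0-to-1 ratings and the 0.5 scaling factor are arbitrary assumptions, and the function name is hypothetical.

    # A rough sketch of Clarke's Law as arithmetic (my own illustration).
    # The 0-1 ratings and the 0.5 scaling factor are arbitrary assumptions.

    def clarkes_law_weighting(evidence_for: float,
                              evidence_against: float,
                              desire_for: float) -> tuple[float, float]:
        """Return bias-adjusted weights for the 'for' and 'against' cases.

        evidence_for / evidence_against: honest 0-1 ratings of each side.
        desire_for: 0-1 rating of how much you WANT the 'for' side to be
            true (use a negative value if you want the 'against' side).
        """
        correction = 0.5 * desire_for  # shift weight away from the favoured side
        return evidence_for - correction, evidence_against + correction

    # Example: the evidence looks 0.7 vs 0.4 in favour, but you badly want
    # it to be true, so the adjusted weights come out roughly 0.4 vs 0.7.
    print(clarkes_law_weighting(0.7, 0.4, desire_for=0.6))

The exact numbers don't matter; the point is simply that the correction runs against the direction of your desire.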




Why Would We Do This? It's Altruistic


A reasonable person might ask at this point: why would I want to act like a scientist? If acting like a lawyer helps me more, then shouldn't I just act like a lawyer?


And my answer to that question is: yes, if all you care about is yourself and getting ahead in life, you should absolutely act like a lawyer most of the time (except in the few cases where truth-seeking really benefits you). If, on the other hand, you want to be a good person, a person who helps other people and helps the world become a better place, you should try to think and act like a scientist as often as possible. You may lose some friends along the way, but in the long run the world will be a far better place.


Think about people in history who have stubbornly acted like scientists at great personal cost. Think about Galileo defending heliocentrism against the Catholic Church, being placed under house arrest and having his books banned. Think of Charles Darwin, who delayed publishing On the Origin of Species for 20 years because he knew what it implied. Think about Ignaz Semmelweis, who discovered that handwashing dramatically reduced maternal deaths from puerperal fever and was ridiculed by the medical establishment for it, or Ludwig Boltzmann, who championed the atomic theory of matter when many prominent physicists rejected it as metaphysical speculation. Boltzmann died by suicide in 1906, just a few years before atomic theory was experimentally confirmed.


The pattern in all these cases is that the truth was unpopular and being a scientist was very costly. But, and this is the good news, the other pattern throughout history is that the truth almost always wins out eventually. It just takes far longer than it should, and millions of people suffer and die in the meantime.


This is why the scientific method, a disciplined way of being wrong less often, was invented, and why it has been so powerful in bringing large portions of the world out of poverty and into lives of abundance. But the key word here is discipline. It takes discipline to think scientifically. It's not natural. It's like writing left-handed if you're right-handed.


Thinking like a scientist is both intellectually and socially difficult, but it is ultimately an act of love and service.


Be a scientist.


 
 
