Software engineering moves fast, yet many of its guiding principles remain suspiciously still. Adam Grant’s “Think Again” offers a lens through which to examine three habits that can make or break a team: intellectual humility, psychological safety, and the willingness to retire practices that no longer serve us.


I recently picked up Think Again: The Power of Knowing What You Don’t Know by Adam Grant, published in 2021. It is a book I recommend without reservation. Grant, an organizational psychologist, makes a deceptively simple argument: the world rewards people who can change their minds well, and punishes those who cling to outdated beliefs regardless of evidence. The book is full of research and stories, but what struck me most was how naturally its ideas map onto the daily life of a software engineering team.

What follows is my attempt to trace that mapping across three themes: embracing what we do not know, building environments where honesty is safe, and questioning the very notion of a “best” practice.

Embrace Intellectual Humility

Grant defines intellectual humility as the space between confidence and doubt — enough confidence to act, enough doubt to reconsider. In a field where last year’s breakthrough is this year’s legacy code, the balance matters more than most of us care to admit.

The danger is what he calls “sophomore knowledge”: the stage at which a developer has learned enough to feel expert, but not enough to see the boundaries of that expertise. We have all met — or been — the engineer who dismisses an unfamiliar approach because it does not match the mental model built over the last two or three projects. The antidote is not less confidence, but a different kind: the confidence to say “I might be wrong about this; let me look again.”

In practice this means welcoming code reviews not as gates to pass but as conversations to learn from, treating user feedback as data rather than noise, and recognising that the depth of software engineering always exceeds what any single person can hold. Teams that internalise this tend to avoid the worst forms of overconfidence — the ones that lead to architectures nobody dares question until they collapse under their own weight.

Cultivate Psychological Safety

Amy Edmondson, a professor at Harvard Business School, coined the term “psychological safety” to describe a shared belief that a team is safe for interpersonal risk-taking. Grant builds on her work to argue that without this safety net, rethinking never even begins: people simply keep quiet.

The connection to software engineering is direct. A developer who fears ridicule will not flag a subtle bug during a code review. A junior engineer who fears blame will not admit a deployment mistake early enough to contain it. A team that punishes dissent will converge on the loudest opinion, not the best one.

The inverse is equally true. When people feel safe, they speak up sooner, share half-formed ideas that sometimes turn out to be brilliant, and treat failures as post-mortem material rather than career threats. None of this is sentimental; it is mechanical. Early signals travel faster in safe environments, and faster signals mean cheaper fixes, better designs, and fewer surprises in production.

Building this kind of culture is not a matter of posters on the wall. It requires leaders who visibly admit their own mistakes, who respond to bad news with curiosity instead of blame, and who make it clear — through action, not policy — that raising a concern is valued more than preserving consensus.

Rethink Best Practices

This is the idea from Grant’s book that resonated with me the most. He argues that the phrase “best practice” carries an implicit claim of finality: if something is already the best, there is nothing left to improve. The label discourages inquiry at the precise moment when inquiry might be most needed — when the world around the practice has changed but the practice itself has not.

In software engineering, the examples are everywhere. Teams that adopt a framework because it was the best choice five years ago and never revisit the decision. Deployment pipelines built around constraints that no longer exist. Coding conventions inherited from a project with entirely different requirements. The practice was sound once; the mistake is assuming it will be sound forever.

Grant’s suggestion is simple: replace “best practices” with “better practices” — a phrasing that keeps the door open. A better practice is one that works well right now, given what we know today, with the explicit understanding that tomorrow’s evidence may demand something different. This is not an invitation to chaos or endless refactoring; it is a commitment to periodic reassessment, a habit of asking “does this still serve us?” before the answer becomes painfully obvious.

The teams I have seen thrive tend to share this trait. They adopt conventions deliberately, revisit them periodically, and discard them without guilt when the context shifts. They treat their own processes with the same empirical rigour they bring to their code.


Intellectual humility, psychological safety, and the willingness to question established conventions are not three separate ideas. They reinforce each other: humility makes safety possible, safety makes questioning possible, and questioning keeps everything else honest. Together, they describe a culture that learns faster than it accumulates debt — technical or otherwise.

Grant’s book gave me a useful vocabulary for something I had felt but not articulated: that the best engineering teams are not the ones with the best answers, but the ones most willing to revisit them.