Barriers, and the Strange Comfort of Standing Still

Updated: Jan 22


There is something deeply reassuring about a barrier.


It is visible. Concrete. Performative. You can point to it in a meeting and say, look — we acted. It carries the emotional weight of responsibility without the inconvenience of consequence. Barriers are the neatest way a system can convince itself it is doing the right thing, even as nothing moves.


Which is why we reach for them so instinctively.


Behavioural science gives us a fairly brutal explanation for this. Dan Ariely spent years showing how often we confuse feeling rational with being rational — particularly when our self-image is at stake. We want to see ourselves as careful, prudent, competent people. We want to avoid blame more than we want to create value. And so we gravitate towards decisions that are defensible rather than decisions that are effective.


Barriers are perfect for this. They reduce exposure. They narrow responsibility. They create a clean moral narrative: we prevented something bad. In a loss-averse world, prevention feels safer than enablement, even when enablement is the entire point of the system.


Rory Sutherland makes a complementary point from a different angle: organisations are obsessed with the things that are legible — the things that can be audited, documented, and justified — while systematically neglecting the things that actually change human behaviour. Barriers score very highly on legibility. They score very poorly on behavioural intelligence.


And yet, because they look serious, we mistake them for thought.


The problem isn’t that barriers are always wrong. It’s that they are emotionally efficient. Installing one feels like progress. Removing one feels like risk. That emotional asymmetry does most of the work. Once you see that, you start noticing barriers everywhere — not as considered interventions, but as reflexes. Default responses to uncertainty. Physical or procedural manifestations of institutional anxiety.


What gets lost is the far harder question: what behaviour are we actually trying to shape here?


Most barriers don’t answer that. They simply shout “don’t” at everyone equally. They punish the careful along with the careless. They flatten context. They assume misuse rather than encourage good use. Behaviourally speaking, that’s a very blunt instrument.


And blunt instruments have side effects.


One of those side effects is disengagement. People are exquisitely sensitive to what a system implies about them. A barrier doesn’t just block an action; it communicates mistrust. It says, we don’t believe you’ll behave well unless we stop you. Ariely’s work shows that perceived fairness and intent matter enormously. When people feel distrusted by default, they don’t respond by becoming model citizens. They withdraw. They game the system. Or they simply stop caring.


This is where barriers quietly become expensive.


Not expensive in a budget sense — expensive in momentum. In goodwill. In the willingness of people to keep showing up and trying. Because barriers rarely arrive alone. They accumulate. Each one is reasonable. Each one is someone else’s responsibility. But together they create sludge: the cumulative friction that turns straightforward progress into an endurance test.


Here’s the critical asymmetry: the people who create barriers almost never experience them end-to-end. They encounter them as isolated decisions inside a silo. The person trying to get something done experiences them as a single, unbroken wall. The system thinks it is functioning. The participant feels stalled, exhausted, and increasingly cynical.


That’s how places stop moving without ever officially deciding to stop.


Momentum, in this context, is not a delivery metric. It’s a psychological signal. In systems where things visibly happen — even imperfectly — people are surprisingly forgiving. They assume good intent. They offer solutions. They tolerate friction because it feels temporary. In systems where nothing seems to move, every delay feels loaded. Every barrier starts to look political, even when it isn’t. Trust erodes long before any formal failure occurs.


This is why barrier culture is so corrosive in communities. It doesn’t just slow projects down; it trains everyone to see each other as obstacles. Institutions begin to view people as risks to be managed. People begin to view institutions as blockers to be worked around. At that point, behaviour degrades on all sides, and the barrier — originally introduced to prevent harm — ends up creating a far deeper one.


The tragedy is that many of these barriers are not deep truths. They’re not immovable laws or genuine constraints. They’re inherited defaults. Half-solutions introduced under pressure. “Best practice” that’s never been interrogated. They persist because removing a barrier feels like adding risk, while adding one feels like managing it. Predictably irrational, again.


So the question isn’t simply how to remove barriers. It’s how to stop needing them in the first place.


The answer almost always lies earlier than we think. Earlier conversations. Earlier alignment. Earlier involvement. Barriers tend to appear where dialogue arrived too late. When people are brought in at the end, positions are fixed, incentives are misaligned, and constraint feels like the only remaining tool. At that point, the barrier isn’t a choice; it’s a symptom of bad sequencing.


Behavioural insight points to a different move: bring people into the room before certainty hardens. When people understand constraints, they adapt. When they feel ownership, they self-regulate. When they are treated as partners rather than problems, behaviour improves without enforcement. This isn’t idealism; it’s observable human behaviour.


Sutherland’s reframing matters here. If a system defines success as preventing bad things, barriers will always win. Prevention is easier to defend than enablement. But if success is defined as making good things easier to do, the toolkit changes. You start designing pathways rather than fences. Defaults rather than prohibitions. Signals, incentives, and social norms instead of blunt control.


Pace becomes part of the strategy, not an afterthought. Speed communicates intent. It tells people their effort matters and their energy won’t be wasted. Slow systems, however well-meaning, communicate the opposite. They teach people that nothing happens unless it is forced — and eventually, people stop pushing.


None of this argues for recklessness. Some constraints are necessary. Some risks are real. But a mature system understands that barriers are not free. They carry a tax — on trust, on momentum, on participation — and that tax compounds over time.


The strategic shift is simple, but uncomfortable. Stop treating barriers as evidence of responsibility, and start treating them as a cost that must be justified. Ask not only what a barrier prevents, but what it quietly destroys. Make it easier to route around problems than to block them. Design for good-faith behaviour instead of assuming misuse. And above all, optimise for movement rather than alibis.


Because the real danger isn’t that something goes wrong.


It’s that you build a system so good at protecting itself that it forgets how to move at all — and calls that maturity.


Further reading


If this line of thinking resonates, these are the writers who have most shaped how I think about behaviour, systems, and the gap between what we intend to design and how people actually respond.


Dan Ariely


  • Predictably Irrational


A foundational text on why humans consistently make decisions that contradict classical rational models — and how systems repeatedly misinterpret that behaviour. Particularly useful for understanding why “sensible” interventions often backfire.


  • The Upside of Irrationality


A follow-on that explores motivation, meaning, and effort — essential reading for anyone designing processes that rely on goodwill, participation, or voluntary engagement.


Daniel Kahneman


  • Thinking, Fast and Slow


A landmark work on the two systems that govern human thought: fast, intuitive, emotional judgment versus slow, deliberate reasoning. Essential for understanding why systems that assume careful, rational decision-making so often fail in the real world.


Rory Sutherland


  • Alchemy: The Dark Art and Curious Science of Creating Magic in Brands, Business, and Life


A sharp, practical counterweight to over-rational, metric-obsessed thinking. Brilliant on reframing problems, understanding perceived value, and recognising when psychological solutions outperform structural ones.


Stuart Sutherland


  • Irrationality: The Enemy Within


A rigorous and often uncomfortable examination of the cognitive errors, self-deceptions, and flawed reasoning that underpin everyday decision-making. Less optimistic than Ariely, and arguably more forensic, it is a crucial reminder that irrationality is not an exception but a baseline.


Richard Thaler & Cass Sunstein


  • Nudge


The book that popularised choice architecture. Less about barriers, more about designing environments where better behaviour becomes the default rather than the exception.


Gerd Gigerenzer


  • Risk Savvy


A useful corrective on how institutions misunderstand risk, overcompensate with control, and unintentionally reduce people’s ability to make good decisions.




RW
