

When experience is not enough, what shapes project outcomes?




One of the most common assumptions in complex engineering projects is that experience guarantees success. In reality, however, many delays are not caused by a lack of technical competence; they are shaped by how humans think, communicate, and perform under pressure.

From a psychological and neuroscientific perspective, projects are not only technical systems; they are human systems. And human systems are inherently imperfect and deeply influenced by the environments in which they operate.


Research in neuroscience and organizational behavior helps explain why even highly experienced teams can struggle.

One of the most well-known cognitive biases is the planning fallacy: our tendency to underestimate time, risks, and complexity, even when past experience suggests otherwise.


The brain naturally focuses on the ideal scenario, not the realistic one. This tendency is often reinforced by overconfidence, particularly in experienced teams, where past success creates the belief that "this time will be different."

At the same time, as demonstrated by Daniel Kahneman's work, the brain under pressure shifts toward faster, automatic thinking, relying on patterns and assumptions rather than deep analysis.


In parallel, David Rock's SCARF model highlights how perceived threats in the work environment, such as uncertainty, loss of status, or lack of control, can significantly reduce cognitive performance.

In these conditions, the brain is not operating at its full capacity; it is protecting itself.


As a result, even highly capable professionals may focus only on immediate tasks, follow familiar patterns without questioning them, avoid raising concerns, or fail to fully anticipate downstream impacts, not because they lack expertise, but because the environment does not support their best thinking.

Amy Edmondson's research on psychological safety further reinforces this dynamic.


When people do not feel safe to speak up, small issues remain unspoken. Not because they are not seen, but because they are not shared.


Over time, these small, unaddressed issues accumulate, and they tend to surface when systems are tested as a whole, often during Commissioning.

This is where a critical misconception becomes visible.


Commissioning is often treated as a phase of discovery, a moment when issues are expected to appear. In reality, Commissioning should not be a "black box" or a phase of uncertainty; it should be a phase of confirmation.


A moment when the project demonstrates, with clarity and confidence, that everything works as intended; when what was designed, built, and documented comes together as a functioning system. This requires more than technical excellence.


It requires environments that enable people to think clearly, challenge assumptions, and anticipate consequences across phases. It also requires a fundamental shift in how projects are structured.


Concept Design, Construction, and Commissioning cannot operate as isolated phases; they must function as an integrated system.


When Commissioning perspectives are introduced earlier in the project lifecycle, even during Construction, teams gain visibility of future operational realities; potential issues are identified sooner, assumptions are challenged earlier, and decisions are made with the full lifecycle in mind.


This is not about adding complexity; it is about reducing uncertainty. In complex environments, the earlier a problem is seen, the easier it is to solve; the later it appears, the more expensive it becomes.


Successful projects, therefore, are not defined only by technical capability; they are defined by alignment between phases, between teams, between expectations and reality.


They are defined by how well human potential is enabled, or limited, by the environment in which teams operate. Commissioning should never be a surprise; it should be the evidence that the project was done right.




Insight by Anne Vieira

Business Human Strategy