I remember the exact moment I thought I had analytics figured out.
It was about three years into my career, and I had just wrapped up what I believed was a brilliant analysis that cracked a complex marketing attribution problem for a client in Vietnam. I presented my findings with the confidence of someone who had “arrived.” The regression coefficients were significant, the model fit looked perfect, and I had an answer for every question thrown my way.
Looking back now, three decades into this field, I cannot help but cringe at that younger version of myself. Not because my analysis was wrong, but because I was so convinced that I was completely right.
That was my first real encounter with the Dunning–Kruger effect, though I didn’t know it at the time.
Anyone who has spent enough years in analytics or data science has probably lived through it. It is that intoxicating early phase when the concepts finally click — when running a regression equation feels like unlocking the universe, when writing SQL queries makes you feel like a magician of databases, when building a few models tricks you into thinking you have mastered the science and art of deploying machine learning algorithms.
The danger is not incompetence. In fact, you are quite capable at this stage. The danger is that you do not yet know what you do not know — and that ignorance can masquerade as mastery. It is a subtle kind of arrogance that can quietly derail both your learning and your leadership.
Then the shift happens — slowly, and then suddenly.
More complex problems come along that your standard toolkit cannot solve. Data sets start breaking your assumptions. You realize that statistical significance does not always mean real-world importance. Correlation turns out not to be causation (again), and you begin to see that the world resists being neatly modeled.
For me, the turning point came when a senior colleague asked a deceptively simple question about my analysis:
“What would have to be true for your conclusion to be wrong?”
I froze. I had no answer — because I had never even considered that I might be wrong. That question haunted me for weeks. And over time, it reshaped the way I approached every piece of analysis, every business problem, and eventually, every leadership decision.
This journey from overconfidence to humility, I realized, is not just about technical growth.
It is about leadership.
The best leaders I have met are the ones who have walked far enough down the road of expertise to recognize its limits. They ask better questions than they give answers. They create space for their teams to explore, to fail, and to learn because they remember what it felt like to mistake confidence for competence. They understand that saying “I don’t know” is not weakness; it is the foundation of wisdom.
When a junior analyst comes to them with that same gleam of certainty I once had, they do not crush it. They nurture it — gently steering it toward the kinds of questions that lead to deeper understanding.
After three decades of working with data, models, and people, I have made peace with uncertainty.
I know some things well. I know many things partially. And I know there is an infinite amount I will never fully grasp. (Quantum science and quantum computing? Hahaha.)
That is not a limitation — it is a kind of liberation.
Every day still offers a chance to learn something that challenges what I thought I knew. The smugness of my younger self has been replaced by something far more valuable: the humility to keep growing, questioning, and learning.
And that, I have learned, is what real expertise and real leadership actually look like. And what they should be.