A strange phenomenon happens on our journey towards perceived expertise: we get stuck.
To figure out coaching, or really anything in life, we take the complexity that is life and break it down into something practical and usable. Or as philosopher Daniel Dennett states in his book Intuition Pumps, “Oversimplifications…cut through the hideous complexity with a working model that is almost right, postponing the messy details until later.”
As coaches, these models are abundantly clear at every step. Whether it’s a physiology-based model, where we build our training paradigm on VO2max, lactate threshold, and running economy, or a biomechanical model, where we look at force production and efficiency, we all create a model for how the body functions. Breaking it down further, we create a model for how to approach training, whether it’s based on Lydiard’s base building or Frank Horwill’s 5-pace theory. It’s all a model to help us apply the complex task of manipulating someone’s physiology and psychology to run fast.
As we grow as experts, the model we have in our head of how the world we are attempting to figure out works gets so ingrained that it becomes part of our belief system. You become a “Lydiard” guy, or a “Daniels disciple”, or choose your own belief system. The problem is that once we commit, we almost become married to it.
Enter the world of cognitive dissonance. In psychology there’s a nice term for what occurs when we have a mismatch between our model and the evidence that we see. Essentially, we have a horrible tendency to bend any experience to fit our model, instead of adjusting the model to fit the evidence.
In his new book Black Box Thinking, Matthew Syed outlines the extremes people go to in order to avoid adjusting their model. From doctors to lawyers to academics, we’re all guilty of cognitive dissonance:
“When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.”
We’re incredible self-justification machines, manipulating our experiences to ensure that they fit our worldview. One of the reasons we go to such great lengths to manipulate our perceptions to fit the story is actually our ego.
As we develop a model for how something works in our chosen field, we tend to tie ourselves to it. As a “Lydiard guy” you become invested in that model. So whenever someone comes along and says “Lydiard is dead wrong,” it’s not simply an attack on Lydiard or his views, it’s an attack on you. After all, you tied your chain to the Lydiard wagon, so what the challenger is really saying is that you’re wrong. The more we’re invested, the more we take it personally. If we’ve spent the last decade of our life learning and perfecting this system, and now evidence comes out that it’s flawed, it can be quite harrowing to admit that 10 years of our life may have been spent in the wrong. We can’t handle that kind of internal struggle. So to come to peace with it, we simply justify. We reframe the evidence to fit our narrative. It’s all to protect our ego.
So in a scary twist, what happens when we have our ego challenged is that we actually dig in deeper. We double down, insisting that our view of the world is true. Think about any of the major arguments in the world today, whether it’s climate change, politics, religion, or health care, and you can see the impact of cognitive dissonance. When individuals are challenged, they double down on their beliefs. They go further into the corner, protecting their ego, and polarizing the argument even more.
It’s why no one ever budges on these complex topics.
Syed makes the argument that the more well known or high profile an individual is, the more likely they are to dig in. And that makes intuitive sense, because they have more to lose, both externally in terms of public humiliation, and internally in terms of having their self-image as some wise expert completely challenged:
“It is those who are the most publicly associated with their predictions, whose livelihoods and egos are bound up with their expertise, who are most likely to reframe their mistakes, and who are thus the least likely to learn from them.”
In evaluating this idea, Syed looked at the field of economics.
In economics, like coaching, young economists are taught a variety of viewpoints and theories of how their world works, but they eventually settle into a few opposing camps. Just as the Lydiard, Daniels, and Coe views of distance training predominate, in economics there are schools of thought, from Keynesian to Monetarist.
What’s interesting, and parallel to coaching, is that these decisions to attach yourself to one school of thought are often made relatively early. In college, most economists hitch their cart to one wagon and set forth on their journey. Stop and think about this for a moment. At an early developmental age, when they presumably have less information, they decide their school of thought. As Syed points out:
“The decision to join one group or another was often based on the flimsiest of pretexts, but it had remarkably long term consequences. Very few economists alter their ideological stance. They stick to it for life…Fewer than 10 percent change “schools” during their careers, or ‘significantly adapt’ their theoretical assumptions…
It hints at the suspicion that the intellectual energy of some of the world’s most formidable thinkers is directed, not at creating new, richer, more explanatory theories, but at coming up with ever-more-tortuous rationalizations as to why they were right all along.”
Let that sink in for a moment. Fewer than 10 percent change their schools of thought even as they become vastly more knowledgeable. As Syed rightly points out, we become stuck in our model and then spend the rest of our lives justifying it, or hopefully refining it.
It’s the exact same way in coaching.
Through dumb luck, and mostly based on what worked for us as young athletes in our own careers, we pick a side and a model. Then we stick with it…forever.
Ability to change:
This post isn’t about why we’re all doomed and constantly fooling ourselves into believing nonsense (though we probably are); it’s about what to do about it.
Recognizing our internal bias is only half the battle; the other half, to me, is staying open to being challenged. Instead of setting yourself up to react defensively, dig in your heels, and defend your views to the death, make sure you invite challenge.
The best way to accomplish this, in my experience, is pretty simple.
First, try to leave your ego at home. It’s always there, but try not to tie your sense of self-worth to what you do as a job or to your beliefs. In my younger years, and even now, I can be quite blunt about ideas I feel strongly about. When I was speaking at a coaching conference, I essentially said “doing X is stupid.” It might not have been the most poetic way to make my point, but afterwards I had an angry coach dig into me, as if I’d personally attacked him. I had to step back and essentially remind him that I wasn’t attacking him, who he was, or his coaching; I was saying that one tiny component of his program might be worth looking at.
Second, explore counter viewpoints. Read books on training methods that are completely different from your own. Instead of spending your time critiquing them and justifying why they don’t work to make yourself feel good, look at why they do work. Examine why someone might have performed well on them.
Third, surround yourself with people who will keep your ego in check, remind you that you don’t know anything (see Jon’s and my recent podcast discussion), and with whom you can have challenging dialogue.
The great coaches are able to constantly challenge their model, creating a natural-selection-like check to ensure that the model is continually adjusted and updated to match what actually occurs. They take evidence, failure, and mistakes as a stimulus to improve their model, or perhaps to throw it away and create a totally new one. They don’t get stuck in their belief system or tie their ego so tightly to their work that it prevents change.
It’s not about declaring yourself a Lydiard, Daniels, Canova, or whoever disciple. When we make such declarations, we tend to turn off learning. We become wed to the idea, spending the rest of our time justifying and defending it, instead of evaluating and refining.
As I said in my book, “You can love an idea, but don’t be married to it.”