
Complexity - seeing more of our strategic space

Have you ever been in a nasty accident or a near-miss situation? If so, you might have had the experience of remembering the incident in slow motion. It seems so strange - why does this happen?


It's all to do with the pupil of your eye. When we are in the dark or in low light, our pupils widen to let more light in so that we can see more of what is there. So in an unexpected, scary moment, our pupils widen to take in as much information as they can, in the hope that it might help the brain work out how to keep us safe - remembering that the brain's primary function is survival. Interestingly, our pupils also widen when we are surprised, afraid, consciously processing something, or attracted to someone!


The diagram below is something I drew on paper for my designer who came back to me with the little eyelashes included! I was intending the metaphor of 'seeing' and she obviously picked up on that, but it wasn't until I came across the parallel idea of expanding pupils that I realised how perfectly this eye diagram depicts what I was trying to communicate.


When we are on auto-pilot, we tend to see as if we are looking through a paper towel tube rather than a wide-angle camera lens. We see what is in front of us, we see what we expect to see, we see what we think is important, and we see what makes sense according to our existing mental models and past experiences. In other words, we don't see all of what is there - we see our own version of reality.


Unless we are deliberately looking for it, we don't see the full complexity of the issues we are working with, we don't see wider systems, and we don't see the unexpected.


There are a few fancy names for the things that explain these phenomena, if you're interested:

  • Confirmation bias - we see what we expect to see and overlook what we don't expect (or don't want) to see.

  • Availability bias - we place more weight and importance on what is most easily retrieved from memory.

  • Familiarity bias - we place more weight and importance on what is most familiar (think voting in elections for people just because we know them, and selecting investments based on whether we have heard of them).

  • Recency bias - we see and remember what has happened most recently.

  • Selective attention or inattentional blindness - we see what we set out to see and are blind to even the most obvious, right-there-in-your-face, visual cues.

I don't know about you, but I find biases like these SCARY when it comes to strategic thinking. Why? Because they are totally human, we can't escape them and everyone is susceptible. Because they undermine any feeling of certainty about what we know that we might be tempted to harbour. And because they are totally invisible to us unless we actively go looking for them, and even then they can be hard to spot.


Here we are, making decisions for our organisations that might have huge costs and ramifications, and we can't even trust our own perception of our strategic space.


I don't mean to be all doom and gloom, but this is why the capability of managing complexity is so fundamental to strategic thinking, and it starts with something unexpected - intellectual humility.


Seeing more of what is really there is never going to happen if we can't get past the belief that what we see is what is really there. With intellectual humility, we are in a place to listen, reconsider, explore, and generally override our sense of being right. And that can be incredibly empowering, at the same time as humbling.


If you liked this, sign up for my articles straight to your inbox, check out my website, follow me on LinkedIn, or contact me on nina@ninafield.co.nz to discuss how I can help you with strategic thinking and strategic leadership development.
