What is Transformation?


It has been another interesting week.  A bitter-sweet mixture of disappointment and delight. And the central theme has been ‘transformation’.


The source of disappointment was the newsreel images of picket lines of banner-waving junior doctors standing in the cold watching ambulances deliver emergencies to hospitals now run by consultants.

So what about the thousands of elective appointments and operations that were cancelled to release the consultants? If the NHS was failing elective delivery time targets before, it is going to be failing them even more now. And who will pay for the “waiting list initiatives” needed just to catch up? Depressing to watch.

The mercurial Roy Lilley summed up the general mood very well in his newsletter on Thursday, the day after the strike.

[Image: excerpt from Roy Lilley’s newsletter on transformation]

What he is saying is that we do not have a health care system; we have a sick care system. That is the term coined by the acclaimed systems thinker, the late Russell Ackoff (see the video about halfway down).

We aspire to a transformation-to-better but we only appear to be able to achieve a transformation-to-worse. That is depressing.


My source of delight was sharing the stories of those who are stepping up and transforming themselves and their bits of the world, and how they are doing that by helping each other to learn “how to do it” – a small bite at a time.

Here is one excellent example: a diagnostic study looking at the root cause of the waiting time for school-age pupils to receive a health-protecting immunisation.


So what sort of transformation does the NHS need?

A transformation in the way it delivers care, by eliminating the fragmentation that is the primary cause of the distrust, queues, waits, frustration, chaos and ever-increasing costs?

A transformation from purposeless and reactive; to purposeful and proactive?

A transformation from the disappointment that flows from the mismatch between intent and impact; to the delight that flows from discovering that there is a way forward; that there is a well-understood science that underpins it; and a growing body of evidence that proves its effectiveness?  The Science of Improvement.


In a recent blog I shared the story of how it is possible to ‘melt queues’, or more specifically, how it is possible to teach anyone who wants to learn how to melt queues.

It is possible to do this for an outpatient clinic in one day.

So imagine what could happen if just 1% of consultants decided to improve their outpatient clinics using this quick-and-easy-to-learn-and-apply method?  Those courageous and innovative consultants who are not prepared to drown in the Victim Vortex of despair and cynicism.  And what could happen if they shared their improvement stories with their less optimistic colleagues?  And what could happen if just a few of them followed the lead of the innovators?

Would that be a small transformation?  Or the start of a much bigger one? Or both?

Undiscussables

[Image: chimp covering ears, eyes and mouth]

Last week I shared a link to Dr Don Berwick’s thought-provoking presentation at the Healthcare Safety Congress in Sweden.

Near the end of the talk Don recommended six books, and I was reassured that I had already read three of them. Naturally, I was curious to read the other three.

One of the unfamiliar books was “Overcoming Organizational Defenses” by the late Chris Argyris, a professor at Harvard.  I confess that I have tried to read some of his books before, but found them rather difficult to understand.  So I was intrigued that Don was recommending it as an ‘easy read’.  Maybe I am more of a dimwit than I previously believed!  So fear of failure took over my inner chimp and I prevaricated. I flipped into denial. Who would willingly want to discover the true depth of their dimwittedness!


Later in the week, I was forwarded a copy of a recently published paper that was on a topic closely related to a key thread in Dr Don’s presentation:

understanding variation.

The paper was by researchers who had looked at the Board reports of 30 randomly selected NHS Trusts to examine how information on safety and quality was being shared and used.  They were looking for evidence that the Trust Boards understood the importance of variation and the need to separate ‘signal’ from ‘noise’ before making decisions on actions to improve safety and quality performance.  This was a point Don had stressed too, so there was a link.

The randomly selected Trust Board reports contained 1488 charts, of which only 88 demonstrated the contribution of chance effects (i.e. noise). Of these, 72 showed the Shewhart-style control charts that Don demonstrated. And of these, only 8 stated how the control limits were constructed (which is an essential requirement for the chart to be meaningful and useful).

That is a validity yield of 8 out of 1488, or 0.54%, which is for all practical purposes zero. Oh dear!
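
For readers who want to see what that essential step looks like, here is a minimal sketch, in Python, of how the limits for one common Shewhart-style chart (the XmR or individuals chart) are constructed from the data themselves.  The monthly counts below are made-up numbers for illustration only; they are not from the paper.

```python
# A minimal sketch of constructing XmR (individuals) chart limits.
# The data are illustrative, not taken from the Board-report study.

def xmr_limits(values):
    """Return (lower limit, centre line, upper limit) for an XmR chart."""
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mean_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, where d2 = 1.128 for n = 2)
    spread = 2.66 * mean_mr
    return centre - spread, centre, centre + spread

monthly_incidents = [12, 15, 9, 14, 11, 13, 17, 10, 12, 16, 14, 11]  # illustrative counts
lcl, cl, ucl = xmr_limits(monthly_incidents)
print(f"LCL = {lcl:.1f}, CL = {cl:.1f}, UCL = {ucl:.1f}")
# Points outside these limits are candidate signals; variation inside them is noise.
```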


This chance combination of apparently independent events got me thinking.

Q1: What is the reason that NHS Trust Boards do not use these signal-and-noise separation techniques when it has been demonstrated, for at least 12 years to my knowledge, that they are very effective for facilitating improvement in healthcare? (e.g. Improving Healthcare with Control Charts by Raymond G. Carey was published in 2003).

Q2: Is there some form of “organizational defense” system in place that prevents NHS Trust Boards from learning useful ‘new’ knowledge?


So I surfed the Web to learn more about Chris Argyris and to explore in greater depth his concept of Single Loop and Double Loop learning.  I was feeling like a dimwit again because to me it is not a very descriptive title!  I suspect it is not to many others either.

I sensed that I needed to translate the concept into the language of healthcare and this is what emerged.

Single Loop learning is like treating the symptoms and ignoring the disease.

Double Loop learning is diagnosing the underlying disease and treating that.


So what are the symptoms?
The pain of NHS Trust  failure on all dimensions – safety, delivery, quality and productivity (i.e. affordability for a not-for-profit enterprise).

And what are the signs?
The tell-tale sign is more subtle. It’s what is not present that is important. A serious omission. The missing bits are valid time-series charts in the Trust Board reports that show clearly what is signal and what is noise. This diagnosis is critical because the strategies for responding to signal and to noise are quite different – as Julian Simcox eloquently describes in his latest essay.  If we get this wrong and we act on our unwise decision, then we stand a very high chance of making the problem worse, and of demoralizing ourselves and our whole workforce in the process! Does that sound familiar?

And what is the disease?
Undiscussables.  Emotive subjects that are too taboo to table in the Board Room.  And the issue of what is discussable is one of the undiscussables so we have a self-sustaining system.  Anyone who attempts to discuss an undiscussable is breaking an unspoken social code.  Another undiscussable is behaviour, and our social code is that we must not upset anyone so we cannot discuss ‘difficult’ issues.  But by avoiding the issue (the undiscussable disease) we fail to address the root cause and end up upsetting everyone.  We achieve exactly what we are striving to avoid, which is the technical definition of incompetence.  And Chris Argyris labelled this as ‘skilled incompetence’.


Does an apparent lack of awareness of what is already possible fully explain why NHS Trust Boards do not use the tried-and-tested tool called a system behaviour chart to help them diagnose, design and deliver effective improvements in safety, flow, quality and productivity?

Or are there other forces at play as well?

Some deeper undiscussables perhaps?

System of Profound Knowledge


[Image: Dr Don Berwick presenting in 2016]

This week I had the great pleasure of watching Dr Don Berwick sharing the story of his own ‘near-religious experience’ and his conversion to a belief that a Science of Improvement exists.  In 1986, Don attended one of W. Edwards Deming’s famous 4-day workshops.  It was an emotional roller coaster ride for Don! See here for a link to the whole video … it is worth watching all of it … the best bit is at the end.


Don outlines Deming’s System of Profound Knowledge (SoPK) and explores each part in turn. Here is a summary of SoPK from the Deming website.

[Image: summary of the System of Profound Knowledge, from the Deming website]

W. Edwards Deming was a physicist and statistician by training, and his deep understanding of variation and his appreciation for a system flow from that.  He was not trained as a biologist, psychologist or educationalist, and those parts of the SoPK appear to have emerged later.

Here are the summaries of these parts – psychology first …

[Image: summary of the psychology part of the SoPK]

Neurobiologists and psychologists now know that we are the product of our experiences and our learning. What we think consciously is just the emergent tip of a much bigger cognitive iceberg. Most of what is happening is operating out of awareness. It is unconscious.  Our outward behaviour is just a visible manifestation of deeply ingrained values and beliefs that we have learned – and reinforced over and over again.  Our conscious thoughts are emergent effects.


So how do we learn?  How do we accumulate these values and beliefs?

This is the summary of Deming’s Theory of Knowledge …

[Image: summary of Deming’s Theory of Knowledge and the PDSA cycle]

But to a biologist, neuroanatomist, neurophysiologist, doctor, system designer and improvement coach … this does not feel correct.

At the most fundamental biological level we do not learn by starting with a theory; we start with a sensation.  The simplest element of the animal learning system – the nervous system – is called a reflex arc.

[Image: sensor, processor and effector]

First, we have some form of sensor to gather data from the outside world: eyes, ears, smell, taste, touch, temperature, pain and so on.  Let us consider pain.

That signal is transmitted via a sensory nerve to the processor, the grey matter in this diagram, where it is filtered, modified, combined with other data, filtered again and a binary output generated. Act or Not.

If the decision is ‘Act’ then this signal is transmitted by a motor nerve to an effector, in this case a muscle, which results in an action.  The muscle twitches or contracts and that modifies the outside world – we pull away from the source of pain.  It is a harm avoidance design. Damage-limitation. Self-preservation.
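
For the programmers, this sensor-processor-effector template can be sketched in a few lines of code.  It is a toy illustration only; the threshold value and the names are invented for the example.

```python
# A toy sketch of the sensor-processor-effector reflex arc described above.
# The threshold and the names are illustrative assumptions.

def processor(signal: float, threshold: float = 0.7) -> bool:
    """Filter the sensory data and emit a binary decision: Act or Not."""
    return signal > threshold

def effector(act: bool) -> str:
    """The motor response: contract the muscle and pull away, or do nothing."""
    return "withdraw from the source of pain" if act else "no action"

pain_signal = 0.9                        # sensor reading on an arbitrary 0-to-1 scale
print(effector(processor(pain_signal)))  # -> withdraw from the source of pain
```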

Another example of this sensor-processor-effector design template is a knee-jerk reflex, so-named because if we tap the tendon just below the knee we can elicit a reflex contraction of the thigh muscle.  It is actually part of a very complicated, dynamic, musculoskeletal stability cybernetic control system that allows us to stand, walk and run … with almost no conscious effort … and no conscious awareness of how we are doing it.

But we are not born able to walk. As youngsters we do not start with a theory of how to walk from which we formulate a plan. We see others do it and we attempt to emulate them. And we fail repeatedly. Waaaaaaah! But we learn.


Human learning starts with study. We then process the sensory data using our internal mental model – our rhetoric; we then decide on an action based on our ‘current theory’; and then we act – on the external world; and then we observe the effect.  And if we sense a difference between our expectation and our experience then that triggers an ‘adjustment’ of our internal model – so next time we may do better because our rhetoric and the reality are more in sync.

The biological sequence is Study-Adjust-Plan-Do-Study-Adjust-Plan-Do and so on, until we have achieved our goal; or until we give up trying to learn.
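
To make that loop concrete, here is a minimal sketch of Study-Adjust-Plan-Do as a simple feedback loop in code.  The goal, the response of the ‘world’ and the adjustment rate are all invented for the illustration; it is a sketch of the idea, not a model of real learning.

```python
# A minimal sketch of the Study-Adjust-Plan-Do loop on a toy target-seeking task.
# All numbers and names are illustrative assumptions.

def act_on_world(plan: float) -> float:
    # The world responds imperfectly to our action (invented response function).
    return 0.8 * plan

def learn(goal: float, attempts: int = 20) -> float:
    belief = 0.0                            # internal model: our current 'rhetoric'
    plan = belief
    for _ in range(attempts):
        experience = act_on_world(plan)     # Do: act on the external world
        mismatch = goal - experience        # Study: compare expectation with experience
        if abs(mismatch) < 0.01:
            break                           # rhetoric and reality are in sync: goal achieved
        belief += 0.5 * mismatch            # Adjust: update the internal model
        plan = belief                       # Plan: the next attempt uses the adjusted model
    return belief

print(learn(goal=1.0))  # converges after repeated Study-Adjust-Plan-Do iterations
```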


So where does psychology come in?

Well, sometimes there is a bigger mismatch between our rhetoric and our reality. The world does not behave as we expect and predict. And if the mismatch is too great then we are left with feelings of confusion, disappointment, frustration and fear.  (PS. That is our unconscious mind telling us that there is a big rhetoric-reality mismatch).

We can see the projection of this inner conflict on the face of a child trying to learn to walk.  They screw up their faces in conscious effort, and they fall over, and they hurt themselves and they cry.  But they do not want us to do it for them … they want to learn to do it for themselves. Clumsily at first but better with practice. They get up and try again … and again … learning on each iteration.

Study-Adjust-Plan-Do over and over again.


There is another way to avoid the continual disappointment, frustration and anxiety of learning.  We can distort our sensation of external reality to better fit with our internal rhetoric.  When we do that the inner conflict goes away.

We learn how to tamper with our sensory filters until what we perceive is what we believe. Inner calm is restored (while outer chaos remains or increases). We learn the psychological defense tactics of denial and blame.  And we practice them until they are second-nature. Unconscious habitual reflexes. We build a reality-distortion-system (RDS) and it has a name – the Ladder of Inference.


And then one day, just by chance, somebody or something bypasses our RDS … and that is the experience that Don Berwick describes.

Don went to a 4-day workshop to hear the wisdom of W. Edwards Deming first hand … and he was forced by the reality he saw to adjust his inner model of how the world works. His rhetoric.  It was a stormy transition!

The last part of his story is the most revealing.  It exposes that his unconscious mind got there first … and it was his conscious mind that needed to catch up.

Study-(Adjust)-Plan-Do … over-and-over again.


In Don’s presentation he suggests that Frederick W. Taylor is the architect of the failure of modern management. This is a commonly held belief, and everyone is equally entitled to an opinion; that is a definition of mutual respect.

But before forming an individual opinion on such a fundamental belief we should study the raw evidence: the words written by the person who wrote them, not just the words of those who filtered the reality through their own perceptual lenses.  Which we all do.

Health Care System Engineers

[Image: engineers working on a turbine engine]

The NHS is falling.

All the performance indicators on the NHSE cockpit dashboard show that it is on a downward trajectory.

The NHS engines are no longer effective enough or efficient enough to keep the NHS airship safely aloft.

And many sense the impending crash.

Scuffles are breaking out in the cockpit as scared pilots and anxious politicians wrestle with each other for the controls. The passengers and patients appear to be blissfully ignorant of the cockpit conflict.

But the cockpit chaos only serves to accelerate their collective rate of descent towards the hard reality of the Mountain of Doom.


So what is needed to avoid the crash?

Well, some calm and credible leadership in the cockpit would help; so would some coordinated crash avoidance; and so would much more effective and efficient engines to halt the descent and lift us back to a safe altitude. In fact, the new NHS engines are essential.

But who is able to design, build, test and install these new health care system engines?


We need competent and experienced health care system engineers.


And clearly we do not have enough, because if we had, we would not be in a CFIT scenario (‘see-fit’: controlled flight into terrain).

So why do we not have enough health care system engineers?

Surely there are appropriate candidates and surely there are enough accredited courses with proven track records?

I looked.  There are no accredited courses in the UK and there are no proven track records. But there appears to be no shortage of suitable candidates from all corners of the NHS.

How can this be?

The answer seems to be that the complex flow system engineering science needed to do this is actually quite new … it is called Complex Adaptive Systems Engineering (CASE) … and it has not diffused into healthcare.

More worryingly, even basic flow engineering science has not diffused into healthcare either, and that seems to be because health care is so insular.

So what can we do?

The answer would seem to be clear.  First, we need to find some people who, by chance, are dual-trained in health care and systems engineering.  And there are a few of them, but not many.


People like Dr Kate Silvester, who trained as an ophthalmic surgeon and then retrained as a manufacturing systems engineer with Lucas and Airbus. Kate brought these novel flow engineering skills back into the NHS in the days of the Modernisation Agency and since then has proved that they work in practice – as described in the Health Foundation Flow-Cost-Quality Programme Report.


Second, we need to ask this small band of seasoned practitioners to design and to deliver a pragmatic, hands-on, learning-by-doing Health Care Systems Engineer Development Programme.


The good news is that, not surprisingly, they have already diagnosed this skill gap and have been quietly designing, building and testing.

And they have come up with a name: The Phoenix Programme.

And because TPP is a highly disruptive innovation they know that it is too early to give it a price-tag, so they have generously offered a limited number of free tickets to the first part of TPP to clinicians and clinical scientists.

The first step is called the Foundations of Improvement Science in Healthcare online course, better known to those who have completed it as “FISH”.

This vanguard of innovators have shared their feedback.

And, for those who are frustrated and curious enough to explore outside their comfort zones, there are still some #freeFISH tickets available.


So, if you are attracted by the opportunity of dual-training as a clinician and as a Health Care Systems Engineer (HCSE) then we invite you to step this way.


And not surprisingly, this is not a new idea … see here and here.

Culture – cause or effect?

The Harvard Business Review is worth reading because many of its articles challenge deeply held assumptions, and then back up the challenge with the pragmatic experience of those who have succeeded in overcoming the limiting beliefs.

So the headline on the April 2016 copy that awaited me on my return from an Easter break caught my eye: YOU CAN’T FIX CULTURE.



[Image: cover of the Harvard Business Review, April 2016]

The successful leaders of major corporate transformations are agreed … the cultural change follows the technical change … and then the emergent culture sustains the improvement.

The examples presented include the Ford Motor Company, Delta Airlines and Novartis – so these are not corporate small fry!

The evidence suggests that the belief that “we cannot improve until the culture changes” is the mantra of failure of both leadership and management.


A health care system is characterised by a culture of risk avoidance. And for good reason. It is all too easy to harm while trying to heal!  Primum non nocere is a core tenet – first do no harm.

But change and improvement imply taking risks – and those leaders of successful transformation know that the bigger risk by far is to become paralysed by fear and to do nothing.  Continual learning from many small successes and many small failures is preferable to crisis learning after a catastrophic failure!

The UK healthcare system is in a state of chronic chaos.  The evidence is there for anyone willing to look.  And waiting for the NHS culture to change, or pushing for culture change first, appears to be a guaranteed recipe for further failure.

The HBR article suggests that it is better to stay focussed; to work within our circles of control and influence; to learn from others where knowledge is known, and where it is not – to use small, controlled experiments to explore new ground.


And I know this works because I have done it and I have seen it work.  Just by focussing on what is important to every member on the team; focussing on fixing what we could fix; not expecting or waiting for outside help; gathering and sharing the feedback from patients on a continuous basis; and maintaining patient and team safety while learning and experimenting … we have created a micro-culture of high safety, high efficiency, high trust and high productivity.  And we have shared the evidence via JOIS.

The micro-culture required to maintain the safety, flow, quality and productivity improvements emerged and evolved along with the improvements.

It was part of the effect, not the cause.


So the concept of ‘fix the system design flaws and the continual improvement culture will emerge’ seems to work at macro-system and at micro-system levels.

We just need to learn how to diagnose and treat healthcare system design flaws. And that is known knowledge.

So what is the next excuse?  Too busy?