Life or Death Decisions

The Improvement Science blog this week is kindly provided by Julian Simcox and Terry Weight.

What can surgeons learn from other professions about making life or death decisions?

http://www.bbc.co.uk/news/health-21862527

Dr Kevin Fong is on a mission to find out what can be done to reduce the number of mistakes being made by surgeons in the operating theatre.

He starts with an example of a mistake during an operation involving a problematic tracheotomy; despite plenty of expert advice being on hand, the patient sadly died. Crucially, a nurse who might have provided the solution that could have saved the patient’s life had been ignored.

This example is used to explore how, under similar pressures, such mistakes are avoided in other walks of life. In aviation and in fire-fighting, for example, more robust and resilient cultures and systems have evolved – but how?

The Horizon editors highlight the importance of six things, on which we offer some comments:

1. The aviation industry continually designs out hazards and risk.

Aviation was once a very hazardous pursuit. Nowadays the trip to the airport is much riskier than the flight itself, because over the decades aviators have learned how to learn from mistakes and so reduce future incidents. They have learned that blaming individuals for systemic failure gets in the way of accumulating the system-wide knowledge that makes the most difference.

Peter Jordan reminds us that the official report into the 1989 Kegworth air disaster made 31 recommendations for improved safety, mainly to do with passenger safety during crashes. Even then, the report could not resist pointing the finger at the two pilots who, when confronted with a blow-out in one of their two engines, had wrongly interpreted a variety of signals and talked themselves into switching off the wrong engine. On publication of the report they were summarily dismissed, but much later successfully claimed damages for unfair dismissal.

http://en.wikipedia.org/wiki/Kegworth_air_disaster

2. Checklists can make a difference if the team is engaged.

The programme then refers to recent research by the World Health Organisation on the use of checklists, which when implemented showed a large (35%) reduction in surgical complications across a range of countries and hospitals.

At University College Hospital London we see checklists being used by the clinical team to powerful effect. The specific example concerns the hand-over of a patient after an operation from the surgical team to the intensive care unit. Previously this process had been ill-defined, done differently by different people, and not properly overseen by anyone.

No reference is made, however, to the visual display of data that helps teams see the effect of their actions on their system over time, and there is no mention of whether the checklists were designed by outsiders or by the teams themselves.

In our experience these things make a critical difference to ongoing levels of engagement – and to outcomes – especially in the NHS where checklists have historically been used more as a way of ensuring compliance with standards and targets imposed from the top. Too often checklists are felt to be instruments of persecution and are therefore fiercely (and justifiably) resisted.
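To make the first of these concrete: by “visual display of data” we mean something as simple as a run chart, on which a team plots a measure of its own process over time and judges any change against a baseline. Here is a minimal sketch in Python; the weekly defect counts are invented purely for illustration, not real NHS figures.

```python
# A minimal run chart: a team plots its own data over time and judges any
# change against a baseline centre line. All numbers here are invented.
from statistics import median
import matplotlib.pyplot as plt

weeks = list(range(1, 25))
defects = [9, 7, 8, 10, 6, 9, 8, 7, 9, 8, 10, 7,   # 12 weeks before the checklist
           5, 4, 6, 3, 5, 4, 4, 3, 5, 4, 3, 4]     # 12 weeks after

baseline = median(defects[:12])          # centre line from the 'before' period
plt.plot(weeks, defects, marker="o")
plt.axhline(baseline, linestyle="--", label=f"baseline median = {baseline}")
plt.axvline(12.5, color="grey", label="checklist introduced")
plt.xlabel("Week")
plt.ylabel("Hand-over defects")
plt.title("Run chart (illustrative data)")
plt.legend()
plt.show()
```

Plotting the baseline median alongside the data is what lets a team see, rather than merely assert, whether the checklist has changed anything.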

We see plenty of scope in the NHS for clarifying and tightening process definitions, but checklists are only one way of prompting this. Our concern is that checklists could easily become a flavour-of-the-month thing, seen as one more edict from above and all too quickly becoming yet another layer of the tick-box bureaucracy that most people say they want to get away from.

We also see many potentially powerful ideas flowing from the top of the NHS, raining down on a system that has become moribund, wearied by one disempowering change initiative after another.

3. Focussing on the team and the process – instead of the hierarchy – enhances cooperation and reduces deferential behaviour.

Learning from Formula One pit-stop processes, UCH, we are told, has flattened its hierarchy, ensuring that at each stage of the process there is clear leadership and well-understood roles to perform. After studying their process, the team realised that most of the focus had previously been on the technically demanding work alone, rather than on the sequence of steps and the need for clear communication between those steps. We are told that flattening the hierarchy to prioritise team working has also helped: deference to seniority (e.g. nurses to doctors) is now seen as obstructing safer practice.

Achieving role clarity goes hand-in-hand with simplification of the system – which all starts with careful process definition undertaken collaboratively by the team as a whole. In the featured operation every individual appears to know their role and the importance of keeping things simple and consistent. In our experience this is all the more powerful when the team agree to standardise procedures as soon as any new way has been shown to be more effective.

4. Losing situational awareness is an inherent human frailty.

We see how fire officers are specifically trained to deal with situations that require both a narrow focus and an ability to stand back and connect to the whole, a skill which for most people does not come naturally. Under pressure we too often fail to appreciate the context or the bigger picture, losing situational awareness and constraining our span of attention.

In the aviation industry we see how pilot training is nowadays considered critically important to outcomes and to the reduction of pilot error in emergencies. Flight simulators and scenario simulation now play a vital role, and this is becoming more commonplace in senior doctor training.

It seems common sense that people being trained should experience the real system whilst being able to make mistakes safely. Learning comes from experimentation (Plan-Do-Check-Act). In potentially life-and-death situations, simulation allows the learning and the building of needed experience to be done safely off-line. Nowadays, new systems containing multiple processes and lots of people can be designed using computer simulations, but these skills are as yet in short supply in the NHS.

http://www.saasoft.com/6Mdesign/index.php
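As an illustration of the kind of off-line experimentation we mean, here is a minimal discrete-event simulation sketch in plain Python. The process, staffing levels, and timings are all invented assumptions; the point is that a proposed redesign (say, adding a second hand-over team) can be tried on the model before it touches a real patient.

```python
# A minimal discrete-event simulation of a hand-over process.
# All parameters below are invented assumptions, not real NHS figures.
import random

random.seed(1)

MEAN_ARRIVAL = 20.0    # assumed mean minutes between arriving patients
MEAN_HANDOVER = 30.0   # assumed mean minutes per hand-over

def mean_wait(teams, n_patients=10_000):
    """Average time a patient waits for a free hand-over team."""
    free_at = [0.0] * teams          # when each team next becomes free
    clock, total_wait = 0.0, 0.0
    for _ in range(n_patients):
        clock += random.expovariate(1 / MEAN_ARRIVAL)   # next arrival
        team = min(range(teams), key=free_at.__getitem__)
        start = max(clock, free_at[team])               # wait if all busy
        total_wait += start - clock
        free_at[team] = start + random.expovariate(1 / MEAN_HANDOVER)
    return total_wait / n_patients

for teams in (1, 2, 3):
    print(f"{teams} team(s): mean wait {mean_wait(teams):6.1f} minutes")
```

Running it shows at once that one team cannot keep up with this (invented) demand while two can – exactly the kind of insight that is cheap to obtain in a model and expensive to obtain on a live ward.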

5. Understand the psychology of how people respond to their mistakes.

Using demonstrations with playing cards, we see how people who have a non-reactive attitude to mistakes respond better to making them and are then less likely to make the same mistake again. Conversely, some individuals seem to be less resilient (we would say they become unstable), taking longer to correct their mistakes and subsequently making more of them. Recruitment of doctors is now starting to include the use of simulators to test for this psychological ability.

6. Innovation more easily flows from systems that are stable.

In 2009, a bird strike a few minutes after take-off stopped both engines of an airliner and forced it to crash-land. The landing, into New York’s Hudson River, was a novel manoeuvre that incredibly led to the survival of all the passengers and crew: an innovation safely executed by a pilot who kept his cool in the moment by sticking to the procedures and checklists in which he had been trained.

This capability, we are told, had been acquired over more than three decades by the pilot, Captain “Sully” Sullenberger, who sees himself as part of an industry that over time institutionalises emerging knowledge. He tells us he had faith in the robustness and resilience of this knowledge, accumulated by using the lessons of the past to build a safer future, and suggests it would be immoral not to learn from historical experience. To him it was “this robustness that made it possible to innovate when the unknown occurred”.

Standardisation often spawns innovation – something which for many people remains a counter-intuitive notion.

Sullenberger was subsequently lauded as a hero, but he himself tells us that he merely stuck to the checklist procedures and that this helped him to keep his cool whilst realising he needed to think outside the box.

The programme signs off with the message that human error is always going to be with us, and that it is how we deal with human error that really matters. In aviation there is a continual search for progress, rather than someone to blame. By accepting our psychological fallibility we give ourselves – in the moment – the best possible chance.

The programme attempts to balance the actions of the individual with collective action over time to design and build a better system, one in which all individuals can play their part well. Some viewers may have come away remembering most the importance of the “heroic” individual. In our view more emphasis could have been placed on the design of the system as a whole, such that it more easily maintains its stability without needing to rely either on the heroic acts of any one individual or on finding the one scapegoat.

If heroes need to exist, they are the individuals who understand their role and submit themselves to the needs of the team and to achieving the outcomes needed by the wider system. We like that the programme ends with the following words:

Search for progress, not someone to blame!