Errors and mistakes are as old as humanity and occur regularly, at work, at home, and almost everywhere in between. Over 2,000 years ago, the Roman orator Cicero wrote that anyone is liable to make a mistake, but only a fool persists in error. Sadly, in healthcare, and across medicine and dentistry, we continue to make the same mistakes time and again.
Learning and disseminating lessons from mistakes is important if we are to reduce or prevent harm to patients. However, as humans, we can never eliminate error. In this regard, the term ‘never event’ – defined by the UK National Health Service as a serious incident that is wholly preventable because guidance or strong systemic protective barriers are available – is a complete misnomer.
Within the healthcare environment, approximately 1 in 20 hospital admissions involves some form of error. This might be something relatively minor, such as omitting some drugs from a discharge summary, or forgetting to complete one while distracted. But of these mistakes, around 1 in 20 (so approximately 1 in 400 admissions) can result in a serious medical or surgical error.1 Many mistakes are preventable, however, by understanding various human factors (HF) in clinical practice.
Before discussing human factors, we need to briefly consider the different types of error seen in clinical practice. Most errors are multi-factorial, and their origin might begin long before an actual mistake happens. For example, heavily overbooked clinics or operating lists could lead to something being missed, sowing the seeds of an error that surfaces later. These are called latent (or organisational) failures. At an individual or clinical team level, the two main types of error are inadvertent errors and deliberate violations.
Inadvertent errors happen for a range of reasons, including slips (eg using the wrong diathermy foot pedal), lapses (eg leaving a swab inside a patient due to lack of concentration, distraction, or multi-tasking), or mistakes that can be rule- or knowledge-based. For example, a rule-based mistake could occur by prescribing a standard dose of antibiotic but misjudging the patient’s weight. A knowledge-based mistake might be a wrong diagnosis due to lack of experience, and then performing the wrong procedure as a result.
Fortunately, only a very small minority of professionals go to work with the intent of causing patient harm – the UK general medical practitioner, Dr Harold Shipman, and UK breast surgeon, Mr Ian Paterson are well known names in this regard. But what might seem to be even a ‘minor’ violation, such as not listening to or engaging with the WHO checklist before an operation, can have serious consequences. Other violation examples include continuing to operate knowing that there is a hole in a surgical glove, or not wearing PPE visors because they interfere with vision. There are many other examples too, depending on specialty.
HF is a complex science, and it is regularly used as an umbrella term for many different interpretations and definitions. While some HF and ergonomics experts might object to using the term ‘human factors’ for various behavioural aspects – including team dynamics and hierarchical gradients between different professionals – for practising clinicians the clue is very much in the title. Human factors are about how we interact with others, and how our analysis, decision making, and behaviour can potentially adversely affect patient care, safety, and professional relationships. It is of course essential that processes and system designs – including appropriate checklists, protocols, failsafes, and other measures – are in place to help reduce error, particularly when working with complex equipment and technology.
Therefore, HF also includes systems design (having processes or systems in place to reduce error, including, for example, the WHO and other checklists), ergonomics (how we interact with equipment and technology), and human performance. HF experts may focus on one or more areas, such as designing complex systems to reduce error or having experience in the psychology of human performance. Most clinicians in healthcare will not be formally trained in HF design, but can apply the many factors that affect individual and team performance to improve patient safety and team working.
The UK’s medical regulator, the General Medical Council (GMC), recognises the importance of HF education and training. Human factors are now included as part of the Generic Professional Capabilities (GPCs) needed for safe and effective medical practice. The GMC also acknowledges HF as potentially contributory causes in fitness to practise referrals to the regulator.2 The UK General Dental Council (GDC) encourages dental teams to incorporate HF understanding into their work, looking at how these can contribute to error, and putting safety measures in place to help avoid mistakes.
Some of the most important areas that affect both individual and team performance relevant to clinical practice include effective and unambiguous communication – especially during safety critical times – maintaining situational awareness (what is going on all around us), managing workload, recognising the potentially adverse effects of distraction, and looking at how performance slowly deteriorates over time.
Many mistakes occur because of poor communication between team members. Indeed, communication issues are likely to be the biggest contributor to NHS never events, including wrong site surgery and retained swabs.3 As the ‘sender’ of information, we might assume that it has been heard – or read – and understood by the ‘receiver’, but this might not be the case.
Potential ambiguity can occur in both written information, such as prescribing dental extractions in terms that are not absolutely clear and beyond doubt to the operating practitioner, and verbal instructions that are misinterpreted or even not heard due to background noise. The use of pronouns, eg it, that, they, them, is not recommended during safety critical times. It is much better to use proper nouns, especially in telephone calls, for example, “please give the vancomycin IV”, rather than “please give it IV”. Read-back, or some form of verbal confirmation by the receiver, is a useful way to ensure instructions have been heard and understood. If there is any doubt, especially with written instructions, the safest option is not to proceed, apologise to the patient, and seek clarification.
During an aircraft pre-flight briefing, we are told that in the event of a sudden cabin depressurisation, we should put on our own oxygen mask before helping others. This is important, as passengers would only have limited time before becoming unconscious. Similarly, optimising our own performance first before caring for patients can improve safety and reduce error. Many staff come to work in the morning without eating breakfast, and are therefore in a fasting state, burning body fat and generating ketones. Studies show that those who eat breakfast perform much better than those who do not. Similarly, missing lunch can result in the same biochemical fasting state, with reduced performance. And it is not just regular food that can make a difference. A 1-2kg loss in total body water due to perspiration or lack of rehydration can reduce analysis and decision making by up to 20%. Taking a short break every three to four hours during a long complex operation to eat, drink, and regain energy helps to maintain performance levels. Most of us would not drive for more than a few hours without stopping for a comfort break, yet it seems to be acceptable to go without one when caring for others.4
When something does not seem quite right, and when safe to do so, it can be good practice to stop, step back, and appraise the clinical situation before continuing. It sounds obvious to do so when not actually in a stressful environment, but when performing under pressure this simplest of actions can be overlooked. Where appropriate, an easily remembered mini brief called PPP (patient, procedure, people), can be useful to focus discussion.
While incivility, bullying, and discrimination are not human factors in the purest sense, these behaviours can, and do, influence performance, decision making, and team morale, and have significant detrimental effects.5 Similarly, anger and shouting – particularly during stressful times – significantly increase the risk of error, not to mention their effect on team working. The evolutionarily primitive limbic system can ‘hijack’ higher brain functions in such circumstances. It is far better to stop, think, and let higher functions catch up before acting. Most colleagues who shout or belittle others subsequently regret their actions. If they had stopped for even a few seconds, these colleagues might have behaved differently, and not potentially lost the respect of the team. Behaving and acting towards colleagues in the way we would wish a member of our own family to be treated is a good starting point for changing culture. Similarly, lowering authority gradients so that all team members can question or challenge more senior colleagues or professionals without fear of retribution is good practice and can improve safety. Staff should be actively empowered to speak up if they have any concerns that might potentially avert a serious error, for example a dental nurse challenging a dental surgeon to prevent a potential wrong tooth extraction. Always put the patient first.
Space precludes a proper discussion of some of the many pressures and factors that can raise the risk of error. These include time constraints, disruption to circadian rhythms, fatigue and tiredness, confirmation bias, distraction, and multitasking. Two of these deserve further attention.
Confirmation bias is the tendency to seek out or interpret information, diagnostic tests, or relevant anatomy in a way that confirms one’s existing decisions or actions, while discounting evidence to the contrary. This is yet another example of where good team working and asking others for their input or suggestions can help avoid potential problems or errors.
Distraction occurs frequently and can become a potential issue at safety critical times. A recent study found that fewer than half of invasive cardiac catheter procedures were completed without distraction, and many of these distractions occurred during high-risk stages of the procedure.6 During these times, or when intense concentration is required, minimal or no distraction is recommended. This can be conveyed to the team, or, for example during complex radiology reporting, a notice can be left on the door to reduce unnecessary interruptions.
Whatever your area of clinical practice, the team brief is a great opportunity to help build good working relationships and, where checklists are used, to actively engage with them as if one’s own life, limb, or teeth depended on it (Figure 1). It is good HF practice to think about any relevant ‘what if?’ scenarios to avoid potential startle reactions that can lead to mistakes and to build situational awareness (SA). The latter can be defined relatively simply as being aware of what is going on around us. It is a dynamic process that can deteriorate for both individuals and teams. Whole teams have lost SA with tragic consequences.
Finally, as with other high reliability organisations, healthcare needs a ‘Just Culture’, so that when problems arise, they can be discussed in an open and non-accusatory manner. The approach to learn from mistakes should be “Why did this happen?” rather than “Who was to blame?” Acknowledging our own fallibility and understanding human factors in daily practice is so important to help reduce preventable error, as well as enhancing patient safety, team working, and morale.