What's the diagnosis on good decision-making?
Burn-out amongst healthcare clinicians has multiple causes, but studies indicate that it rises in line with direct patient contact. If so, Artificial Intelligence has the potential to reduce both workload and burn-out, for example by analysing x-rays to identify certain pathologies. However, the same cognitive science that shapes A.I. reveals that professional experience and intelligence (once they pass a threshold) may actually impair decision-making, including clinical and diagnostic decisions. With the advent of 4P’s Medicine (participatory, personalised, preventive and predictive), can these insights, and the type of training they inspire, also help industry professionals to reduce their stress levels and decide well?
We’re Natural Born decision-takers
One school of cognitive science focusses on our natural capability to make sound ‘flesh and blood’ decisions every day, often under pressure. This ‘Naturalistic Decision-Making’ is based on a satisficing principle, but it could be more accurately described as decision-‘taking’ in that, most of the time, we don’t formally consider the pros and cons.
Equally, clinicians routinely spot cues from a set of symptoms, pattern-match with a likely illness and mentally simulate a single course of action in order to act quickly. By mapping this ‘critical path method’, we can devise useful checklists and decision trees which then lessen the ‘cognitive load’ for junior staff. However, such ‘fast and frugal’ reasoning is recognition-primed and, hence, its successful outcome is situation-dependent. Cognitively, we’re not as well-suited to decision landscapes that are complex, novel or more abstract. If we want to improve the predictive power and reliability of our intuition under such conditions, we need to engage in ‘metacognition’ (i.e. thinking about our thinking processes). As this article explores, this is what clinical education and training has begun to do, with some good results.
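To make this concrete, here is a minimal sketch (in Python, purely for illustration and not drawn from the article) of how a ‘spot cues, pattern-match, act’ path could be encoded as a simple rule-based checklist. The cues, patterns and suggested actions are invented for the example and are not clinical guidance.

```python
# Illustrative only: a toy, rule-based checklist inspired by the
# 'spot cues -> pattern-match -> act' description above. The cues,
# patterns and actions are invented and are not clinical guidance.

TOY_PATTERNS = [
    # (pattern name, required cues, suggested first action)
    ("suspected pneumonia", {"fever", "productive cough", "crackles"}, "order chest x-ray"),
    ("suspected asthma flare", {"wheeze", "breathlessness"}, "administer bronchodilator per protocol"),
]

def pattern_match(observed_cues):
    """Return the first pattern whose required cues are all present, or None."""
    observed = set(observed_cues)
    for name, required, action in TOY_PATTERNS:
        if required <= observed:        # all required cues observed
            return name, action
    return None                         # no match: slow down and escalate

if __name__ == "__main__":
    match = pattern_match(["fever", "productive cough", "crackles", "fatigue"])
    if match:
        name, action = match
        print(f"Recognition-primed suggestion: {name} -> {action}")
    else:
        print("No pattern recognised: escalate to senior review.")
```

Making the rules explicit in this way is what allows them to be reviewed, taught and improved, rather than living only in one experienced head.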
Why Subjectivity matters – both more (& less) than we think
Industry has been inspired to view cognition from a research perspective that emphasises its constraints. The frame is one of mistake-avoidance by controlling for the heuristics (informal rules of thumb) and biases (the systematic errors they can produce) we each apply subconsciously. For example, we tend to search for supporting rather than refuting evidence when hypothesis-testing (‘confirmation bias’). A science-based approach helps to mitigate this through its ‘falsification principle’, and Q.R.M. professionals are also advised to minimise subjectivity in their risk assessments using quantitative means.
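For a feel of why seeking refuting evidence matters, below is a minimal sketch, not taken from the article, of the classic Wason ‘2-4-6’ demonstration of confirmation bias: probes chosen to confirm a guessed rule all pass, while a single probe designed to refute it exposes that the guess is wrong.

```python
# Illustrative only: a toy version of Wason's 2-4-6 task. The 'hidden rule'
# stands in for reality; the 'hypothesis' is the tester's guess.

def hidden_rule(seq):
    """Reality: any strictly ascending triple."""
    return list(seq) == sorted(seq) and len(set(seq)) == 3

def hypothesis(seq):
    """Guess: 'each number rises by 2'."""
    a, b, c = seq
    return b - a == 2 and c - b == 2

confirming_probes = [(4, 6, 8), (10, 12, 14)]   # chosen to fit the guess
disconfirming_probe = (1, 2, 3)                 # guess says 'no', reality says 'yes'

for probe in confirming_probes:
    print(probe, "hypothesis:", hypothesis(probe), "reality:", hidden_rule(probe))

print(disconfirming_probe, "hypothesis:", hypothesis(disconfirming_probe),
      "reality:", hidden_rule(disconfirming_probe))   # the mismatch falsifies the guess
```

The confirming probes agree with both the guess and reality, so they feel reassuring while proving nothing; only the probe designed to fail reveals the error.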
Yet the scientific method itself does allow for subjectivity, and this may even be useful. After all, hypothesis creation is the fruit of conjecture; by definition, a hypothesis is unproven. Neuroscientists also suggest that subjectivity plays a key role in decision-making: our rational processes work alongside emotional (‘affective’) ones that act as value brokers, without which we struggle to make any decisions (Damasio).
Subjectivity doesn’t necessarily imply less rigour, and it also offers a distinct epistemic (knowledge) benefit in balancing complexity with precision to overcome the ‘principle of incompatibility’. The key is to manage subjectivity itself more scientifically, by ensuring it is proportionate to each decision task.
How collective reasoning works – and doesn’t
We rely more on subjective judgement in the form of specialist expertise, e.g. to project the likelihood of a future event during an F.M.E.A. For balanced decisions, a multistakeholder approach is recommended, but the reality is that inter-disciplinary decision-making is a challenge, both in industry and in healthcare. Neuroscience, however, helps us better understand why this process is so complex and how to manage the trade-offs involved.
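As one illustration of where that subjective judgement enters, here is a minimal sketch assuming the conventional F.M.E.A. scoring (1–10 scales for severity, occurrence and detection, with Risk Priority Number = S × O × D); the failure-mode scores and the pooling of three experts’ occurrence estimates are invented for the example.

```python
# Illustrative only: a toy FMEA line item showing where subjective expert
# judgement enters a 'quantitative' risk score. Conventional 1-10 scales
# are assumed; the numbers are invented for the example.

from statistics import median

def rpn(severity, occurrence, detection):
    """Risk Priority Number = Severity x Occurrence x Detection."""
    return severity * occurrence * detection

# Three SMEs independently project the likelihood (occurrence) of the same
# future event; pooling the estimates (here via the median) tempers any
# one person's bias before the group discussion begins.
occurrence_estimates = [3, 6, 4]
pooled_occurrence = median(occurrence_estimates)

print("Pooled occurrence:", pooled_occurrence)
print("RPN:", rpn(severity=8, occurrence=pooled_occurrence, detection=5))
```

Collecting the estimates independently before they are discussed is one simple way to keep the multistakeholder input while tempering the herding effects described next.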
The good news is that reasoning itself evolved primarily as a social function, i.e. to persuade. When we think together, many of our blind-spots (cognitive biases) cancel each other out. Indeed, studies show that social dynamics significantly sway even expert opinion during group discussions. The caveat is that producing better collective decisions requires sufficiently varied reasoning styles.
The bad news is that our most powerful heuristic, i.e. decision rule, is identity: social influences, more than new information, drive convergence, divergence and herding behaviour. Knowledge workers tend to veer discussions towards ‘what we all know’ and will often self-censor rather than probe S.M.E. assumptions or the knowledge domains of others. Studies also show that due diligence suffers when trust is rushed, e.g. on newly-formed project teams. By training teams to spot affective biases and signs of dysfunctionality, e.g. ‘false consensus’, we can nurture the type of group processes that get the balance right, e.g. the ‘creative tension’ needed for innovation.
Where improving decision quality counts most
Clinical education is now well-informed by cognitive (and affective) evidence on how to improve decision-making ability. De-biasing helps physicians pay attention to the dopamine hit associated with judging others (e.g. based on patient stereotypes). Interestingly, past education is now seen as problematic in that it conditioned clinicians towards perfectionism, including unrealistic expectations of themselves and others. Mistake-avoidance actually impairs decisions by motivating towards minimum goals rather than maximum outcomes.
As the foundational act in medicine, the decision situation of a patient diagnosis has come under the microscope. Not unlike working up to a Root Cause Analysis, each stage of the work-up, from choosing an active intervention to stabilise the situation, through investigating (including iterative hypothesis-testing), to refining towards a definitive cause, is subject to an aggregate of dispositions. The ‘availability heuristic’ causes recently discovered ‘root causes’ to over-influence future investigations. An autopsy, as they say, is useful to everyone but the patient. ‘Outcome knowledge’ also biases our thinking on the quality of the processes that led to a particular adverse result. Yet we can correct for many of these by evolving our decision methods (e.g. DMAIC began originally as ‘MAIC’) and broadening the repertoire of decision styles, as is underway in medicine. Industry tends to favour directive and analytical styles over more conceptual and behavioural ones.
Why to err in decision-making is human – but also correctable
Most of us have erred in failing to learn from past mistakes, including poor decisions. ‘Human error’ is also commonly given as an explanation for poor quality, but perhaps this is becoming less plausible given all that cognitive science can teach us. Cognitive training can equip healthcare and industry alike to express doubt, balance uncertainty and alleviate stress, particularly for the stakeholders who carry the biggest decision burdens.
Think4Purpose designs and delivers cognitive training to improve the critical thinking skills of STEM professionals. These programmes include modules on Heuristics and Biases, Value-based Reasoning, Team Cognition and Improving Self-Knowledge. Email fiona@think4purpose.ie to enquire about 2022 programmes, or book through the IBEC Engineering Skillnet for subsidised blended learning programmes, resuming in February.