In this taster Executive MBA session, we challenge the usefulness of the term 'human error' as it is commonly used.

Date: Tuesday 28 July 2020
Time: 18:00–21:00
Speaker: Nick Oliver, Professor of Management and Co-director of the Edinburgh Strategic Resilience Initiative

Overview

Delivered over two years by internationally renowned faculty, our triple-accredited Executive MBA programme combines advanced academic theory and practical application with one-to-one coaching in small but diverse classes. You can get a taste of the MBA programme in this specialist talk by Nick Oliver, Professor of Management and Co-director of the Edinburgh Strategic Resilience Initiative.

This will be a highly interactive and practical session, drawn from a week-long elective taught to Executive MBA participants.

If you have any questions regarding the event, please contact Copil Yáñez.

Don't Shoot the Operator

Accidents and other negative events are often blamed on 'human error'. The 'humans' in question are usually those on the front line (pilots, doctors, nurses, plant operators) and the 'errors' are the actions (or lack thereof) of these front-line actors. The human error story contrasts with a 'system' or 'design' story in which features of the system within which these front-line actors operate are designated as primary causes of mishaps.

Ascribing an event to human error carries implications. Post-event corrective actions are likely to emphasise training, greater compliance with existing procedures, and perhaps more (and more specific) procedures. However, such measures cannot eliminate the risk of recurrence if the underlying issue is in fact more complicated, as the Boeing 737 MAX accidents of 2018 and 2019 graphically illustrate.

The first accident was initially ascribed to inadequate safety culture on the part of the airline, and a failure to follow procedure on the part of the pilots. It was only after a second accident in very similar circumstances that a systemic problem was acknowledged, and even then, not immediately or by all stakeholders.

In this session, we challenge the usefulness of the term 'human error' as it is commonly used, particularly in settings that are complex and/or technologically intensive. We discuss the idea that all systems have a 'sharp end' of front-line actors (pilots, doctors, nurses) and a 'blunt end' (designers, engineers, executives, regulators, technology suppliers), and that decisions made at the blunt end are inextricably intertwined with actions at the sharp end. Indeed, a key role of people at the sharp end is to fill the holes left by those at the blunt end, by dealing with everything that the blunt end was unable to foresee or design for. We illustrate these ideas with reference to Air France 447 and the Boeing 737 MAX.

Although human error often fails to fully explain negative events, the narrative remains very prevalent. We suggest that this is because the narrative implies that lapses are idiosyncratic and because immediate and visible actions can usually be taken to address these lapses. This supports the legitimacy of the system and provides reassurance to the system's stakeholders.

Further Information