Doctors, air traffic controllers and similar professionals who have other people’s lives in their hands are increasingly being held liable for the consequences of mistakes that they are bound to make at some time or other whilst exercising their profession.
This phenomenon has not yet begun to affect engineers, because any fatal consequences caused by technical design faults tend to be only indirect in nature. But this is merely a matter of time.
When the police found her in the boot of her stepfather’s car, the girl had a rag in her mouth, held in place by a bandage around her head. Investigations revealed that her mother had placed the rag there three days previously. She had then pushed the child under the bed and left her alone. There, she died in the dark, three years old and severely undernourished. The stepfather was on his way to the woods to dump the body.
The mother was sentenced to six years in prison and forced to undergo treatment. But that was not enough to satisfy the American justice system. A second person was also deemed to be guilty. The stepfather, you might think, but he was in fact released without charge. Instead, charges were levelled against a social worker.
The family had been under the supervision of the authorities for quite some time. The girl had even been taken into care, but had since been returned to her mother. In her final report before going on sick leave, the social worker handling the case wrote that things were now going well. Months passed before another social worker visited the family again. The replacement social worker drew up a plan for the mother, including a detailed eating schedule for the child. Then the girl was suddenly dead. And the social worker was deemed guilty of serious dereliction of duty, resulting in manslaughter.
The story of the social worker is one of the examples cited by Sidney Dekker in his book ‘Just Culture’ of professionals who do their work honourably and in good conscience, appear to fail in their duty in one way or another, and are then suddenly branded as criminals.
Dekker also mentions the nurse who misread a doctor’s scribbled handwriting and, as a result, administered an overdose of muscle relaxant to a baby. These kinds of gross errors occur hundreds of times every year, but the nurse had the bad luck that the baby died – and she ended up in the dock. In neither of these cases does Dekker, professor at Griffith University in Brisbane, Australia, and a Boeing 737 pilot, reveal what happened in the end. He prefers to leave the cases open, because this forces readers to reach their own verdict on the justice of each case.
Stricter procedures
In these kinds of cases, there is no such thing as absolute, objective justice, argues Dekker. Too many different actors are involved. The victims want redress or retribution. The professional did not act deliberately, or may feel he or she had no other choice in the circumstances. The professional’s employer is in an ambiguous position: on the one hand it also stands accused, but on the other it may find it tempting to attribute the error to the individual rather than to organisational failings. The public prosecutor may see it as his or her duty to find a guilty party for an avoidable death. The judge must ascertain the facts and the fairness of any punishment. Finally, it is up to politicians to lay down in law what society regards as justice.
Here in the Netherlands, there have been several cases involving murdered children where juvenile care seemed to fail in its duty, as in the case of the girl Géssica, whose dismembered body was found in the Nieuwe Maas river near Rotterdam. The guilt was not placed at the door of individual social workers; a failure of the system was deemed to be to blame. Stricter procedures and better working methods were introduced as a result: changes to the system intended to prevent any repetition. This is all very noble and indeed necessary, but it also increases the likelihood of an individual youth worker breaking the rules unintentionally or being forced to do so by circumstances. If something goes wrong again, there is a greater chance that an individual can be singled out as the person who committed the fatal error.
Miscalculations
The introduction of ever stricter regulations makes the criminalisation of human shortcomings an issue of relevance for engineers too. They also operate in an environment in which procedures are becoming increasingly important: it is now recorded in extreme detail who took which action and when, especially in environments where high levels of safety are required, such as the oil and gas industry, medical technology and aircraft construction. A miscalculation in the structural analysis of a dike, an incorrect estimate of metal fatigue in the blade of a wind turbine, overlooked anomalies when programming a nuclear power station: the list of potentially fatal errors engineers can make could go on and on.
It is worth pointing out that truck drivers, for example, have faced this problem for a long time: if you spend all day driving, you will eventually make mistakes and if you are unlucky, you can easily kill a cyclist. Nurses, social workers and engineers are therefore not alone in being potential victims of this trend. However, the circumstances of the errors they make are more complicated and therefore the moral repercussions more convoluted.
Guilty engineers
In the Netherlands, there have not yet been any known cases of engineers ending up behind bars as a result of an error committed in the course of their profession. But that is merely the ultimate consequence. There is a growing army of professional associations and committees that police the rules and call those who break them to account.
Here are just a few international examples. John Ruffini, a civil engineer employed by the Australian state of Queensland, is currently being prosecuted because he failed to renew his membership of his professional association on time. This emerged during an inquiry into serious floods. Because he was not registered as a member, Ruffini was not actually authorised to take the action he did. And it was the professional association itself that reported Ruffini to the authorities. In the same case, three other engineers accused of failing to follow a manual correctly were recently acquitted because the manual was slightly ambiguous.
In New Zealand, engineer Dick Cusiel recently admitted that he felt responsible for the death of a woman hit by a piece of falling concrete during the earthquake in Christchurch. He had been the person ultimately responsible for approving a drawing that turned out to contain a design error, which meant that the concrete was not secured firmly enough.
Closer to home, a British firm of architects was recently forced to pay damages amounting to more than EUR 200,000. They had altered the design of a building during construction, as a result of which the air conditioning was positioned on the roof. However, in doing so, they failed to design a fall protection system for the roof, with fatal consequences for a maintenance engineer. In this case, no individual architects ended up in prison – perhaps the ultimate nightmare scenario – but it is part of a growing trend of holding designers liable for their actions.
Concealing the truth
This tendency to pursue professionals (through the courts) for professional errors not only impacts those professionals, but also their employers and society as a whole. After all, will anybody want to become a surgeon if it is increasingly likely that any mistake could mean you end up in prison? Sidney Dekker devotes much of his book to this question: how do you offer people a safe working environment in which they can still feel maximum responsibility for their work?
First of all, it is important to recognise that not all errors are of the same order. Honestly overlooking something is different from taking irresponsible risks or deliberately ignoring instructions (not to mention destroying evidence, which could see engineer Kurt Mix imprisoned for twenty years in the case of the exploded Deepwater Horizon oil rig). But is it really justice to be sentenced to six years in prison for failing to speak out after sitting at a press conference next to a civil servant who wrongly claimed that there would be no earthquake in the city of L’Aquila, as happened to six Italian scientists and engineers earlier this year?
In order to even begin to answer this kind of question, an organisation needs to take stock of all its errors, discuss them openly and learn from them. In this process, it is essential that mistakes initially remain confidential: research shows that people are less likely to report errors if they feel that doing so would disadvantage them personally – unless they think that someone else made the mistake, since an attempt to conceal an error is itself deemed wrongdoing. A lawsuit against three air traffic controllers in the United States ultimately ended in acquittal, but it also halved the number of reported incidents, reducing air traffic control’s ability to learn from its mistakes and improve safety.
But you cannot conclude from this kind of example that it is a bad idea to criminalise professional errors, warns Dekker. Justice makes sense and prevents professional groups from evading the law by setting their own rules. Why should an inattentive truck driver end up in court and not a negligent doctor or engineer? Quite understandably, the argument that certain professional practices are so complex that they are beyond the comprehension of the judge is unlikely to impress anyone.
Doing nothing is not an option
However complicated the dilemmas may be that you face when you decide to tackle this issue, doing nothing is not an option. In a society that increasingly calls for public accountability for human failure, every organisation must consider what could happen to it and its employees. Dekker proposes a four-step plan for this.
It starts within the organisation itself. Make sure there is good training in place and a transparent system for reporting incidents. See these incidents above all as an opportunity to learn and avoid those involved being directly stigmatised.
Where possible, remove the process of dealing with incidents from the line organisation. Make sure that everyone knows in advance how incidents will be handled and by whom. Bear in mind that dealing with incidents will not only involve those directly concerned but may also result in the introduction of additional training for the entire team or others in similar positions.
Step three: protect data from the outside world. This is important in order to create the best possible learning environment for incidents in an atmosphere of confidence and trust. But public accountability is also an issue.
This is why step four involves putting a legal framework in place. In other words: make sure that the internal structure is in line with the law and society as a whole. If you can assure the legal system that you investigate serious incidents effectively and take action as a result, it reduces the likelihood that others will do this for you.
However, this is not a simple step-by-step plan, especially in a society that is in flux. The debate about all of this has barely begun in much of the world of engineering. Nevertheless, it is not advisable simply to wait until the first lawsuit suddenly lands on your doorstep.
‘Just Culture: Balancing Safety and Accountability’ by Sidney Dekker was recently published in a second edition by Ashgate Publishing (EUR 21.99). The consequences for engineering practice discussed above are not taken from the book.
Christian Jongeneel is a science journalist and the author of the book ‘Het zit in een lab en het heeft gelijk’ (‘It’s in a lab and it’s right’), on the position of science in modern society.