Safety Critical Systems: A Question of Judgement

We rely every day on systems that can be lethal in the wrong hands, or deadly because of a fault or a poorly designed component. These systems, such as a car, a plane or a train, are often called safety-critical systems. But how do you know that the car you are driving, the plane you are flying in, or the train you are a passenger on is safe? How do you know that the infrastructure your transport system uses to get you from A to B is safe? Of course, you don't: you rely on others with the expertise and experience to make that judgement for you, and that is where it all starts to get very interesting.

Physical systems can be tested to destruction, software can be mathematically checked and verified, and when accidents or mishaps do occur, lessons can be learnt and fed back into the design process. But as products become more and more complex, the traditional approaches to testing, verifying and validating systems lag behind. Increasingly, we rely on the judgement and experience of experts in the field to decide how much testing is enough, whether additional safeguards are needed before a system goes 'live' and, in the final judgement, how safe is 'safe enough'.

The big manufacturers in the automotive, aerospace and rail industries all recognise that no matter how well designed or how thorough their processes are, if a case comes to court they have to be able to establish that the best engineers were in the room when crucial decisions were taken. They also need to show that the people with the intellectual know-how were free to take the right decision without undue pressure from programme managers on tight delivery deadlines and budgets, and that if genuine concerns were expressed, these were taken seriously. Establishing all this requires attention to what has become known as the 'safety culture' within a company or industry. A safety culture is practised: it cannot be captured in policy documents or minutes of meetings. It is what happens every single day in a company that designs or builds components and products of a safety-critical nature.

Creating a safety culture is a continual challenge for many organisations, and much of my research over the past few years has focused on how to plan for, manage and reflect on the subjective, unmeasurable aspects of the design process, particularly where complex systems and products are being created. The PEArL[1] framework has been developed as an easy-to-use device to support people in managing and validating the design of new systems, in particular safety-critical systems (see the proceedings of the Safety Critical Systems Club's recent conference: http://scsc.org.uk/p126).

PEArL is based on the view that all judgement, even expert judgement, is subjective, a view most safety engineers instinctively understand. Mathematical modelling only gets you so far; the rest is educated guesswork. Safety engineers need to be able to integrate 'hard' information from models and calculations, which has been verified as correct, with 'soft' information, which is subjective and requires experience to evaluate. PEArL is designed to help managers focus on the subjective aspects of the system design and validation process: the parts that teams work out for themselves in every project, and that together make up the whole culture. And although culture is difficult to pin down, it is actually the system that keeps you safest, because it is the cultural practices in an organisation that allow assumptions to be questioned and 'group think' to be challenged.
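To make the five elements concrete, here is a minimal sketch, in Python, of how a team might record a PEArL-style review entry for a single design decision. The field names follow the acronym expanded in the footnote below; the schema, class name and example values are hypothetical illustrations of the idea, not part of the published framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PEArLRecord:
    """One review entry for a safety-related design decision.

    Fields follow the PEArL acronym: Participants, Engagement,
    Authority, relationships, Learning. The schema is an
    illustrative sketch, not the published framework.
    """
    decision: str            # the decision or activity under review
    participants: List[str]  # who was in the room when it was taken
    engagement: str          # how people were involved (workshop, formal review, ...)
    authority: str           # who had the power to decide, and on what basis
    relationships: str       # working relationships and pressures at play
    learning: str            # what was learnt and fed back into the process
    concerns: List[str] = field(default_factory=list)  # dissenting views, recorded so they are visibly taken seriously


# Example: capturing a sign-off decision so it can be reflected on,
# or defended, later. All values are invented for illustration.
record = PEArLRecord(
    decision="Release braking-controller software to integration testing",
    participants=["lead safety engineer", "software architect", "programme manager"],
    engagement="Structured design review with open challenge encouraged",
    authority="Final sign-off held by the lead safety engineer, independent of delivery deadlines",
    relationships="Programme manager present, but without a veto over safety findings",
    learning="Coverage gap on sensor-failure modes logged for the next review cycle",
    concerns=["Test environment does not yet model degraded sensor input"],
)
print(record.decision)
```

However a team chooses to capture these elements, the point of such a record is that the subjective context of a decision, who was present, who held authority and what concerns were raised, is preserved alongside the decision itself.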

So next time you are in a car, a plane or a train, spare a thought for the safety engineers who have to judge how safe is safe enough, and who must be prepared to defend that decision in court if necessary.


[1] PEArL: Participants; Engagement; Authority; relationships and Learning

