A false alarm can cause unnecessary panic and distress, as the high-profile alert in Hawaii on 13 January 2018 grimly demonstrated. A combination of failures led the public to believe a missile attack from North Korea was imminent and that they were in immediate danger – a truly terrifying ordeal for everyone concerned.
On a less dramatic level, dealing with false alarms can be expensive and frustrating. For example, the UK Government estimates the cost of false alarms to the fire services is around £1 billion a year.
Whether false alarms are the result of erroneous signals from detection equipment, malicious or hoax calls, or genuine mistakes, the fallout can be equally problematic.
Critical response systems (fire, security, business or healthcare) need to balance fast reactions with reliability to ensure trust and to protect safety. Human error is inevitable, but safeguards need to be in place to evaluate messages before they are broadcast to everyone. Preferably this should happen automatically and rapidly, so the speed of the system is not compromised.
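One common safeguard of this kind is a two-person rule: an alert drafted by one operator is held until a second, different operator confirms it within a short window. The sketch below is purely illustrative – the class and parameter names (`AlertGate`, `confirm_window_s`) are assumptions, not a real alerting API – but it shows how such a gate can be both automatic and fast.

```python
import time
from dataclasses import dataclass, field

@dataclass
class AlertGate:
    """Hypothetical two-person confirmation gate for outbound alerts."""
    confirm_window_s: float = 30.0           # how long a draft stays pending
    _pending: dict = field(default_factory=dict)

    def draft(self, alert_id: str, author: str) -> None:
        # Record who drafted the alert and when; nothing is broadcast yet.
        self._pending[alert_id] = (author, time.monotonic())

    def confirm(self, alert_id: str, confirmer: str) -> bool:
        """Return True (broadcast allowed) only if a *different* operator
        confirms the draft before the window expires."""
        entry = self._pending.get(alert_id)
        if entry is None:
            return False                     # nothing pending under this id
        author, drafted_at = entry
        if time.monotonic() - drafted_at > self.confirm_window_s:
            del self._pending[alert_id]      # stale draft must be re-issued
            return False
        if confirmer == author:
            return False                     # same operator cannot self-confirm
        del self._pending[alert_id]          # consumed: broadcast may proceed
        return True

gate = AlertGate()
gate.draft("missile-warning-001", author="operator_a")
gate.confirm("missile-warning-001", confirmer="operator_a")  # blocked: same person
gate.confirm("missile-warning-001", confirmer="operator_b")  # allowed
```

The design choice worth noting is that a failed self-confirmation leaves the draft pending, so a second operator can still approve it within the window, while a successful confirmation consumes the draft so the same alert cannot be broadcast twice.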
Any delay in correction can also cause issues. If the emergency is real, the response must be swift. But if it is a false positive, the teams involved need to be informed as soon as possible, so they can stand down and remain ready for a real emergency.
As the Hawaii incident proved, the public need to be able to trust alerts too. Panic and anger are natural responses to a false alert, and repeated false alerts breed apathy, risking an assumption that the system is 'crying wolf'.
A disturbing possibility is that false positives can also be triggered maliciously, especially if a system is hacked. It is vital that effective security measures are in place, and that teams are alert to this possibility and wait for appropriate confirmation before acting.
To aid this, up-to-date and demonstrable staff training is essential. Strict procedures need to be in place to deal with false positives as well as confirmed emergencies.
The right balance of safeguards, security and agility is vital in critical communications systems. To this end, it is essential that messages reach the right person quickly, so that alarms, false or real, can be reviewed without delay.
This was demonstrated back in 1983, when Soviet officer Stanislav Petrov reportedly averted nuclear disaster by judging that a reported missile attack was a false positive. The consequences of the wrong decision there would have been truly global.