Imagine waking up and going about your morning routine when all of the devices in your home go off with this warning.
In case you missed it, this is what residents of Hawaii saw pop up on their smartphones two days ago. According to the Department of Defense, if the information had been accurate, residents of Hawaii would have had 15 minutes from the time US intelligence picked up the ballistic missile and issued the alert until an atomic bomb exploded on the islands.
Fifteen minutes — which makes it completely understandable that people went into panic mode.
They found out 38 minutes later that the alert was a mistake.
How did this mistake happen? Someone clicked the wrong button during a drill. It was human error. User error. I know, you’re thinking, how dare he (or she) make that kind of horrible mistake! It’s an understandable reaction, except that human error is almost never the fault of the person; it’s the fault of technology that was not engineered with human behavior in mind.
The employee who made the mistake later spoke about it, saying that in the moment he did not realize he had selected the wrong option.
If you’re still unsure what user experience designers do, this is it: preventing things like this from happening. Regardless of how sleek the software is designed to be, the most important question is, “Can a user accomplish their task with ease?”
The menu screen the user navigated to send the alert. Image Source.
Correct drill (green), incorrect drill (red), and the new “False Alarm” option (purple).
Several news outlets reported that a single drop-down menu contained adjacent options for selecting a ballistic missile test versus an actual ballistic missile alert. Even a low-cost heuristic evaluation could have uncovered this design flaw. And where one major usability breakdown exists, others lurk, so it is only a matter of time before something like this happens again. In this case, the user was also unable to cancel the missile alert. If someone could set off this alarm by mistake, undoing that mistake should have been afforded by the alert system’s design.
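The two missing safeguards described above — forcing the operator to explicitly confirm a live alert rather than relying on a menu click, and providing an undo for a sent alert — can be illustrated with a minimal sketch. This is a hypothetical model, not the actual Hawaii system; the class and method names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class AlertSystem:
    """Hypothetical sketch of two guardrails the real interface lacked:
    an explicit confirmation step and a way to cancel a sent alert."""
    sent_alerts: list = field(default_factory=list)

    def send_alert(self, message: str, is_drill: bool, confirm: str) -> bool:
        # Guardrail 1: the operator must retype the alert type ("DRILL" or
        # "LIVE"), so a slip on an adjacent menu item alone cannot fire it.
        expected = "DRILL" if is_drill else "LIVE"
        if confirm != expected:
            return False
        self.sent_alerts.append((message, is_drill))
        return True

    def cancel_last_alert(self) -> bool:
        # Guardrail 2: undo is always available for the most recent alert.
        if not self.sent_alerts:
            return False
        self.sent_alerts.pop()
        return True
```

For example, an operator who believes they are running a drill but has the live option selected would type `DRILL` as confirmation; the mismatch blocks the send instead of broadcasting a false alarm.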
For all the talk about and corporate investment in automation, machine learning, and technology that replaces the need for human involvement, the majority of software systems are still used by humans to help them do their jobs. Yet so many systems are actually working against us — not for us — because of poor design.
Granted, the stakes are not always so high, but there is always value in ensuring your systems are designed with intention, and with confidence that users can accurately accomplish what needs to be done.
Remember all technology is designed by someone, whether they know what they’re doing or not, and regardless of whether they’ve ever spoken with or observed your end users.
The question you should be asking is this: how much money am I leaving on the table because of technology that is not optimized through design?