It’s like having 10 different remote controls for 10 different TVs

This NPR interview with Danielle Ofri, author of a new book on medical errors (and their prevention), had some interesting insight into how human factors play out during a pandemic.

Her new book is “When We Do Harm,” and I was most interested in these excerpts from the interview:

“…we got many donated ventilators. Many hospitals got that, and we needed them. … But it’s like having 10 different remote controls for 10 different TVs. It takes some time to figure that out. And we definitely saw things go wrong as people struggled to figure out how this remote control works from that one.”

“We had many patients being transferred from overloaded hospitals. And when patients come in a batch of 10 or 20, 30, 40, it is really a setup for things going wrong. So you have to be extremely careful in keeping the patients distinguished. We have to have a system set up to accept the transfers … [and] take the time to carefully sort patients out, especially if every patient comes with the same diagnosis, it is easy to mix patients up.”

And my favorite, even though it isn’t necessarily COVID-19 related:

“For example, … [with] a patient with diabetes … it won’t let me just put “diabetes.” It has to pick out one of the 50 possible variations of on- or off- insulin — with kidney problems, with neurologic problems and to what degree, in what stage — which are important, but I know that it’s there for billing. And each time I’m about to write about it, these 25 different things pop up and I have to address them right now. But of course, I’m not thinking about the billing diagnosis. I want to think about the diabetes. But this gets in the way of my train of thought. And it distracts me. And so I lose what I’m doing if I have to attend to these many things. And that’s really kind of the theme of medical records in the electronic form is that they’re made to be simple for billing and they’re not as logical, or they don’t think in the same logical way that clinicians do.”
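
Her complaint is easy to see in miniature. Below is a minimal sketch, assuming a hypothetical billing-first diagnosis field: the ICD-10 codes are real examples, but the function and the validation behavior are made up for illustration and come from no actual EHR.

```python
# Hypothetical sketch of a billing-first diagnosis field; not any real EHR API.
BILLING_CODES = {
    "E11.9": "Type 2 diabetes mellitus without complications",
    "E11.22": "Type 2 diabetes mellitus with diabetic chronic kidney disease",
    "E11.40": "Type 2 diabetes mellitus with diabetic neuropathy, unspecified",
    # ...dozens more variants by insulin use, organ involvement, and stage
}

def record_diagnosis(entry: str) -> str:
    """Reject the clinician's plain-language diagnosis; demand a billable code."""
    if entry not in BILLING_CODES:
        raise ValueError(
            f"'{entry}' is not a billable code; pick one of the "
            f"{len(BILLING_CODES)}+ specific variants."
        )
    # In the scenario Ofri describes, each accepted code then triggers its own
    # stack of required prompts, interrupting the clinician mid-thought.
    return f"Recorded: {BILLING_CODES[entry]}"

print(record_diagnosis("E11.9"))      # works: a billing-specific code
# record_diagnosis("diabetes")        # fails: the clinician's mental model
```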

Lion Air Crash from October 2018

From CNN:

The passengers on the Lion Air 610 flight were on board one of Boeing’s newest, most advanced planes. The pilot and co-pilot of the 737 MAX 8 were more than experienced, with around 11,000 flying hours between them. The weather conditions were not an issue and the flight was routine. So what caused that plane to crash into the Java Sea just 13 minutes after takeoff?

I’ve been waiting for updated information on the Lion Air crash before posting details. When I first read about the accident, it struck me as a collection of human factors violations in the design. I’ve pulled together some of the news reports on the crash, organized by the types of problems experienced on the airplane.

1. “a cacophony of warnings”
Fortune Magazine reported on the number of warnings and alarms that began to sound as soon as the plane took flight. The same alarms had occurred on the plane’s previous flight, and there is some blaming of the victims here in asking, “If a previous crew was able to handle it, why not this one?”

The alerts included a so-called stick shaker — a loud device that makes a thumping noise and vibrates the control column to warn pilots they’re in danger of losing lift on the wings — and instruments that registered different readings for the captain and copilot, according to data presented to a panel of lawmakers in Jakarta Thursday.

2. New automation features, no training
The plane included new “anti-stall” technology that the airline and pilot unions say was neither well explained nor included in Boeing’s training materials.

In the past week, Boeing has stepped up its response by pushing back on suggestions that the company could have better alerted its customers to the jet’s new anti-stall feature. The three largest U.S. pilot unions and Lion Air’s operations director, Zwingly Silalahi, have expressed concern over what they said was a lack of information.

As was previously revealed by investigators, the plane’s angle-of-attack sensor on the captain’s side was providing dramatically different readings than the same device feeding the copilot’s instruments.

Angle of attack registers whether the plane’s nose is pointed above or below the oncoming air flow. A reading showing the nose is too high could signal a dangerous stall and the captain’s sensor was indicating more than 20 degrees higher than its counterpart. The stick shaker was activated on the captain’s side of the plane, but not the copilot’s, according to the data.
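
To make the sensor problem concrete, here is a rough sketch of the kind of cross-check that redundant angle-of-attack vanes invite: when the two sides disagree badly, warn both pilots that stall cues may be false rather than shaking only the captain’s column. The threshold and function below are my own assumptions for illustration, not anything from Boeing or the investigators.

```python
# Illustrative only: a disagreement check between two angle-of-attack (AoA)
# sensors. Threshold and behavior are assumptions, not an aircraft standard.

AOA_DISAGREE_THRESHOLD_DEG = 10.0  # assumed limit for this sketch

def aoa_disagree(captain_aoa_deg: float, first_officer_aoa_deg: float) -> bool:
    """Return True when the two AoA readings differ enough to be untrustworthy."""
    return abs(captain_aoa_deg - first_officer_aoa_deg) > AOA_DISAGREE_THRESHOLD_DEG

# Readings like those reported on the accident flight: one vane ~20 degrees high.
if aoa_disagree(22.5, 1.8):
    print("AOA DISAGREE: treat single-side stall warnings as suspect")
```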

And more from CNN:

“Generally speaking, when there is a new delivery of aircraft — even though they are the same family — airline operators are required to send their pilots for training,” Bijan Vasigh, professor of economics and finance at Embry-Riddle Aeronautical University, told CNN.

Those training sessions generally take only a few days, but they give the pilots time to familiarize themselves with any new features or changes to the system, Vasigh said.

One of the MAX 8’s new features is an anti-stalling device, the maneuvering characteristics augmentation system (MCAS). If the MCAS detects that the plane is flying too slowly or steeply, and at risk of stalling, it can automatically lower the airplane’s nose.

It’s meant to be a safety mechanism. But the problem, according to Lion Air and a growing chorus of international pilots, was that no one knew about that system. Zwingli Silalahi, Lion Air’s operational director, said that Boeing did not suggest additional training for pilots operating the 737 MAX 8. “We didn’t receive any information from Boeing or from regulator about that additional training for our pilots,” Zwingli told CNN Wednesday.

“We don’t have that in the manual of the Boeing 737 MAX 8. That’s why we don’t have the special training for that specific situation,” he said.
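
Based on the public descriptions above, the core design problem can be sketched as automation that trusts a single angle-of-attack input. The following is a deliberately simplified, hypothetical illustration with made-up thresholds; it is not Boeing’s implementation of MCAS.

```python
# Hypothetical, simplified illustration of single-sensor trim logic.
# Thresholds and increments are invented for this sketch.

def mcas_like_trim(aoa_deg: float, flaps_up: bool, autopilot_on: bool) -> float:
    """Return a nose-down stabilizer trim increment based on ONE AoA reading."""
    AOA_ACTIVATION_DEG = 15.0   # assumed activation angle
    NOSE_DOWN_INCREMENT = 0.6   # assumed trim increment per activation

    if flaps_up and not autopilot_on and aoa_deg > AOA_ACTIVATION_DEG:
        return NOSE_DOWN_INCREMENT
    return 0.0

# A stuck-high vane keeps re-triggering nose-down trim even though the
# aircraft is nowhere near a stall, and the crew must counter it each time.
for faulty_reading_deg in (21.0, 21.0, 21.0):
    print(mcas_like_trim(faulty_reading_deg, flaps_up=True, autopilot_on=False))
```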

Hawaii False Alarm: The story that keeps on giving

Right after the Hawaii false nuclear alarm, I posted about how the user interface seemed to contribute to the error. At the time, sources were reporting it as a “dropdown” menu. Well, that wasn’t exactly true, but in the last few weeks it’s become clear that truth is stranger than fiction. Here is a run-down of the news on the story (spoiler, every step is a human factors-related issue):

  • Hawaii nuclear attack alarms are sounded, also sending alerts to cell phones across the state
  • Alarm is noted as false and the state struggles to get that message out to the panicked public
  • Error is blamed on a confusing drop-down interface: “From a drop-down menu on a computer program, he saw two options: ‘Test missile alert’ and ‘Missile alert.’”
  • The actual interface is found and shown – rather than a drop-down menu, it is a set of closely clustered links on a 1990s-era-looking web page, labeled “DRILL-PACOM(CDW)-STATE ONLY” and “PACOM(CDW)-STATE ONLY”
  • It comes to light that part of the reason the false alert stood for 38 minutes was that the Governor didn’t remember his Twitter login and password
  • Latest news: the employee who sounded the alarm says it wasn’t an error; he heard this was “not a drill” and acted accordingly to trigger the real alarm

The now-fired employee has spoken up, saying he was sure of his actions and “did what I was trained to do.” When asked what he’d do differently, he said “nothing,” because everything he saw and heard at the time made him think this was not a drill. His firing is clearly an attempt by Hawaii to get rid of a ‘bad apple.’ Problem solved?

It seems like a good time for my favorite reminder from Sidney Dekker’s book, “The Field Guide to Human Error Investigations” (abridged):

To protect safe systems from the vagaries of human behavior, recommendations typically propose to:

    • Tighten procedures and close regulatory gaps. This reduces the bandwidth in which people operate. It leaves less room for error.
    • Introduce more technology to monitor or replace human work. If machines do the work, then humans can no longer make errors doing it. And if machines monitor human work, they can snuff out any erratic human behavior.
    • Make sure that defective practitioners (the bad apples) do not contribute to system breakdown again. Put them on “administrative leave”; demote them to a lower status; educate or pressure them to behave better next time; instill some fear in them and their peers by taking them to court or reprimanding them.

In this view of human error, investigations can safely conclude with the label “human error”—by whatever name (for example: ignoring a warning light, violating a procedure). Such a conclusion and its implications supposedly get to the causes of system failure.

AN ILLUSION OF PROGRESS ON SAFETY
The shortcomings of the bad apple theory are severe and deep. Progress on safety based on this view is often a short-lived illusion. For example, focusing on individual failures does not take away the underlying problem. Removing “defective” practitioners (throwing out the bad apples) fails to remove the potential for the errors they made.

…[T]rying to change your people by setting examples, or changing the make-up of your operational workforce by removing bad apples, has little long-term effect if the basic conditions that people work under are left unamended.

A ‘bad apple’ is often just a scapegoat that makes people feel better by giving them a focus for blame. Real improvements in safety come from improving the system, not from getting rid of employees who were forced to work within a problematic one.

‘Mom, are we going to die today? Why won’t you answer me?’ – False Nuclear Alarm in Hawaii Due to User Interface

Image from the New York Times

On the morning of January 13th, people in Hawaii received a false alarm that the state was under nuclear attack. One of the messages people received was via cell phone, and it said: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.” Today, the Washington Post reported that the alarm was due to an employee pushing the “wrong button” when trying to test the nuclear alarm system.

The quote in the title of this post is from another Washington Post article where people experiencing the alarm were interviewed.

To sum up the issue: the alarm is triggered by choosing an option from a drop-down menu, which had options for “Test missile alert” and “Missile alert.” The employee chose the wrong option and, once it was chosen, the system had no way to reverse the alarm.

A nuclear alarm system should be held to particularly high usability requirements, but this system didn’t even conform to Nielsen’s 10 heuristics. It violates:

  • User control and freedom: Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
  • Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
  • Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
  • Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
And those are just the ones I could identify from reading the Washington Post article! Perhaps a human factors analysis will become required for these systems, as it already is for FDA-regulated medical devices.
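
For contrast, here is a minimal sketch of what the error-prevention and undo heuristics above look like in interface logic: drill and live actions clearly separated, the live alert gated behind an explicit confirmation, and a retraction path on the same channel. The function names and messages are hypothetical and do not reflect Hawaii’s actual alert software.

```python
# Hypothetical sketch only; none of this reflects the real HI-EMA system.
import time

def confirm(prompt: str) -> bool:
    """Stand-in for an unmistakable confirmation dialog (operator backs out here)."""
    print(f"[CONFIRM] {prompt}")
    return False

def broadcast(message: str) -> None:
    """Stand-in for the statewide alert channel."""
    print(f"[BROADCAST] {message}")

def send_missile_alert(drill: bool) -> None:
    if drill:
        broadcast("DRILL ONLY: ballistic missile exercise. This is a test.")
        return
    # Error prevention: a real alert requires a distinct, explicit confirmation.
    if not confirm("Send a REAL missile alert to every phone in the state?"):
        print("Live alert cancelled.")
        return
    broadcast("BALLISTIC MISSILE THREAT INBOUND. SEEK IMMEDIATE SHELTER.")

def retract_alert() -> None:
    # User control and freedom: the correction goes out on the same fast channel.
    broadcast(f"FALSE ALARM issued at {time.ctime()}. There is no missile threat.")

send_missile_alert(drill=True)    # the routine test path
send_missile_alert(drill=False)   # the live path stops at the confirmation step
retract_alert()                   # and a false alarm can be retracted in seconds
```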
