The Current

From Thanksgiving dinners to nuclear meltdowns, why complex systems are often doomed to fail

The authors of a new book say we need to learn how complexity causes failure in all kinds of modern systems — from social media to air travel — so we can prevent meltdowns in society and in our daily lives.

If systems have no slack, small errors can quickly snowball, say authors of new book

András Tilcsik and Chris Clearfield have written Meltdown, a book that looks at how overly complex systems can experience catastrophic failures. (Penguin Random House)


If you've ever ruined Thanksgiving dinner, you've made the same mistakes that lead to huge disasters like the one at the Fukushima nuclear power plant in Japan, according to the authors of a new book about catastrophic failures.

People think the disaster at Fukushima was inevitable because it was hit by a tsunami, said András Tilcsik.

However, he noted that there was a very similar plant close by in the town of Onagawa, which was hit by waves just as tall.

Not only did it survive, but "some people reported that that was the safest place for them to go," he told The Current's Anna Maria Tremonti.

"One thing that plant did differently was that in advance of the tsunami … they had built taller sea walls to protect themselves," said Tilscik, a professor at the University of Toronto's Rotman School of Business. 

We make judgments like that in every walk of life, he said, "whether we are trying to decide how early do I need to leave for the airport to make my flight, or how tall does my seawall need to be to resist or withstand the power of that tsunami."

Video: Huge Oscar mix-up: wrong best picture announced. Presenters Beatty and Dunaway given incorrect envelope by mistake. (Duration 2:49)

Along with his co-author Chris Clearfield, Tilcsik has written Meltdown: Why Our Systems Fail and What We Can Do About It. The book looks at catastrophic system failures — from the Deepwater Horizon oil spill to announcing the wrong winner at the Oscars — and what we can do to prevent them.

For Clearfield, Thanksgiving dinner is a classic example.

"The turkey has to cook before you can make the gravy … and the stuffing has to go inside the turkey," he told Tremonti. "You have all of these connections that just don't happen when you have a simpler meal, like spaghetti and meatballs."

The tsunami destroyed or partially destroyed four nuclear reactors at the Fukushima Daiichi power plant in 2011, but a nearby plant survived largely unscathed. (Toru Hanai/The Associated Press)

Complexity can lead to chaos

When organizations build safeguards into a system, they often add complexity that can do more harm than good, the authors say.

By the time organizations realize there's a kink in the system, Tilcsik said, it can often be too late.

"Once the genie is out of the bottle it's hard to put it back in, it's hard to fix problems as they arise," he said, adding that small errors can then quickly snowball into something much bigger.  

The mistake organizations make, the authors said, is creating "tight" systems that have no slack, or flexibility to correct errors.

In 2017, the wrong film was announced as the winner of the Oscar for best picture after the presenters were given the wrong envelope.

The accounting firm in charge of looking after the awards results wanted to make sure that there were two briefcases with identical envelopes in case one of them got lost. 

"And so they had essentially a set of backup envelopes, a safety system, a safety feature," Tilcsik explained.

"But as a result of adding more complexity to the system ... there was this extra envelope, which was then picked up, and then the wrong movie was announced."

The 'pre-mortem'

One method to avoid these kinds of failures is the "pre-mortem."

"It involves imagining that a failure has already happened," Tilcsik said, "and then thinking about the risks, or the factors that might have led to it."

Normally, we look at a situation and try to imagine what might go wrong, he added. But by imagining that it already has, we can work backwards.

"That changes the social dynamics," he said. "It empowers people to talk about concerns and risks that they are thinking about."

Clearfield, who is also a commercial pilot, highlights the aviation industry as an example where complexity wasn't improving safety until organizations began incorporating techniques like the pre-mortem.

Organizations changed their behaviour, he said, by obsessively paying attention to small failures before they become catastrophically big ones.

Clearfield adds that the approach relies on workers speaking up when they see errors, and on people in power treating those concerns as legitimate and learning from mistakes.

"We need to change our culture so that speaking up isn't a heroic act anymore," he said. 

Listen to the full conversation near the top of this page.


Written by Bethlehem Mariam, with files from Padraig Moran. This segment was produced by The Current's Alison Masemann.