Technology's disasters share long trail of hubris

[July 12, 2010]  WASHINGTON (AP) -- It's all so familiar. A technological disaster, then a presidential commission examining what went wrong. And ultimately a discovery that while technology marches on, concern for safety lags. Technology isn't as foolproof as it seemed.

Space shuttles shatter. Bridges buckle. Hotel walkways collapse. Levees fail. An offshore oil rig explodes, creating the biggest offshore oil spill in U.S. history.

The common thread -- which the new presidential oil spill commission will be looking for -- often is technological arrogance and hubris. It's the belief by those in charge that they're the experts, that they know what they're doing is safe. Add to that the human weaknesses of avoidance, greed and sloppiness, say academics who study disasters.

Even before the oil spill commission holds its first meeting Monday in New Orleans, panel co-chairman William Reilly couldn't help but point out something he's already noticed.

The technology to clean up after an oil spill "is primitive," Reilly said. "It's wholly disproportionate to the tremendous technological advances that have allowed deepwater drilling to go forward. It just hasn't kept pace."

Then he added that government regulation also hasn't kept pace. And something else hasn't kept up either, Reilly said: how the oil industry assesses and works with the risk of catastrophic damage from spills.

Cutting-edge technology often works flawlessly. People are amazed. At first, everyone worries about risk. Then people get lulled into complacency by success and they forget that they are operating on the edge, say experts who study disasters. Corners get cut, problems ignored. Then boom.

Technological disasters, like the BP oil spill, follow a well-worn "trail of tears," said Bob Bea, a University of California Berkeley engineering professor who has studied 630 disasters of all types. Bea is also an expert on offshore drilling and is consulting with the presidential commission.

Bea categorizes disasters into four groups. One such group is when an organization simply ignores warning signs through overconfidence and incompetence. He thinks the BP spill falls into that category. Bea pointed to congressional testimony that BP ignored problems with a dead battery, a leaky cement job and loose hydraulic fittings.

It's that type of root cause -- not the equipment failure alone -- that the oil spill commission will focus on, including looking at the corporate and regulatory "culture" that led to bad decisions, Reilly said.

Disasters don't happen because of "an evil empire," Bea said. "It's hubris, arrogance and indolence."

And disasters will keep on happening. In the future, watch out for problems with the U.S. power grid, Sacramento levee failures, flood protection problems in coastal cities and even some of the newest high-tech airplanes, said Rutgers University professor Lee Clarke, author of the book "Worst Cases."

"There's nothing safe out there," said Yale University professor Charles Perrow, author of the book "Normal Accidents." "We like to pretend there is and argue afterward, 'That's why we took the risks because it hadn't failed before.'"

Technological improvements have gradually led to more daring offshore drilling attempts.

"It kind of creeps up on you," Energy Secretary Steven Chu said in an interview with The Associated Press. Then suddenly you realize that now only robots can do what people used to do because the drilling is so deep, he said.

Clarke put it this way: "We've been doing this every day, every year, week in, week out, so next week when we go to 5,000 feet, it will be like last week when we went to 300 feet. It's just the arrogant presumption that you have got the thing under control, whatever the thing is. In this case, it's drilling beyond your depth."

Paul Fischbeck, a professor of decision sciences at Carnegie Mellon University, said the existence of a blowout preventer, a final backup system that in this case didn't work, often encourages people to take extra risks.

But the oil industry was so confident in its safety record that it used to brag about how it compared with another high-tech gold standard: NASA.

"They looked more successful than NASA," said Rice University oil industry scholar Amy Myers Jaffe. "They had less mechanical failures."

The oil rig explosion "reminds me an awful lot of the NASA accidents," said Stanford physics professor Douglas Osheroff, who was on the commission that examined the causes of the space shuttle Columbia disaster in 2003.

"Obviously none of these systems are fail-safe," Osheroff said. "People don't spend enough time thinking about what could go wrong."

And because people are so sure of themselves, when they see something go wrong that they can't fix, they accept it, Osheroff said. The Columbia accident investigation board called it "normalization of deviance." Pieces of foam insulation had broken off the shuttle's external fuel tank six times before the problem proved fatal with Columbia, when a piece of foam knocked a deadly hole in the shuttle's wing. Hot gas had singed O-rings in space shuttle boosters well before the problem led Challenger to explode shortly after launch in 1986.

Yale's Perrow pointed to NASA's shuttles and another BP disaster -- the 2005 Texas City refinery explosion that killed 15 people -- as cases of simply ignoring "heavy warnings" from experts.

When the U.S. Chemical Safety and Hazard Investigation Board looked into the 2005 refinery fire, it noted that BP had the same problems with "safety culture" that NASA had before Columbia.

"The Texas City disaster was caused by organizational and safety deficiencies at all levels of the BP Corporation," the board's final report said. "Warning signs of a possible disaster were present for several years, but company officials did not intervene effectively to prevent it."

There have been times when warnings of disaster were heeded. The Y2K computer bug is noteworthy for prevention, Clarke said. Many people scoffed and criticized the government for making such a big deal of something that turned out to be a fizzle. But that's because of all the effort to prevent the disaster, Clarke said. It worked.

Unfortunately, safety costs money, so it's usually not a priority, Clarke said. Most of the time "you can't get anybody to listen," he said. "We're very reactive about disasters in the United States."

People don't think about them until afterward, he said, and then they say: "You should have seen that coming."

[Associated Press; By SETH BORENSTEIN]

Copyright 2010 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
