Cold Iron & Cool Heads – Hard-Won Lessons from the Machine Frontier for the Next Generation
The following presentation was delivered at the 80th General Meeting Monday afternoon session, May 9, by author James R. Chiles. It has been edited for content and phrasing. A slide presentation of his address can be accessed here.
James R. Chiles has written about technology and history since 1979. His access to tragedy sites has provided readers with a profound insight into the internal dynamics of disaster. He is the author of Inviting Disaster: Lessons from the Edge of Technology, and The God Machine: From Boomerangs to Black Hawks - The Story of the Helicopter. Mr. Chiles' work has appeared in various publications, including Smithsonian, Air & Space, and Invention & Technology. He has also appeared on History Channel programs and on National Geographic. Additionally, he was featured in the winter issue of the National Board BULLETIN.
Mr. CHILES: When people come to Vegas, it probably triggers certain memories. When I was listening to Joe Montana, his comments made a few links to the field of technology and business. When he mentioned the DC-8, it triggered the fact that it was the only conventional airliner ever taken to supersonic speed. A pilot took it to fifty-one thousand feet over Edwards Air Force Base, put it into a power dive, and tested it almost to destruction.
Las Vegas triggers another memory. It’s only twelve miles from Henderson, Nevada, where forty-five hundred tons of ammonium perchlorate blew up in 1988. It was a side effect of the Challenger disaster: with shuttle flights suspended, this highly explosive oxidizer piled up at the PEPCON facility. When it went off, it made buildings shake in Las Vegas.
When I was invited to speak with you folks, it meant a great deal to me because you are on the front lines. I’m a writer who is pretty far from the front lines, except when I get a rare opportunity to join in. So I thought I had better go and learn from you. Late last year I visited the National Board staff in Columbus, and I went through old issues of the BULLETIN. I found some great material in the early publications.
In fact, I came away with a poem. It's a mix of technical information and human nature, and what people look for; what drives them. This is from July 1945. There was a highly detailed discussion of the boiler code and what to do about certain problems of an unstayed Adamson-type circular furnace, and then there was this poem, An Ode to Inspectors. Its author was unnamed:
Ode to Inspectors:
I shall climb the golden ladder
To those pearly gates on high.
I shall hunt up old St. Peter,
Watch the twinkle in his eye.
And when I show my printed forms,
And my voice shows much inflection,
I will say to him, I have come, Old Saint,
To make the annual inspection.
The policy expires quite soon,
And we must complete our rating.
The same I'm sure will start to cuss,
Slam the gates and strain the grating.
What if it does, will grieve Old Pete,
We are tired of all of these inspectors.
A man last week inspected us,
You should be called rejectors.
The angels must wear goggles,
And you want the halos braced,
Handrails on the golden stairs,
And first aid kits be placed.
Go to hell, will rave Old Pete,
We are too busy here today.
Hell needs you worse than we do.
Now you get out, and get out to stay.
I came away from my time at National Board with a big stack of research. One article, which I will refer to later when I talk about red flag issues, was by a man named Phil Corbett. The article told some wonderful stories about what sort of things to watch out for with boilers.
Travels on the Machine Frontier
I had a rare opportunity to go with Pablo Lopez of Mueser Rutledge Consulting Engineers into World Trade Center Six, which is called The Customs House, and see firsthand what kind of havoc is wreaked when technology is unleashed. This was before the building had been torn down; firefighters were still looking for their lost brothers.
It was a revelation to see what happens when fifty thousand tons of steel goes through the center of a building. In some respects it’s like a locomotive shearing everything in its path and piling up an impenetrable block, in the same way locomotives in old days would meet in a tunnel and create a solid mass of steel and iron.
So a question: are we looking backwards with 20/20 hindsight? I use the analogy of the Lutine Bell. The Lutine was a French ship taken over by the British. It sank in 1799 with a hundred million dollars of gold on board. The British government had put all its eggs in one basket, because it was really important not to lose the gold. Sure enough, the ship hit a reef, went down, and Lloyd's of London had to pay for it. It had such a huge impact on Lloyd's of London that they took the bell off – they never recovered the gold, but they recovered the bell. When it rings once in London, it means a ship is in trouble, it hasn't reported in; when it rings twice, it’s good news.
Hopefully, a writer (even a generalist like me), can do more than just ring a bell for good news or bad news. But I want to emphasize good news – two bells. So often in this domain of technological disasters, it's the bad things we hear about. I'm as much interested in what Arthur Conan Doyle called, “the dog that doesn't bark,” as I am the accidents. What is that? It’s the thing that doesn't blow up. You might call it the dog that doesn't blow up – the people who are never in the headlines because they do such a darn good job.
I tried to sell TV producers on this notion – let me talk about the guys who live on the edge of danger and have lessons for the rest of us, such as the people at Dyno Nobel, who deal with explosives. But ah, no, we’ve got to have disasters. But in a book you can write what you want, so that’s what I did.
This is a strange world. It’s exciting, but unfortunately, too exciting. When young people say it's all been done, everything is tamed, I would say, no, in fact, it is not. There are so many strange cases and so many close calls, hundreds for every disaster, that if you look for it, maybe it’s too exciting of a world. Karl Weick had a name for it. He called it vu jade, meaning a place no one has ever been.
Were You There?
I pose a question to you. It's from Job, Chapter 38. God asks Job (who has been complaining), “Were you there when I made the foundations of the world?” It's also a line from the musical Cats: “Were you there when the pharaoh commissioned the Sphinx?” I like that phrase, Were you there? So I will ask you – were you ever there when something happened, or when something might have happened?
Have any of you been in such a close call that you laughed immediately afterwards? It's nervous laughter. I have been there. It's a terrible thing, but that's the reaction: I survived, I made it. But were you there when it could have been prevented? That's the most painful thing of all.
Take the co-pilot on Comair Flight 5191. He's the only survivor of the 2006 crash at Blue Grass Airport near Lexington, Kentucky. He's left in a wheelchair with one leg amputated and with an enormous load on his mind. Everyone else was killed when something happened that could have been prevented. Just before the plane crashed, he said, “Isn’t it odd that the lights are off on Runway 26?”
Jack Gillum and the Hyatt Regency walkway collapse in Kansas City, which killed 114 people, also come to mind. Gillum has helped pay the price: he has made amends by going around and giving talks about what it's like to be the Engineer of Record when 114 people have been killed and many more injured (fifteen hundred people were at risk).
Disaster Reports are Short on Villains
Reporters look for villains; occasionally they look for heroes. Sully Sullenberger – a hero. There are many more that never hit the headlines. Take West Pharmaceutical for example. It's a factory that made rubber fittings for pharmaceutical items like syringes. They ran rubber strips through a water bath containing a chalk-like compound. I won't go into detail other than to say there were five opportunities to notice that the dust was highly flammable. After those five missed opportunities, in 2003 something ignited the more than half an inch of dust that had accumulated above the suspended ceiling. It led to an explosion that was heard 25 miles away and set the woods on fire two miles away. The Chemical Safety Board produced an excellent report on the incident, but you can look through hundreds of pages and not find anybody to get morally angry about. You will find five missed opportunities that could have saved lives and prevented injury.
These things have been happening for over a hundred years. There is really no excuse for people not watching out for potential problems. We are very long on tragically missed opportunities. In my book, I mention the Hubble Space Telescope and the big rush at PerkinElmer to get it ready. Then it sat in storage for eight years without being tested. It was launched and then they found out it was messed up.
People do have insights into this. I mention an example of a fellow named Vannoccio Biringuccio, considered the father of foundry work. He said, “Remember that often the whole depends on some small thing; for example, a poorly made binding or ill-fitted joint, the leaking of mould through a crack” (Pirotechnia, 1540).
It's the little things that should keep you up at night. That's a sign of a responsible individual. And sometimes that's not even enough, I'm sad to say, for somebody like Joe Shea, who was a hero of the Apollo program until January 1967, when three astronauts were killed in a fire. He went from hero to villain because of one little thing that led to a fire in a full oxygen atmosphere. And think of the engineer who was behind the collapse of the Quebec Bridge in 1907, Theodore Cooper. He had been a hero of the Eads Bridge in St. Louis, completed in 1874, but by this time he was a man past his prime; he was too tired, too sick to be on site, and the bridge collapsed. Just before it collapsed he sent a message to the Phoenix Bridge Company saying, “Shut the job down, I'm very worried about it,” but the job site did not get word in time. The bridge collapsed and killed 75 men.
But the good news is that sharp and alert people do make a big difference, and that makes me optimistic. A fellow named Revere Wells was supposed to keep an eye on a dam called Baldwin Hills in Los Angeles. He had been taught: if a leak starts, get right on it. This is very important, because it's an earth-filled, concrete-based dam, and there are many people living downstream. Even though Wells was a humble inspector – a guy who was just supposed to keep an eye on things – he heard an odd trickling noise that didn't belong there. He immediately got a supervisor; the supervisor immediately checked it out. They put the word out, and had only two hours to evacuate. Five people were killed, but it would have been many hundreds had Revere Wells and the supervisor not understood that it doesn't matter what your rank is – what matters is knowing your world and what could go wrong. Similarly, I mentioned earlier an old BULLETIN article written by Phil Corbett. In it he talked about an inspector who heard a strange sizzling noise, followed up on it, and stopped a boiler explosion.
There are so many cases where somebody, whether working minimum wage or working at the top, could have intervened. An example is the Alexander L. Kielland, a hotel rig in the North Sea that rolled over and killed 123 people. We know that a painter saw a crack in the structure, because he painted over it; there was paint inside the fracture surface. Either he didn't say anything or nobody paid attention. Again, it's not your rank; it's how well you know your world and what you do to pass the word along.
Patterns from the Book: System Fractures
System fracture is the notion that the weak points in metal will join up, given time. In January 1943 the tanker Schenectady broke in two at the Kaiser Swan Island yard. Everybody in port saw it break apart. It was very embarrassing. Even though there had been ten cases before, it suddenly became a big issue; the engineers got on it. The ship was repaired and went on to serve in the war, but it was darn embarrassing for it to break up at the dock right after sea trials. But we learned from it; we learned how to design a ship so that it won’t break in half. It was a combination of several things that caused it to happen. Any piece of metal, no matter how high-quality, has discontinuities, cracks, and weaknesses in it, and those weaknesses will join up in time if they are allowed to. That's my point. If people sit by and don't pay attention to telltale signs, fractures will eventually join up.
Other people like other models. For instance, the James Reason Swiss cheese model. Some of you may be familiar with it. But this is something I came up with when I was writing the book. All systems have weak points, even the best. The question is, do people intervene in time? It's not the weaknesses; rather, it’s what you might call the strength of intervening in time. I call it a crack-stopping organization. An example: I went to a Cessna Aircraft factory. Cessna has learned – they’ve gotten “religion” – because every so often, when we hear about one of their airplane crashes, it's a lawyer or a doctor on board – a very expensive lawsuit. Cessna has learned not to punish people for making mistakes; they reward them for reporting mistakes. The principle of system fractures is that there is a lot of slack in many systems, and they can go and go until they break. The thing is, you want to catch it before it breaks.
New Problems for New People
Sometimes those who are established in a field, the emeriti, might think: Will the next folks be able to do as good a job as our generation has done? It reminds me of a quote from the president of Dickinson College, Joe Corson. He said he was worried there was a soft generation coming up that would expect other people to solve its problems. When did he say that? In 1937, just before the Greatest Generation. So in my mind it's not any weakness in a generation; it's what they are called to do, the challenge they rise to meet. Certainly the Greatest Generation rose to meet its challenges.
So here are some challenges our young folks might see. First: rapid turnover and heavy use of contractors. Certainly you see that in the refining field. Second: safety practices driven more by models than by statistics. Third: re-purposed equipment, equipment that is running longer than anyone thought it would. Fourth: plants running remotely via SCADA with no engineers on site. I mentioned this in my interview in the winter 2011 National Board BULLETIN. And fifth: the fact that full-stop intervention is very difficult. It’s easy to hand out a fine, but a full stop? That’s plenty hard. Take the case of the I-35W Mississippi River bridge collapse in 2007. Isn't it strange that there was no time for upkeep, but once it collapsed, there was plenty of time to fix it?
Cool Heads and Cold Iron
Before I get to the last point, here is something that might reassure those who wonder about the new folks and whether they come in with the experience that you have. The man who saved Three Mile Island Unit 2 was not from Rickover's nuclear navy; it was a man called Brian Mehler. He had been trained at the Penn State Research Reactor and came from the Air Force. And when all the Navy guys who had been on duty since 2 a.m. could not figure out what the problem was, Brian Mehler came in from outside and in fifteen minutes figured out that in all likelihood the pilot-operated relief valve had stuck open. He closed the block valve, knowing that the ASME code safety valves would not be jeopardized; this was a separate valve.
An outsider, a newbie, figured it out. So what I'm going to work on in the coming months is a subject of red flags and telltales. You have heard a lot in literature about why regulators didn’t see a red flag, and that irritates me because red flags didn't work all that well even in the railway days. If you've ever run a locomotive – I haven't, but I've heard – visibility is so poor, you can look out one window, and the guy on the other side is waving the red flag, and you can't see it because you are looking out the other way, or it might be dark, or there might be rain in your eyes. So, in fact, red flags weren't even used that reliably back in the day. What I am more interested in, because clichés irritate me, is less on the red flag, but more on something called the telltale.
A classic telltale from steam is the locomotive staybolt. The important thing about the staybolt is that it's got a little hole in it. If the steel fractures, the hole lets steam and water escape through the staybolt head, and that way you know you have a problem. It’s a wonderful invention. In fact, steam has given us many wonderful inventions. The best safety valves I have heard of are locomotive safety valves, because they had to work so many times and they are so rugged.
The financial industry has done a lot of work on telltales. There is the Altman Z-score, which indicates with fair reliability that a company is about to fail. There is also SAS No. 99, an auditing standard with well-developed techniques for detecting financial fraud.
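As an aside for readers who want to see what such a financial telltale looks like in practice, here is a minimal sketch of the classic 1968 Altman Z-score for public manufacturing firms, with the conventional interpretation bands; the function names and the example figures are my own illustration, not anything from Mr. Chiles' talk.

```python
def altman_z_score(working_capital, retained_earnings, ebit,
                   market_equity, total_liabilities, sales, total_assets):
    """Classic (1968) Altman Z-score for publicly traded manufacturers."""
    x1 = working_capital / total_assets      # liquidity
    x2 = retained_earnings / total_assets    # accumulated profitability
    x3 = ebit / total_assets                 # operating efficiency
    x4 = market_equity / total_liabilities   # market leverage
    x5 = sales / total_assets                # asset turnover
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Conventional interpretation bands for the original model."""
    if z > 2.99:
        return "safe"
    if z < 1.81:
        return "distress"
    return "grey"

# Hypothetical balance-sheet figures (all in the same currency units).
z = altman_z_score(working_capital=20, retained_earnings=30, ebit=10,
                   market_equity=60, total_liabilities=40, sales=120,
                   total_assets=100)
print(round(z, 2), zone(z))  # prints: 3.09 safe
```

The point of the score, like the staybolt's weep hole, is that it is a cheap, continuously available signal that can be checked long before the failure itself.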
I will conclude with a hopeful note. When I visited the National Board, I had a chance to visit with Jim McGimpsey and BULLETIN editor Wendy Witherow. Jim McGimpsey, who served in Rickover's nuclear Navy, talked about how one time the admiral crawled underneath an air conditioning unit to make sure somebody had dusted behind it. How many admirals would crawl into a spidery, cobwebby area? He came out and said, “There are too many spiders and cobwebs here, too much dust.”
Why? Because he wanted people to know that only excellence, only the best, only perfection would do. He asked future President Jimmy Carter why he wasn't the best in his Naval Academy class. He came up with unique ways of – tormenting, you could call it – interviewing people to find out who had the resiliency to hold up under extreme pressure. And here is the bottom line from Rickover; the nuclear industry can learn from it, and many industries could learn from it. Rickover feared that if we have a single meltdown, all the ports in the world will be closed to us. We cannot have a single reactor meltdown. That was his philosophy: we have got to keep this from ever happening. And he did. After six thousand reactor-years, there has not been a meltdown in the nuclear navy.
Let me finish with this thought. If there are legislators or county boards wondering why not cut back on regulation, the question should be asked of them: Do you want to go to the funeral of those people? Because you have to be prepared to go to those funerals and be identified as somebody who cut back. That's the test. Thank you very much. Wild locomotives couldn't have kept me away from this opportunity.