The 83rd General Meeting Feature Presentation
James R. Chiles
 
“Secrets of the Code”
 
The following presentation was delivered at the 83rd General Meeting Monday morning session, May 12, by James R. Chiles. It has been edited for content and phrasing.
 
Mr. Chiles has written extensively about technology and history since 1979. What distinguishes him from other technology and history writers is his approach to research. His access to tragedy sites has provided readers with profound insight into the internal dynamics of disaster. Mr. Chiles is the author of Inviting Disaster: Lessons from the Edge of Technology. He also authored The God Machine: From Boomerangs to Blackhawks, The Story of the Helicopter. He is a regular contributor to the National Board BULLETIN, and his work has appeared in Smithsonian, Air and Space, and Aviation Week. Additionally, Mr. Chiles has served as commentator for a television series based on his book Inviting Disaster. He has also appeared on History Channel programs such as Titanic at 100; Katrina, An American Catastrophe; Engineering Disasters; Life After People; Wild West Tech; and Mega Disasters. He also appeared on the National Geographic series Seconds from Disaster.
 
Mr. Chiles' slide presentation can be accessed here.
 
 MR. CHILES:
 
I’m going to talk a little bit about history. I'm a color commentator, you might say, on the technology and history side, and some on the people side. I like to wait 10 or 20 years after an incident to talk to people who were part of something critical. Brian Mehler was the man who saved Three Mile Island when they were 30 minutes from disaster back in 1979. I talked to him 20 years after that incident. People have time to reflect. In many ways the stories are timeless and the patterns repeat. 
 
Chances are the people behind 85 percent of accidents are nothing at all like villains. They are nice people who were heedless. Tom Sawina was a nice person; he was the driver in a crash at the Holidazzle Parade in Minneapolis in 1998. Really a fine fellow, but he was at the wheel when the van had a sudden-acceleration incident and killed two people, pinning one girl against a window. He had to live with that, and still does. That's a burden nobody wants to carry. And the better person you are, the heavier the burden. You wake up every day and think about that person you could have saved; you could have done something, just that little extra bit, but it didn't happen.
 
Revere Wells was a caretaker. Caretakers are some of the most important people because they see things; they go high, they go low, they go way down deep, just like you folks go into boilers. Revere Wells was the caretaker at the Baldwin Hills Reservoir in 1963, and he was listening for a very quiet little thing that he knew would be a problem, and that was a leak. Because it was an earth dam, he knew to act immediately. He called the alarm and the police, and three hours later the dam failed. Only five people died because he was alert. He knew his world; he knew what the risk factors were. We can hope there will be more people like Revere Wells who aren't just filling a job slot. They know their world, they know the risks, and they know the little bitty signs of when something terrible is going wrong.
 
Because “secrets” is my theme today, I want to talk about the code world. I had the wonderful opportunity to talk to the family and coworkers of the world's premier code breaker, William Friedman, for an article. He was one of the key code breakers in World War II. I was reading about him, became interested in the code world, and discovered some neat terms, one of them being the term busts.
 
A bust is an opening into that secret world and is often very, very subtle. It's a little bitty thing that opens up a great big secret. One came from the German cruiser Magdeburg, which was lost in August of 1914 and opened up all kinds of codes to the Allies, and in fact, one of those codes was used quite close to the ASME headquarters in 1914 through 1917, just down 7th Avenue in New York City.
 
It's a rather dramatic world. And what saves a nonfiction writer like me from being in the most boring category is the material I am involved in, which I find very fascinating. There are constant surprises and I search out the story that's never been told.
 
Another famous bust came from Alexander Butterfield. He changed history during the Watergate hearings when he said, “Oh, by the way, did you know that President Nixon has made a tape of all the conversations in the White House?” That statement changed history. It forced Nixon out of office; a little bitty thing that opened up a much bigger secret world.
 
Let’s talk about mistakes and failures. There are so many lessons to be learned from failures, and I'm continually amazed that there are hundreds more success books than there are failure books. To me, the most interesting stories are the failures, particularly from people who learn from them, who pass the message on, who fess up. One of the most dramatic presentations I have heard was from the engineer who was responsible for the Hyatt Regency walkway collapse in Kansas City. Part of the way he paid his debt was to go around and talk about how his decisions as a structural engineer played into that terrible event, something he carries with him all the time.
 
Yogi Berra, one of my favorite guys, said, “We made too many wrong mistakes.” He was talking about the 1960 World Series, when the Yankees lost to the Pirates. So my question is: can there be right mistakes? The answer is yes. To my mind, a right mistake is one that stops short of causing huge damage, but it creates a great opportunity to learn. Structural engineers will be familiar with the letters SMRF – steel moment-resisting frame – which proved to be quite a problem after the Northridge earthquake in 1994. But we learned from it. There was a major redesign after 1994 due to that earthquake, and hopefully when the big one hits L.A., the steel frames will be a lot stronger than they were in Northridge. That's a right mistake, meaning we learn from it in time.
 
Unfortunately, there are a lot of wrong mistakes and patterns. I see a whole lot of cutting costs and corners. Since my book came out in 2001, there have been all too many examples of this. West Fertilizer comes to mind, as does the term grandfathering, which is a loophole. West Fertilizer had a nice big loophole. Twenty-five hundred tons of ammonium nitrate didn't require any particular fire precautions at all. They could use wooden bins; they didn't have to have a sprinkler; they pretty much didn't have to do anything. The county was prohibited, or at least they thought, from having a fire code. Well, that grandfathering killed 15 people in West, Texas, last April.
 
I give the example of Comair Flight 5191. In a CNN documentary, Sole Survivor, the co-pilot, the only person to survive the Comair crash in Lexington, Kentucky, was interviewed. That's quite a heavy burden for him to carry. He and the pilot were not bad people. They were just not taking their jobs seriously enough: talking about idle things in the cockpit rather than paying attention. They tried to take off on the wrong runway, one far too short, and it was sheer recklessness.
 
Another example of recklessness is the Costa Concordia accident. The court in Grosseto has not rendered its finding, but if I ever had to think of a bored operator who took chances to amuse himself, it would be Francesco Schettino.
 
Think of all the operators who are texting while driving a train. Clearly, it's easy to blame those people; but what I ask is, who hired them? Who supervised them? Who was supposed to be paying attention? These behaviors do not come out of the blue, and you will see them beforehand. In my book Inviting Disaster, I talk about locomotive engineers and boiler operators in the 1850s who were known as very reckless people. They were reckless long before the things blew up. So I would like to know who hired and supervised them, and why didn't they act years before these events happened? What was wrong with the system?
 
Let’s talk about new mistakes. The old mistakes repeat themselves, but there are some new ones coming along. Some of you may be in the computer world, and this is just a little heads-up. One new mistake is mode confusion, where people don't realize that the machine is in a different mode. That happened a lot with fly-by-wire airliners when they first came out. Do you remember the Seastreak ferry accident in New York Harbor that occurred in 2013? The captain was not a bad guy, but he had switched the vessel to backup mode to correct a problem, and forgot to change it back to normal operating mode as the ferry approached the dock. Eighty people were injured, four of them critically. Mode confusion is not unusual in our new world of technology.
 
Let me take you back to 1911 through 1914 so that you understand the world the American Society of Mechanical Engineers (ASME) lived in. New York was a fascinating, violent, and dangerous place in the 1910s. The war was coming on as a code was being drafted and then put into action in 1915. The Germans were busy at work stealing secrets from the British; the British were stealing secrets from the Germans. It was very much like Casablanca. There was a train that went down 11th Avenue in Manhattan, right along the Hudson River, that basically ran over anybody in the way. They had cowboys moving cows out of the way because there were slaughterhouses. It was a fascinating, dangerous time, very much like the Wild West. I call it the Wild East, and this is the world such a noble thing came out of.
 
In 1907, Massachusetts got the codes and standards ball rolling, but it is interesting that the ASME committee of 1911 through 1914 realized it was not going to be as easy as they thought. A fellow named John Clinton Parker fought a one-man war against ASME for close to 30 years. This guy was determined. Parker had invented a down-flow boiler, and starting in 1914, he wrote angry letters to ASME challenging them not to do the Boiler Code, because he had a company called Parker Boiler in Philadelphia and he didn't like the code. He felt it was going to shut out his style of boiler. So he fought a constant rear-guard action from 1914 on. He even put false stamps on his boilers.
 
One of the reasons he was so determined, and kept fighting all the way through the 1930s until his death in 1946, was a publication called Lefax, which he felt competed with the engineering edicts from ASME. He basically never gave up. The ASME people were put through the wringer by John Clinton Parker. He was a rebel, he was obnoxious, he almost shut down ASME, he sued them, and they probably would have throttled him if they had gotten a chance to get their arms around his neck. But he did point out certain problems with ASME governance, and he had also received a Cresson Medal from the Franklin Institute in 1904, so he did know boilers pretty well. He was a rebel with a certain truth to speak, but it did him no favors that the truth was drowned out by angry words. Still, there was some truth there, and in fact there were changes in ASME governance due to all the problems that John Clinton Parker made for ASME.
 
 
When it comes to standards and codes, I strongly recommend you listen to the user groups even if they are modern-day John Clinton Parkers, even if they are a bit annoying, because they will tell you what is happening on the fringe, and it may or may not be important. But keep in mind what's happening on the fringe, because some of that will eventually move to the center. Make those mistakes right mistakes and not the wrong ones that you will regret the rest of your life.
 
I would also recommend to lawmakers to be very careful about variances and loopholes. Grandfathering means that a variance or an exception granted in the past carries forward into the future, and it can be very hard to remove later. Treat those very carefully. Don't do it very often. And I would recommend that anybody with a variance going into the future – and some of you may know recent cases – be subject to higher scrutiny, not lower scrutiny; perform extra inspections on a more frequent schedule. Then there won't be as much of an incentive to go for that variance, and you may learn a lot.
 
Your work as inspectors and engineers and keepers of codes and standards requires that you think ahead, process ambiguous information, and make tough decisions. It's not an easy job. You have to make judgment calls that sometimes are pretty tough to make.
 
I thank you. I keep a blog, Disaster-wise, and I will post there the next time I'm on television. I always welcome your suggestions about good topics to look into and emerging issues.