Learning from Firestone

We've all heard about the problems that Firestone has had. (This was written in 2000. One of the big stories of the time was that some Firestone tires were catastrophically failing and Firestone knew there was the potential of it happening.) Apparently, they ran quality tests on some tires that indicated the tires weren't quite up to specifications, but they decided to sell them anyway. Those tires have been linked with fatal accidents, and now Firestone is in BIG trouble.

I'm far from the inner circle of Firestone and, therefore, don't know exactly what happened. In fact, I'm so far from the inner circle that I've never been in a Firestone plant; however, from what I've seen, I can certainly theorize about what might have led to their problems.

While the press doesn't seem to understand why Firestone would ignore those quality test results, I can. I may not agree with their decision, but I can understand the possible thinking behind it. I can understand because at times I hear the same rationales used in foundries to justify shipping when test results come back less than desired.

There are many rationales that lead to ignoring failed test results. The most obvious of these is the inexactness of the test in question. There isn't a quality test in existence that hasn't been messed up at some time by the people performing it. The frequency of the problems, of course, varies with the test in question, but incorrect results do happen. It becomes a very easy decision to simply rerun a failed test to see whether the first one was run properly. If the retest passes, the world is beautiful! If it doesn't, maybe they screwed up running the test again.

There's a fallacy in the logic that allows us to do this. What gives us the right to assume that the correct test is the one that passes? It's humorous that in most operations the only time there is a retest is when a specification isn't met. The results may be unrealistically good, but if the test passes, everything's okay.

Before anyone thinks that I'm not aware of what is allowed, I will acknowledge that many specifications allow retesting after failures. (The specifications don't address unrealistic results when those results happen to pass.)

While the specifications do, at times, support the rationale for retesting, some of the other rationales applied to failed tests are unsupported except in the mind of the person using them. One rationale that is particularly offensive to me is "They really don't need it that good anyway." The egotism of a producer who believes he knows better than the customer what the customer wants or needs is beyond my belief.

I can see Firestone personnel saying, "This test is too rigorous; nobody would drive a car like that; therefore, the tires will be fine as they are." Is that what happened at Firestone? I don't know, but I can picture it happening. I can picture it because I've seen foundrymen do the same thing. The test results indicate the metal doesn't meet specification, but the decision to ship is made because the customer doesn't really need it that strong. I can't imagine a foundryman being willing to accept a molding machine that doesn't have all the features he ordered because the manufacturer didn't think he needed them. Yet, a few foundrymen are willing to do that to their customers. Unfortunately, it usually works: the castings are used for years without the shortfall ever being detected by the customer. However, just as with Firestone, when it doesn't work, the costs are very high.

There's also the possibility the customer may have contributed to the problem. The customer's buyer is on the hot seat because his production department needs parts. The supplier tells the buyer about the problem with the test, and the buyer responds by relating how badly they need the parts. He may just tell them to forget about the test results and ship the parts. It's far more likely, though, that he will ask whether everyone is sure the test was run correctly. Then he might talk about his loss of faith in the supplier's ability to produce and insinuate that the next job may go to a competitor that can produce on time. The supplier then decides to ship.

That may not be what happened with Firestone, but I've seen it happen. Obviously, if the customer wants a foundry to ship castings that don't meet specifications, the foundry should ship. After all, the customer is the customer and knows what is needed. They should be shipped IF the customer is willing to put it in writing. If it isn't in writing and something goes bad, the foundry will have to pay the penalty.

Of course, the biggest factor in the Firestone fiasco may have been ego. The management may simply have been unwilling to admit they couldn't produce to the specifications. Again, I emphasize that I don't know exactly what happened, but I've seen it happen in foundries. Most people hate to admit they are wrong. "I know these are good, no matter what the test says. Ship them."

So the next time you're considering shipping castings that don't meet specifications, think about how Firestone handled the situation and the results they obtained.

Who said a small foundry can't learn anything from larger operations?
