This Month's Theme is Common FMEA Mistakes
Next month's theme will be using FMEA quality objectives to audit FMEAs
Every month in FMEA Corner, join Carl Carlson, a noted expert in the field of FMEAs and facilitation, as he addresses a different FMEA theme (based on his book Effective FMEAs) and also answers your questions.
Questions and answers are a great way to learn about FMEAs, for both experienced and less experienced FMEA practitioners. Please feel free to ask any question about any aspect of FMEAs. Send your questions to Carl.Carlson@EffectiveFMEAs.com, and your contact information will be kept confidential. All questions will be answered, even if they are not featured in the FMEA Corner.
mis·take [mi·stāk, noun]
The Oxford English Dictionary defines "mistake" as "an action or an opinion that is not correct, or that produces a result that you did not want."
What are the most common FMEA mistakes?
Good judgment comes from experience and experience comes from poor judgment. – Will Rogers
Much can be learned by observing the mistakes companies have made in doing FMEAs. Based on experience with more than 2,000 FMEAs and work with many companies in a wide variety of applications, the following are common FMEA mistakes.
Mistake No. 1: Design/Process Improvements
A review of FMEA applications across industries shows some FMEAs drive ineffective actions or no action at all. Some design FMEAs drive mostly testing, while some process FMEAs drive mostly controls. Failure of the FMEA to drive product or process improvements is mistake No. 1.
Mistake No. 2: High-Risk Failure Modes
Although organizations define risk using different criteria, failure to address all high-risk failure modes can result in potentially catastrophic problems or lower customer satisfaction. Failure of the FMEA to address all high-risk failure modes is mistake No. 2.
Mistake No. 3: Design Verification or Process Control Plans
Some organizations miss the opportunity to improve their design verification plan (DVP) or process control plan (PCP) based on the failure modes or causes from the FMEA. The result is inadequate product testing or PCPs. Failure of the FMEA to improve test and control plans is mistake No. 3.
Mistake No. 4: Interfaces
Empirical data show at least 50% of field problems can occur at interfaces between parts and subsystems or between the system and environment. Similarly, many manufacturing or assembly problems occur at the interface between operations or beyond operations, such as while transporting materials, receiving incoming parts or shipping. Some practitioners miss these interfaces. Not including interfaces in design or process FMEAs is mistake No. 4.
Mistake No. 5: Lessons Learned
Some organizations do not provide links between FMEAs and field data (in design FMEAs) or manufacturing data (in process FMEAs). It takes concerted effort to integrate problem resolution databases with the FMEA. A lack of integration can cause serious problems to be repeated. Disconnect between the FMEA and information from the field or plant is mistake No. 5.
Mistake No. 6: Level of Detail
Some FMEAs are too detailed in their analysis, which makes it difficult to focus on areas of higher risk. Some FMEAs aren't detailed enough, which makes it difficult to determine the root cause and effective corrective actions. Having the wrong level of detail in the analysis is mistake No. 6.
Mistake No. 7: Timing
Many organizations conduct FMEAs late, and this reduces their effectiveness. FMEAs should be completed according to design or process freeze dates in line with the product development process. Performing FMEAs late is mistake No. 7.
Mistake No. 8: Team
Some FMEA teams do not have the right experts on their core teams. Some team members do not show up at all, and others attend but do not contribute, so the team never develops synergy. Inadequate team composition and participation is mistake No. 8.
Mistake No. 9: Documentation
There are hundreds of ways to do FMEAs wrong. Some organizations do not encourage or control proper FMEA methods; others copy old FMEAs and do not adequately address changes, such as new technology or new applications. Training, coaching and reviews are necessary for success. Use of improper FMEA procedures is mistake No. 9.
Mistake No. 10: Time Use
Some organizations mandate FMEAs, but that doesn't ensure the time spent on them is productive. Pre-work must be completed, meetings must be productive and high-risk issues must be resolved. Ask the FMEA team whether their time was well spent, and take action to address shortcomings. Inefficient use of time is mistake No. 10.
What are the FMEA quality objectives?
Each of the common FMEA mistakes can be converted into corresponding quality objectives. For example, the first mistake is "failure to drive product or process improvements." By rewording this mistake as a quality objective, it becomes "The FMEA drives design improvements (design FMEA) or manufacturing or assembly process improvements (process FMEA) as the primary objective."
Given below are the ten FMEA quality objectives that correspond to the ten FMEA mistakes.
- The FMEA drives design improvements (design FMEA) or manufacturing or assembly process improvements (process FMEA) as the primary objective.
- The FMEA addresses all high-risk failure modes, with effective and executable action plans.
- The DVP considers the failure modes from the design FMEA. The PCP considers the failure modes from the process FMEA.
- The scope of the design FMEA includes interface failure modes in both the FMEA block diagram and the analysis. The scope of the process FMEA includes inter-operation failure modes, such as transfer devices, incoming parts and shipping, in both the process flow diagram and the analysis.
- The FMEA considers all major lessons learned (from in-service warranties, customer service databases, recall campaigns, prior manufacturing or assembly problems and others) as inputs to failure mode identification.
- The FMEA provides the correct level of detail to get to root causes and effective actions.
- The FMEA is completed during the window of opportunity in which it can most effectively influence the product design or manufacturing process.
- The right people, adequately trained in the procedure, participate on the FMEA team throughout the analysis.
- The FMEA document is completely filled out "by the book," including actions taken and the final risk assessment.
- Time spent by members of the FMEA team is an effective and efficient use of time with a value-added result.
How can FMEA quality objectives be used to improve FMEAs?
The FMEA quality objectives should be integrated into FMEA team training and reviewed at each stage of FMEA project completion. FMEAs should not be considered complete until all of the quality objectives have been met. They are an essential part of quality audits.
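For teams that track FMEA status electronically, one way to make this review concrete is a simple checklist that records whether each of the ten quality objectives has been met. The following Python sketch is a hypothetical illustration only; the list wording is condensed from the objectives above, and the function and variable names are assumptions, not part of any FMEA software.

```python
# Minimal sketch of an FMEA quality-objective checklist (hypothetical helper,
# not part of any FMEA tool). Each objective is marked met or not met, and the
# FMEA is reported complete only when all ten objectives are met.

FMEA_QUALITY_OBJECTIVES = [
    "Drives design or process improvements as the primary objective",
    "Addresses all high-risk failure modes with effective, executable action plans",
    "Failure modes feed the design verification plan (DVP) or process control plan (PCP)",
    "Includes interface / inter-operation failure modes in the diagrams and analysis",
    "Considers major lessons learned as inputs to failure mode identification",
    "Uses the correct level of detail to reach root causes and effective actions",
    "Is completed within the window of opportunity to influence the design or process",
    "Has the right, adequately trained people participating throughout",
    "Is completely documented, including actions taken and final risk assessment",
    "Uses team time effectively and efficiently, with a value-added result",
]


def audit_fmea(status):
    """Print the status of each quality objective and an overall verdict."""
    for objective in FMEA_QUALITY_OBJECTIVES:
        met = status.get(objective, False)
        print(f"[{'x' if met else ' '}] {objective}")
    if all(status.get(obj, False) for obj in FMEA_QUALITY_OBJECTIVES):
        print("All quality objectives met: the FMEA can be considered complete.")
    else:
        print("Open objectives remain: the FMEA should not be considered complete.")


if __name__ == "__main__":
    # Example review at the end of a team meeting: only the first three
    # objectives have been met so far.
    progress = {obj: i < 3 for i, obj in enumerate(FMEA_QUALITY_OBJECTIVES)}
    audit_fmea(progress)
```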
The theme of next month’s FMEA Corner will be "Using FMEA quality objectives to audit FMEAs." In that article, we'll show how to record the progress in meeting the quality objectives in your FMEA, using the "Analysis Plan" feature in Xfmea.
FMEA Tip of the Month
Each of the ten FMEA quality objectives has a corresponding "how to audit" recommendation. FMEA practitioners can review the status of FMEAs at the end of team meetings to ensure the quality objectives are being met. The "how to audit" recommendations for each quality objective can be found in chapter 9 of Effective FMEAs, or by reading the July 2012 Hot Topics article.
Something I've always wanted to know about FMEAs
The important thing is not to stop questioning. - Albert Einstein
A HotWire reader submitted the following question to Carl Carlson. To submit your own question about any aspect of FMEA theory or application, e-mail Carl at Carl.Carlson@EffectiveFMEAs.com.
A question about occurrence and detection: How are they linked, or not linked?
To detect a cause or failure mode, a test was performed that had a high quality of detection. Based on the test results, the engineers claim that the occurrence must be much lower than initially ranked. However, no proper prevention measures were taken.
I allowed the lower ranking because there was evidence that the occurrence must be low.
I have faced this a couple of times when ranking occurrence. The low occurrence ranking is based on a prevention measure taken, along with test evidence that the occurrence must be low. Of course, the test quality (which I see as the detection ranking) must be good enough, say 3 or less. What are your thoughts?
Carl: Regarding linkage of occurrence and detection, I'll begin with excerpts from SAE J1739 (2009).
"The occurrence ranking number has a relative meaning rather than an absolute value and is determined without regard to severity or detection."
"Detection is a relative ranking, within the scope of the FMEA, and is determined without regard to severity or occurrence."
The essence of my answer to your question about occurrence-detection linkage has to do with the timing of FMEAs, and the process of updating an FMEA. Ideally, Design FMEAs are completed before testing begins. The reason for this has to do with the primary objective of the FMEA, which is to improve the product design. Another objective of a Design FMEA is to identify test deficiencies and recommend modifications to the tests before testing commences. The earlier the design is modified, the lower the cost and timing impact of product development. And, of course, test improvements should be made before testing begins.
Having said that, FMEAs are often updated with subsequent test and field data. In the case of test or field information that indicates a different value for the occurrence ranking, the occurrence value can be updated. The same is true for detection risk. Modified tests can provide an update to the detection-type controls and a corresponding update to the detection ranking. However, before updating, I would suggest making a record of the original FMEA to document the due care in product development, such as by using the "restore point" feature available in Xfmea.
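As a minimal sketch of the record-before-update practice described above, the original occurrence and detection rankings can be preserved alongside the revised values and the supporting evidence. The data structure and field names below are assumptions made for this illustration in Python; they are not the Xfmea restore-point feature itself.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record of an FMEA ranking revision: the original occurrence and
# detection rankings are preserved before they are updated with test or field
# evidence, documenting due care during product development.


@dataclass
class RankingRevision:
    cause: str
    original_occurrence: int
    original_detection: int
    revised_occurrence: int
    revised_detection: int
    evidence: str                          # test report, field data reference, etc.
    revised_on: date = field(default_factory=date.today)


# Example: test results justify lowering the occurrence ranking; the original
# values remain on record rather than being overwritten.
revision = RankingRevision(
    cause="Seal material degrades at sustained high temperature",
    original_occurrence=6,
    original_detection=3,
    revised_occurrence=3,
    revised_detection=3,
    evidence="Durability test report DT-1234 (hypothetical reference)",
)
print(revision)
```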
About the Author
Carl S. Carlson is a consultant and instructor in the areas of FMEA, reliability program planning and other reliability engineering disciplines. He has 35 years of experience in reliability testing, engineering and management positions, and is currently supporting clients from a wide variety of industries, including clients of HBM Prenscia. Previously, he worked at General Motors, most recently as senior manager for the Advanced Reliability Group. His responsibilities included FMEAs for North American operations, developing and implementing advanced reliability methods and managing teams of reliability engineers. Before General Motors, he worked as a Research and Development Engineer for Litton Systems, Inertial Navigation Division. Mr. Carlson co-chaired the cross-industry team that developed the commercial FMEA standard (SAE J1739, 2002 version), participated in the development of the SAE JA 1000/1 Reliability Program Standard Implementation Guide, served for five years as Vice Chair for the SAE's G-11 Reliability Division and was a four-year member of the Reliability and Maintainability Symposium (RAMS) Advisory Board. He holds a B.S. in Mechanical Engineering from the University of Michigan and completed the two-course Reliability Engineering sequence from the University of Maryland's Master's in Reliability Engineering program. He is a Senior Member of ASQ and a Certified Reliability Engineer.
Selected material for FMEA Corner articles is excerpted from the book Effective FMEAs, published by John Wiley & Sons, ©2012.
Information about the book Effective FMEAs, along with useful FMEA aids, links and checklists can be found on www.effectivefmeas.com.
Carl Carlson can be reached at carl.carlson@effectivefmeas.com.