About this lesson
This final lesson reviews the key principles that must be followed when conducting a DOE study. It highlights the benefit of each and the dangers if the principle is not properly applied.
Exercise files
Download this lesson’s related exercise files.
DOE Keys to Success.docx (71.3 KB)
DOE Keys to Success - Solution.docx (62.6 KB)
Quick reference
DOE Keys to Success
These keys to success are based upon some of the most commonly observed pitfalls and problems that arise when teams use DOE studies. All have been addressed in this course, but their importance is reinforced in this lesson.
When to use
Most of these keys are tied to the upfront planning of the DOE project and should therefore be reviewed at the start of each DOE study.
Instructions
When teams or stakeholders determine that they need a DOE, they often rush into the study without carefully considering the items mentioned in these keys. Then when they get to the end of the study, they realize that the results are not clear or that key elements which could have made the study much more powerful were missed. Awareness of these keys and applying them to your study will bias it for success.
Clear Objective
This one should be obvious, but often the idea of a DOE is hatched in a team meeting or hallway conversation, and those involved walk away with different ideas of what the DOE will do and the results it will provide. As we have seen, there are many different types of DOEs. If one person has only experienced full factorial and another has only experienced Taguchi, they will be starting from very different paradigms. This is especially common among stakeholders: because their backgrounds differ, their experience with DOE is often quite different. Clarify with the stakeholders and team members what specific question is to be answered, then select the study approach that is best suited to that objective.
Quantitative Response
The DOE needs a quantitative response factor in order to do the statistical analysis, but not just any quantitative factor. It should be tightly correlated to the study objective and it should be one for which a measurement system exists that can accurately measure minor changes in the response factor during or after each test run. An inadequate response factor, or one that has unreliable data, results in statistics that no one trusts or cares about. This wastes all the effort that was put into the DOE.
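One quick way to judge whether a measurement system can resolve minor changes in the response factor is a precision-to-tolerance check. The sketch below uses made-up numbers and a hypothetical helper function (not anything from the course files); the rule-of-thumb thresholds are commonly cited guidelines, not course material:

```python
# Simplified precision-to-tolerance (P/T) check with made-up numbers.
# Rule of thumb often cited: P/T below ~0.10 is good; above ~0.30, the
# measurement system is not capable of supporting the study.
def precision_to_tolerance(measurement_std_dev, lower_spec, upper_spec):
    # Six standard deviations approximate the full spread of measurement error
    measurement_spread = 6 * measurement_std_dev
    tolerance_span = upper_spec - lower_spec
    return measurement_spread / tolerance_span

# Hypothetical example: tolerance span of 0.10 mm on the response factor,
# but the gauge's measurement error has a standard deviation of 0.05 mm
ratio = precision_to_tolerance(0.05, 0.0, 0.10)
print(round(ratio, 2))  # 3.0 -- measurement spread is 3x the tolerance span
```

A ratio this large means the gauge noise swamps the signal, so any response data collected with it would be untrustworthy, exactly the situation described in this lesson's measurement-system anecdote.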
Study Design Elements
There are many different types of DOEs, and even within a DOE there are many different features that can be used, such as two-level factors versus multi-level factors, replicates, center points, randomization, and blocking. These can improve the statistical confidence in the result, but may require more test runs and complicate some aspects of the study. Based upon the study objective and the cost and budget constraints, each of these items needs to be considered in order to create a DOE study plan that will provide the most useful information given the real-world business constraints.
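As a sketch of how a few of these elements fit together, the snippet below (hypothetical factor names, plain standard-library Python rather than a dedicated DOE package such as Minitab) builds a two-level, three-factor full factorial with replicates and a randomized run order:

```python
import itertools
import random

# Hypothetical two-level factors, coded -1 (low) / +1 (high)
factors = ["Temp", "Pressure", "Speed"]
levels = [-1, +1]
replicates = 2

# Full factorial: every combination of factor levels (2^3 = 8 combinations)
base_runs = list(itertools.product(levels, repeat=len(factors)))

# Replicates repeat the whole design to reduce the impact of statistical noise
runs = base_runs * replicates

# Randomizing the run order keeps process drift from being confounded
# with any one factor
random.seed(42)  # fixed seed only so the example is repeatable
random.shuffle(runs)

for i, run in enumerate(runs, start=1):
    print(f"Run {i}: {dict(zip(factors, run))}")

print(len(runs))  # 16 runs = 8 combinations x 2 replicates
```

Adding a replicate doubles the run count, which is exactly the cost-versus-confidence trade-off the paragraph above describes.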
Iterate to Success
Many DOE studies are conducted in phases. In fact, this is an inherent attribute of the typical fractional factorial DOE. A phased approach is used in most organizations today for development work, and a characteristic of those phases is that the information from one phase is used by the next to refine its focus and area of activity. That is exactly what we do in a typical DOE study: we start with many factors and then narrow them down to a vital few that are studied in depth. Given the pace of business and resource constraints, this is a very appropriate methodology to follow. It also makes sense from a statistical perspective. Sometimes an outlier in the data may impact the results of one phase of the study, but it does not need to impact the entire study if the work is done in phases.
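The narrowing between phases rests on simple arithmetic: a factor's main effect is the difference between the average response at its high setting and the average at its low setting, and its coefficient in the design space equation is half of that effect. A minimal sketch with made-up data for a single coded factor:

```python
# Made-up runs for one factor: (coded setting, measured response)
runs = [(-1, 12.1), (-1, 11.8), (-1, 12.4), (-1, 12.0),
        (+1, 15.9), (+1, 16.2), (+1, 15.7), (+1, 16.1)]

low_responses = [y for x, y in runs if x == -1]
high_responses = [y for x, y in runs if x == +1]

low_avg = sum(low_responses) / len(low_responses)    # average at -1
high_avg = sum(high_responses) / len(high_responses)  # average at +1

# Main effect = difference of the two averages; the coefficient in the
# design space equation is half of the effect
effect = high_avg - low_avg
coefficient = effect / 2

print(round(low_avg, 3), round(high_avg, 3))  # effect is about 3.9
```

In a screening phase this calculation is repeated for every factor, and only the factors with large effects are carried forward into the refining phase.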
Hints & tips
- Review the keys at the beginning of each DOE to be sure you are applying them.
- Do a "lessons learned" at the end of your study and determine if there are additional keys to success that you want to add for your organization.
- 00:04 Hi, I'm Ray Sheen, and it's now time to wrap up this course.
- 00:08 Let me leave you with a few reminders of the keys to successful Design
- 00:13 of Experiments study.
- 00:15 We started the course with the emphasis on the study objective,
- 00:18 let's revisit that now in light of what you've learned.
- 00:22 Understanding the objective will directly lead to a decision about the study design
- 00:27 and factor selection.
- 00:28 If you want to screen out minor factors, you want a study design that includes many
- 00:33 factors so that you will be able to identify which ones are significant.
- 00:37 But if you're trying to optimize performance,
- 00:39 the objective points to what the response variable must be, and
- 00:43 the nature of the optimization will often dictate what factors to include.
- 00:46 While if instead the goal is root cause identification,
- 00:50 different response factors may be chosen, along with different control variables,
- 00:54 or at least different high and low levels on the variables. In fact,
- 00:59 the objective tells us what type of DOE is needed.
- 01:02 If the objective is performance optimization,
- 01:05 then a fractional factorial is used.
- 01:07 If the objective is to get a complete design space equation,
- 01:10 then a full factorial.
- 01:11 If the objective is robust manufacturing process control
- 01:14 by the operator, then it's Taguchi.
- 01:17 And if the objective is to sift through dozens of potential factors to focus on
- 01:21 the vital few, then use a Plackett-Burman DOE.
- 01:24 Unclear objectives lead to missed expectations on the part
- 01:27 of the stakeholders.
- 01:29 They're expecting one type of answer, and you bring them a different type of answer.
- 01:33 They may be expecting a fast and quick Plackett-Burman or Taguchi study, and
- 01:37 you're bringing them a cost and budget for a full factorial study.
- 01:41 A clear objective avoids this type of confusion. The next point is something we
- 01:45 stated early on, and that's the need for a quantitative response variable.
- 01:50 For some reason, this seems to be a difficult element for stakeholders.
- 01:53 When asked what's the most important attribute to measure, they tell us,
- 01:57 well just make it work right.
- 01:59 I hope you can see by now that we need a measurable
- 02:01 value in order to calculate the design space equation,
- 02:05 and all of our graphs and charts are based upon that equation.
- 02:09 A pass-fail objective can’t be analyzed unless they can give
- 02:13 us a specific pass-fail threshold in a quantitative factor.
- 02:17 Of course,
- 02:18 the response factor should directly correlate with the study objective.
- 02:21 So that the design space equation essentially is an equation for
- 02:25 the study objective.
- 02:27 So a study for
- 02:28 reliability will be measuring the failure rate as the response variable.
- 02:32 Another challenge with obtaining accurate variable response factor data
- 02:36 is the measuring system. Don't assume it's capable of giving you reliable data.
- 02:41 I had one client who had spent months conducting analysis about an optical
- 02:45 alignment problem with one of their products,
- 02:47 including conducting a DOE to identify the most significant factors.
- 02:52 With inconsistent results and no significant improvement,
- 02:55 I finally convinced them to do a measurement system analysis
- 02:58 on the optical alignment measurement system.
- 03:00 They assured me up until then that everything was just fine.
- 03:03 Well it turned out the system error was nearly ten times larger than the allowable
- 03:08 span from the minimum to the maximum of their tolerance on the response factor.
- 03:12 All that data that they had been collecting was worthless, and
- 03:16 they had to invest in a new optical alignment measuring system
- 03:19 before they could proceed.
- 03:21 If you think back to the design space equation that was generated, it started by
- 03:25 taking an average of the output when the control factors were either plus 1 or
- 03:29 minus 1.
- 03:29 That average is far more meaningful when the response factor
- 03:34 is a variable output and not a zero or one pass-fail answer.
- 03:39 We talked about study design elements early in this course;
- 03:41 it's worthwhile now to revisit some of them.
- 03:44 When structuring your DOE, consider the different design elements;
- 03:48 they'll help with the accuracy and confidence that you have in your study and
- 03:52 the design space equation.
- 03:54 Replication reduces the impact of noise, and by that I mean,
- 03:58 statistical noise, not the Taguchi DOE noise.
- 04:01 If you know that there are many factors that you can't control,
- 04:04 you should add replicates to accommodate their likely impact.
- 04:08 Also more replicates will increase the confidence in your statistical analysis.
- 04:12 Remember, in those design space equations,
- 04:15 the factor effects are calculated using the averages of the runs at the high and low levels.
- 04:19 Well, once you get up to the point where you have at least 15 or 20 high and
- 04:23 lows of each factor, your statistical confidence in that average is quite high.
- 04:28 And you can add data points by adding replicates. Another concern could be that
- 04:32 the experimental test process changes somewhat as the operators conduct the tests.
- 04:37 They may not be consciously changing things,
- 04:39 but repetition creates habits which can influence how the tests are performed.
- 04:43 The use of center points and randomization of the test runs will both tend to
- 04:48 minimize the effect of drift in the process.
- 04:51 If you have a very complex test process, or one that takes a long time,
- 04:55 consider the use of center points.
- 04:58 And blocking is a way to accommodate real-world constraints on the test process.
- 05:03 If there's a logical reason why the testing must be done in different groups
- 05:08 like different locations or timing or
- 05:10 equipment, then use blocking; otherwise don't.
- 05:14 The last point to make about the study design goes back to
- 05:17 the resolution number and the confounding, or aliasing, of interaction effects.
- 05:22 Be sure that you understand the level of confounding and
- 05:25 that it fits within the study objective; this will often dictate the level of
- 05:30 fractionality that you're willing to accept.
- 05:33 My final key to success is one that has become a critical success factor in many
- 05:38 design and development processes over the past ten years: iterate to succeed.
- 05:43 The most common type of DOE is the fractional factorial, which normally runs
- 05:48 in three phases or iterates through screening, refining and optimizing.
- 05:53 One of the reasons it is so popular is that with each iteration the analysis
- 05:57 provides more clarity and accuracy to the design.
- 06:00 Each phase uses lessons learned from the previous phase to focus the
- 06:05 analysis of the next phase.
- 06:07 Another benefit of the iterative approach is that it is
- 06:11 a statistical analysis, and statistics can be skewed by outliers.
- 06:16 So the screening study just looks for
- 06:18 big effects which will probably be driven by many data points,
- 06:22 then only the truly significant factors are analyzed in the refining study,
- 06:27 which usually has more runs per factor, and
- 06:29 the impact of an outlier is therefore minimized.
- 06:32 Bottom line is that you can have confidence in the statistical data and
- 06:37 not be fooled by a lucky run.
- 06:39 And one other thing, the Minitab results in the session window will also provide
- 06:44 you a level of statistical confidence in your analysis.
- 06:47 Finally, there's the confirmation phase of the study,
- 06:51 that last check to make sure that your calculations did not have an error.
- 06:55 It's a great comfort to have that result come in at or
- 06:58 near the level of your predicted performance in the design space equation.
- 07:03 And when you go back to the stakeholders and say, here's the equation,
- 07:07 this is what's predicted, and here's the actual results,
- 07:11 you have made a strong case for a new design or improvement.
- 07:15 So those are the keys for DOE success: a clear objective,
- 07:19 a quantitative response variable, a robust study design that takes
- 07:24 advantage of the design elements, and iteration to success.