Design for Six Sigma

Design for Six Sigma (DFSS) is a business process management method related to traditional Six Sigma. It is used in many industries, including finance, marketing, basic engineering, process industries, waste management, and electronics. It is based on the use of statistical tools such as linear regression and enables empirical research similar to that performed in other fields, such as social science. While the tools and order used in Six Sigma require a process to be in place and functioning, DFSS has the objective of determining the needs of customers and the business, and driving those needs into the product solution created. DFSS is relevant for relatively simple items/systems. It is used for product or process design, in contrast with process improvement. Measurement is the most important part of most Six Sigma or DFSS tools, but whereas in Six Sigma measurements are made from an existing process, DFSS focuses on gaining a deep insight into customer needs and using these to inform every design decision and trade-off.

There are different options for the implementation of DFSS. Unlike Six Sigma, which is commonly driven via DMAIC (Define – Measure – Analyze – Improve – Control) projects, DFSS has spawned a number of stepwise processes, all in the style of the DMAIC procedure.

DMADV (define – measure – analyze – design – verify) is sometimes referred to synonymously as DFSS, although alternatives such as IDOV (identify, design, optimize, verify) are also used. The traditional DMAIC Six Sigma process, as usually practiced, is focused on evolutionary and continuous improvement of manufacturing or service process development, and usually occurs after initial system or product design and development have been largely completed. DMAIC Six Sigma as practiced is usually concerned with solving existing manufacturing or service process problems and with removing the defects and variation associated with them. Manufacturing variations clearly may impact product reliability, so a clear link should exist between reliability engineering and Six Sigma (quality). In contrast, DFSS (or DMADV and IDOV) strives to generate a new process where none existed, or where an existing process is deemed inadequate and in need of replacement. DFSS aims to create a process with the end in mind of optimally building the efficiencies of Six Sigma methodology into the process before implementation; traditional Six Sigma seeks continuous improvement after a process already exists.

DFSS seeks to avoid manufacturing/service process problems by using advanced techniques to prevent process problems at the outset (e.g., fire prevention). When combined, these methods capture the true needs of the customer and derive engineering system parameter requirements that increase product and service effectiveness in the eyes of the customer and other stakeholders. This yields products and services that provide great customer satisfaction and increased market share. These techniques also include tools and processes to predict, model and simulate the product delivery system (the processes/tools, personnel and organization, training, facilities, and logistics to produce the product/service). In this way, DFSS is closely related to operations research (e.g., solving the knapsack problem) and workflow balancing. DFSS is largely a design activity requiring tools including: quality function deployment (QFD), axiomatic design, TRIZ, Design for X, design of experiments (DOE), Taguchi methods, tolerance design, robustification and response surface methodology for single- or multiple-response optimization. While these tools are sometimes used in the classic DMAIC Six Sigma process, they are uniquely used by DFSS to analyze new and unprecedented products and processes. It is a concurrent analysis directed toward manufacturing optimization related to the design.
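For illustration only, the following sketch shows how one of the listed tools, a two-level full factorial design of experiments, might be analyzed in code. The factor names, levels and simulated response are hypothetical and not taken from any particular DFSS project.

```python
# Minimal sketch of a two-level full factorial design-of-experiments (DOE) analysis.
# The factors, their levels, and the simulated response are hypothetical illustrations.
import itertools
import numpy as np

factors = ["temperature", "pressure", "catalyst"]  # coded factors (hypothetical)
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 2^3 = 8 runs

def run_experiment(row, rng):
    """Stand-in for a real measured response (purely simulated)."""
    temperature, pressure, catalyst = row  # catalyst has no effect in this simulation
    return 50 + 4 * temperature - 2.5 * pressure + 1.5 * temperature * pressure + rng.normal(0, 0.5)

rng = np.random.default_rng(0)
response = np.array([run_experiment(row, rng) for row in design])

# Main effect of each factor: mean response at the high level minus mean at the low level.
for i, name in enumerate(factors):
    effect = response[design[:, i] == 1].mean() - response[design[:, i] == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```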

Response surface methodology and other DFSS tools use statistical (often empirical) models, and therefore practitioners need to be aware that even the best statistical model is an approximation to reality. In practice, both the models and the parameter values are unknown, and subject to uncertainty. An estimated optimum point need not be optimal in reality, because of the errors of the estimates and the inadequacies of the model.
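As a minimal sketch of the empirical modeling described above, the example below fits a quadratic response surface to simulated data by ordinary least squares and locates the estimated optimum. The data, the assumed true response and the noise level are all invented for illustration, and the estimated optimum carries exactly the estimation-error and model-inadequacy caveats noted above.

```python
# Minimal response-surface sketch: fit a quadratic model to noisy observations of a
# single factor and locate the estimated optimum. Data and the "true" process are simulated.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 15)                              # coded factor settings
y = 10 - (x - 0.6) ** 2 + rng.normal(0, 0.3, x.size)    # hypothetical noisy response

# Fit y ~ b0 + b1*x + b2*x^2 by ordinary least squares.
X = np.column_stack([np.ones_like(x), x, x ** 2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point of the fitted quadratic: only a candidate optimum, since it inherits
# the model's approximation error and the noise in the data.
x_star = -b1 / (2 * b2)
print(f"estimated optimum setting: {x_star:.2f} (true value used in the simulation: 0.60)")
```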

Nonetheless, response surface methodology has an effective track record of helping researchers improve products and services: for example, George Box’s original response-surface modeling enabled chemical engineers to improve a process that had been stuck at a saddle point for years.

Proponents of DMAIC, DDICA (Design Develop Initialize Control and Allocate) and Lean techniques might claim that DFSS falls under the general rubric of Six Sigma or Lean Six Sigma (LSS). Both methodologies focus on meeting customer needs and business priorities as the starting-point for analysis.

The tools used for DFSS techniques often vary widely from those used for DMAIC Six Sigma. In particular, DMAIC and DDICA practitioners often use new or existing mechanical drawings and manufacturing process instructions as the originating information for their analysis, while DFSS practitioners often use simulations and parametric system design/analysis tools to predict both the cost and the performance of candidate system architectures. Although the two processes can appear similar, in practice the working medium differs enough that DFSS requires different tool sets to perform its design tasks. DMAIC, IDOV and Six Sigma may still be used during depth-first plunges into the system architecture analysis and for “back end” Six Sigma processes; DFSS provides the system design processes used in front-end complex system design. Hybrid approaches combining front-end and back-end methods are also used. Done well, this yields 3.4 defects per million design opportunities.

Traditional Six Sigma methodology, DMAIC, has become a standard process optimization tool for the chemical process industries.
However, it has become clear that the promise of Six Sigma, specifically 3.4 defects per million opportunities (DPMO), is simply unachievable after the fact. Consequently, there has been a growing movement to implement Six Sigma design, usually called Design for Six Sigma (DFSS), and DDICA tools. This methodology begins with defining customer needs and leads to the development of robust processes to deliver those needs.
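To make the DPMO figure concrete, the sketch below converts a defect count into DPMO and into a sigma level using the conventional 1.5-sigma long-term shift; the defect and opportunity counts are hypothetical.

```python
# Sketch: convert defects per million opportunities (DPMO) into a "sigma level"
# using the conventional 1.5-sigma long-term shift. Input counts are hypothetical.
from statistics import NormalDist

def sigma_level(defects, units, opportunities_per_unit):
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    # Short-term sigma level = z-value of the observed yield plus the 1.5-sigma shift.
    return dpmo, NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

dpmo, sigma = sigma_level(defects=7, units=10_000, opportunities_per_unit=5)
print(f"DPMO = {dpmo:.0f}, sigma level = {sigma:.2f}")
# For reference, DPMO = 3.4 corresponds to a sigma level of about 6.0 under this convention.
```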

Design for Six Sigma emerged from the Six Sigma and the Define-Measure-Analyze-Improve-Control (DMAIC) quality methodologies, which were originally developed by Motorola to systematically improve processes by eliminating defects. Unlike its traditional Six Sigma/DMAIC predecessors, which are usually focused on solving existing manufacturing issues (i.e., “fire fighting”), DFSS aims to avoid manufacturing problems by taking a more proactive approach to problem solving and engaging the company’s efforts at an early stage to reduce the problems that could occur (i.e., “fire prevention”). The primary goal of DFSS is to achieve a significant reduction in the number of nonconforming units and in production variation. It starts from an understanding of customer expectations, needs and Critical to Quality issues (CTQs) before a design can be completed. Typically in a DFSS program, only a small portion of the CTQs are reliability-related (CTR), and therefore reliability does not get center-stage attention in DFSS. DFSS rarely looks at the long-term (after manufacturing) issues that might arise in the product (e.g. complex fatigue issues or electrical wear-out, chemical issues, cascade effects of failures, system-level interactions).

Arguments about what makes DFSS different from Six Sigma demonstrate the similarities between DFSS and other established engineering practices such as probabilistic design and design for quality. In general, Six Sigma with its DMAIC roadmap focuses on improvement of an existing process or processes, whereas DFSS focuses on the creation of new value with inputs from customers, suppliers and business needs. While traditional Six Sigma may also use those inputs, its focus remains on improvement rather than on the design of a new product or system. This also reflects the engineering background of DFSS. However, like other methods developed in engineering, there is no theoretical reason why DFSS cannot be used in areas outside of engineering.

Historically, although the first successful Design for Six Sigma projects in 1989 and 1991 predate the establishment of the DMAIC process improvement process, Design for Six Sigma (DFSS) is accepted in part because Six Sigma organisations found that they could not optimise products past three or four sigma without fundamentally redesigning the product, and because improving a process or product after launch is considered less efficient and effective than designing in quality. ‘Six Sigma’ levels of performance have to be built in.

DFSS for software is essentially a substantial, non-superficial modification of “classical DFSS”, since the character and nature of software differ from those of other fields of engineering. The methodology describes the detailed process for successfully applying DFSS methods and tools throughout software product design, covering the overall software development life cycle: requirements, architecture, design, implementation, integration, optimization, verification and validation (RADIOV). The methodology explains how to build predictive statistical models for software reliability and robustness, and shows how simulation and analysis techniques can be combined with structural design and architecture methods to effectively produce software and information systems at Six Sigma levels.
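The source does not specify which predictive statistical models are meant; as one plausible, simplified illustration, the sketch below fits a basic software reliability growth curve, m(t) = a(1 - exp(-b t)), to invented cumulative defect-discovery data.

```python
# Sketch: fit a simple software reliability growth model, m(t) = a * (1 - exp(-b * t)),
# to cumulative defect counts observed during test. The data are invented for illustration,
# and this is only one of many possible predictive models.
import numpy as np
from scipy.optimize import curve_fit

weeks = np.arange(1, 11)                                            # test weeks
cum_defects = np.array([12, 21, 28, 34, 38, 41, 43, 45, 46, 47])    # hypothetical cumulative defects

def growth(t, a, b):
    return a * (1.0 - np.exp(-b * t))

(a, b), _ = curve_fit(growth, weeks, cum_defects, p0=[50.0, 0.3])

remaining = a - cum_defects[-1]   # expected defects still latent under the fitted model
print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.2f}")
print(f"expected remaining defects = {remaining:.1f}")
```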

DFSS in software acts as a glue to blend the classical modelling techniques of software engineering such as object-oriented design or Evolutionary Rapid Development with statistical, predictive models and simulation techniques. The methodology provides Software Engineers with practical tools for measuring and predicting the quality attributes of the software product and also enables them to include software in system reliability models.

Although many tools used in DFSS consulting, such as response surface methodology, transfer functions via linear and non-linear modeling, axiomatic design, and simulation, have their origin in inferential statistics, statistical modeling may overlap with data analytics and data mining.

Although DFSS as a methodology has been successfully used as an end-to-end technical project framework for analytics and data-mining projects, domain experts have observed that it is somewhat similar in approach to CRISP-DM.

DFSS is claimed to be better suited for encapsulating and effectively handling a larger number of uncertainties, including missing and uncertain data, both in terms of sharpness of definition and their absolute total number, with respect to analytics and data-mining tasks. Six Sigma approaches to data mining are popularly known as “DFSS over CRISP” (CRISP-DM referring to the data-mining application framework methodology of SPSS).

With DFSS, data-mining projects have been observed to have a considerably shortened development life cycle. This is typically achieved by conducting data analysis against pre-designed template match tests via a techno-functional approach using multilevel quality function deployment on the data set.

Practitioners claim that progressively complex KDD templates are created by multiple DOE runs on simulated complex multivariate data, and that the templates, along with logs, are then extensively documented via a decision tree-based algorithm.
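The workflow claimed above is described only loosely; the sketch below is one speculative reading of it, in which multivariate data are simulated for several hypothetical DOE runs, a decision tree is fitted to the pooled data, and the learned rules are exported as documentation of the resulting “template”. All names, settings and data are invented.

```python
# Speculative sketch of the claimed workflow: simulate multivariate data for several DOE runs,
# fit a decision tree on the pooled data, and export its rules as documentation.
# All factor settings and data are invented; this is not a prescribed DFSS procedure.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
rows, labels = [], []
for shift in (-1.0, 0.0, 1.0):                 # three hypothetical DOE runs (factor "shift")
    X_run = rng.normal(loc=shift, scale=1.0, size=(200, 3))
    y_run = (X_run[:, 0] + 0.5 * X_run[:, 1] > 0).astype(int)   # simulated outcome rule
    rows.append(X_run)
    labels.append(y_run)

X = np.vstack(rows)
y = np.concatenate(labels)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
# "Document the template": dump the learned decision rules in readable form.
print(export_text(tree, feature_names=["x1", "x2", "x3"]))
```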

DFSS uses Quality Function Deployment and SIPOC for feature engineering of known independent variables, thereby aiding in the techno-functional computation of derived attributes.

Once the predictive model has been computed, DFSS studies can also be used to provide stronger probabilistic estimations of predictive-model rank in a real-world scenario.
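The source does not state how such probabilistic rank estimates are produced. One possible interpretation, sketched below under assumed data and models, is a bootstrap comparison of two candidate models' hold-out accuracy that estimates the probability that one model outranks the other on new data.

```python
# Sketch: bootstrap estimate of the probability that model A outranks model B on unseen data.
# Models, data, and metric are placeholders; the source does not prescribe this procedure.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4, random_state=0)

model_a = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
model_b = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

rng = np.random.default_rng(0)
wins, n_boot = 0, 500
for _ in range(n_boot):
    idx = rng.integers(0, len(y_te), len(y_te))        # bootstrap resample of the test set
    acc_a = accuracy_score(y_te[idx], model_a.predict(X_te[idx]))
    acc_b = accuracy_score(y_te[idx], model_b.predict(X_te[idx]))
    wins += acc_a > acc_b

print(f"P(model A outranks model B) = {wins / n_boot:.2f}")
```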

The DFSS framework has been successfully applied to predictive analytics in the field of HR analytics. This application area is traditionally considered very challenging due to the peculiar complexities of predicting human behavior.

Design Flow (EDA)

Design flows are the explicit combination of electronic design automation tools used to accomplish the design of an integrated circuit. Moore’s law has driven the entire IC implementation RTL-to-GDSII design flow from one that relied primarily on stand-alone synthesis, placement, and routing algorithms to an integrated construction and analysis flow for design closure. The challenges of rising interconnect delay led to a new way of thinking about, and integrating, design closure tools.

The RTL-to-GDSII flow underwent significant changes from 1980 through 2005. The continued scaling of CMOS technologies significantly changed the objectives of the various design steps. The lack of good predictors for delay has led to significant changes in recent design flows. New scaling challenges such as leakage power, variability, and reliability will continue to require significant changes to the design closure process in the future. Many factors describe what drove the design flow from a set of separate design steps to a fully integrated approach, and what further changes are coming to address the latest challenges. In his keynote at the 40th Design Automation Conference, entitled “The Tides of EDA”, Alberto Sangiovanni-Vincentelli distinguished three periods of EDA: the age of invention, the age of implementation, and the age of integration.

There are differences between the steps and methods of the design flow for analog and digital integrated circuits. Nonetheless, a typical VLSI design flow consists of various steps like design conceptualization, chip optimization, logical/physical implementation, and design validation and verification.

Transgenerational Design

Transgenerational design is the practice of making products and environments compatible with those physical and sensory impairments associated with human aging and which limit major activities of daily living. The term transgenerational design was coined in 1986, by Syracuse University industrial design professor James J. Pirkl to describe and identify products and environments that accommodate, and appeal to, the widest spectrum of those who would use them—the young, the old, the able, the disabled—without penalty to any group.
The transgenerational design concept emerged from his federally funded design-for-aging research project, Industrial Design Accommodations: A Transgenerational Perspective. The project’s two seminal 1988 publications provided detailed information about the aging process; informed and sensitized industrial design professionals and design students about the realities of human aging; and offered a useful set of guidelines and strategies for designing products that accommodate the changing needs of people of all ages and abilities.

The transgenerational design concept establishes a common ground for those who are committed to integrating age and ability within the consumer population. Its underlying principle is that people, including those who are aged or impaired, have an equal right to live in a unified society.

Transgenerational design practice recognizes that human aging is a continuous, dynamic process that starts at birth and ends with death, and that throughout the aging process, people normally experience occurrences of illness, accidents and declines in physical and sensory abilities that impair one’s independence and lifestyle. But most injuries, impairments and disabilities typically occur more frequently as one grows older and experiences the effects of senescence (biological aging). Four facts clarify the interrelationship of age with physical and sensory vulnerability:

Within each situation, consumers expect products and services to fulfill and enhance their lifestyle, both physically and symbolically. Transgenerational design focuses on serving their needs through what Cagan and Vogel call “a value oriented product development process”. They note that a product is “deemed of value to a customer if it offers a strong effect on lifestyle, enabling features, and meaningful ergonomics” resulting in products that are “useful, usable, and desirable” during both short and long term use by people of all ages and abilities.

Transgenerational design is “framed as a market-aware response to population aging that fulfills the need for products and environments that can be used by both young and old people living and working in the same environment”.

Transgenerational design benefits all ages and abilities by creating a harmonious bond between products and the people that use them. It satisfies the psychological, physiological, and sociological factors desired—and anticipated—by users of all ages and abilities.

Transgenerational design addresses each element and accommodates the user—regardless of age or ability—by providing a sympathetic fit and unencumbered ease of use. Such designs provide greater accessibility by offering wider options and more choices, thereby preserving and extending one’s independence, and enhancing the quality of life for all ages and abilities—at no group’s expense.

Transgenerational designs accommodate rather than discriminate and sympathize rather than stigmatize. They do this by:

Transgenerational design emerged during the mid-1980s coincident with the conception of universal design, an outgrowth of the disability rights movement and earlier barrier-free concepts. In contrast, transgenerational design grew out of the Age Discrimination Act of 1975 (ADA), which prohibited “discrimination on the basis of age in programs and activities receiving Federal financial assistance”, or excluding, denying or providing different or lesser services on the basis of age. The ensuing political interest and debate over the Act’s 1978 amendments, which abolished mandatory retirement at age 65, made the issues of aging a major public policy concern by injecting it into the mainstream of societal awareness.

At the start of the 1980s, the oldest members of the population, having matured during the Great Depression, were being replaced by a generation of Baby Boomers, steadily reaching middle age and approaching the threshold of retirement. Their swelling numbers signaled profound demographic changes ahead that would steadily expand the aging population throughout the world.

Advancements in medical research were also changing the image of old age—from a social problem of the sick, poor, and senile, whose solutions depend on public policy—to the emerging reality of an active aging population having vigor, resources, and time to apply both.

Responding to the public’s growing awareness, the media, public policy, and some institutions began to recognize the impending implications. Time and Newsweek devoted cover stories to the “Greying of America”. Local radio stations began replacing their rock-and-roll formats with music targeted to more mature tastes. The Collegiate Forum (Dow Jones & Co., Inc.) devoted its Fall 1982 issue entirely to articles on the aging work force. A National Research Conference on Technology and Aging, and the Office of Technology Assessment of the House of Representatives, initiated a major examination of the impact of science and technology on older Americans.

In 1985, the National Endowment for the Arts, the Administration on Aging, the Farmers Home Administration, and the Department of Housing and Urban Development signed an agreement to improve building, landscape, product and graphic design for older Americans. The agreement included new research applications for old age, recognizing the potential for making products easier for the elderly to use, and therefore more appealing and profitable.

In 1987, recognizing the implications of population aging, Syracuse University’s Department of Design, All-University Gerontology Center, and Center for Instructional Development initiated and collaborated on an interdisciplinary project, Industrial Design Accommodations: A Transgenerational Perspective. The year-long project, supported by a Federal grant, joined the knowledge base of gerontology with the professional practice of industrial design.

The project defined “the three aspects of aging as physiological, sociological, and psychological; and divided the designer’s responsibility into aesthetic, technological, and humanistic concerns”.
The strong interrelationship between the physiological aspects of aging and industrial design’s humanistic aspects established the project’s instructional focus and categorized the physiological aspects of aging as the sensory and physical factors of vision, hearing, touch, and movement. This interrelationship was translated into a series of reference tables, which related specific physical and sensory factors of aging, and were included in the resulting set of design guidelines to:

The project produced and published two instructional manuals—one for instructors and one for design professionals—each containing a detailed set of “design guidelines and strategies for designing transgenerational products”. Under the terms of the grant, the instructional manuals were distributed to all academic programs of industrial design recognized by the National Association of Schools of Art and Design (NASAD).

Continuing to emerge as a growing strategy for developing products, services and environments that accommodate people of all ages and abilities, “transgenerational design has been adopted by major corporations, like Intel, Microsoft and Kodak” who are “looking at product development the same way as designing products for people with visual, hearing and physical impairments,” so that people of any age can use them.

Discussions between designers and marketers are indicating that successful transgenerational design “requires the right balance of upfront research work, solid human factors analysis, extensive design exploration, testing and a lot of thought to get it right”, and that “transgenerational design is applicable to any consumer products company—from appliance manufacturers to electronics companies, furniture makers, kitchen and bath and mainstream consumer products companies”.