Business Process Management

Business process management (BPM) is a discipline in operations management in which people use various methods to discover, model, analyze, measure, improve, optimize, and automate business processes. Any combination of methods used to manage a company’s business processes is BPM. Processes can be structured and repeatable or unstructured and variable. Though not required, enabling technologies are often used with BPM.

BPM can be differentiated from program management in that program management is concerned with managing a group of interdependent projects; from another viewpoint, process management includes program management. In project management, process management is the use of a repeatable process to improve the outcome of the project.

Key distinctions between process management and project management are repeatability and predictability. If the structure and sequence of work are unique, it is a project. In business process management, the sequence of work can vary from instance to instance: there are gateways, conditions, business rules, and so on. The key is predictability: no matter how many forks in the road, we know all of them in advance, and we understand the conditions under which the process takes one route or another. If this condition is met, we are dealing with a process.
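To make the predictability criterion concrete, the sketch below (with hypothetical step and gateway names) models a small order process whose every possible route can be enumerated before any instance runs:

```python
# Sketch: a process whose every route is known in advance (hypothetical step names).
PROCESS = {
    "receive_order": {"next": "check_credit"},
    "check_credit":  {"gateway": {"credit_ok": "ship_goods",
                                  "credit_bad": "reject_order"}},
    "ship_goods":    {"next": "send_invoice"},
    "send_invoice":  {"next": None},
    "reject_order":  {"next": None},
}

def enumerate_routes(step="receive_order", path=()):
    """Every possible route can be listed up front -- the hallmark of a process."""
    path = path + (step,)
    node = PROCESS[step]
    if "gateway" in node:
        for target in node["gateway"].values():
            yield from enumerate_routes(target, path)
    elif node["next"]:
        yield from enumerate_routes(node["next"], path)
    else:
        yield path

for route in enumerate_routes():
    print(" -> ".join(route))
```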

As an approach, BPM sees processes as important assets of an organization that must be understood, managed, and developed to announce and deliver value-added products and services to clients or customers. This approach closely resembles other total quality management or continual improvement process methodologies.

ISO 9000 promotes the process approach to managing an organization: the standard "…promotes the adoption of a process approach when developing, implementing and improving the effectiveness of a quality management system, to enhance customer satisfaction by meeting customer requirements."

BPM proponents also claim that this approach can be supported, or enabled, through technology. As such, many BPM articles and scholars frequently discuss BPM from one of two viewpoints: people and/or technology.

Definitions of BPM have been published by the BPMInstitute, the Workflow Management Coalition, BPM.com, the Association of Business Process Management Professionals, and Gartner, among others.

It is common to confuse BPM with a BPM suite (BPMS). BPM is a professional discipline done by people, whereas a BPMS is a technological suite of tools designed to help the BPM professionals accomplish their goals. BPM should also not be confused with an application or solution developed to support a particular process. Suites and solutions represent ways of automating business processes, but automation is only one aspect of BPM.

The concept of business process may be as traditional as concepts of tasks, department, production, and outputs, arising from job shop scheduling problems in the early 20th century. The management and improvement approach as of 2010, with formal definitions and technical modeling, has been around since the early 1990s (see business process modeling). Note that the term “business process” is sometimes used by IT practitioners as synonymous with the management of middleware processes or with integrating application software tasks.

Although BPM initially focused on the automation of business processes with the use of information technology, it has since been extended to integrate human-driven processes in which human interaction takes place in series or parallel with the use of technology. For example, workflow management systems can assign steps requiring human intuition or judgment to relevant humans, and other tasks in a workflow to relevant automated systems.

More recent variations such as “human interaction management” are concerned with the interaction between human workers performing a task.

As of 2010, technology has allowed the coupling of BPM with other methodologies, such as Six Sigma. Some BPM tools, such as SIPOCs, process flows, RACIs, CTQs, and histograms, allow users to visualize, measure, and analyze business functions and processes.

This brings with it the benefit of being able to simulate changes to business processes based on real-world data (not just on assumed knowledge). Also, the coupling of BPM to industry methodologies allows users to continually streamline and optimize the process to ensure that it is tuned to its market need.

As of 2012, research on BPM has paid increasing attention to the compliance of business processes. Although a key aspect of business processes is flexibility, as business processes continuously need to adapt to changes in the environment, compliance with business strategy, policies, and government regulations should also be ensured.
The compliance aspect of BPM is highly important for governmental organizations. As of 2010, BPM approaches in a governmental context largely focused on operational processes and knowledge representation.
There have been many technical studies on operational business processes in the public and private sectors, but researchers rarely take legal compliance activities into account, for instance the legal implementation processes in public-administration bodies.

Business process management activities can be arbitrarily grouped into categories such as design, modeling, execution, monitoring, and optimization.

Process design encompasses both the identification of existing processes and the design of “to-be” processes. Areas of focus include representation of the process flow, the factors within it, alerts and notifications, escalations, standard operating procedures, service level agreements, and task hand-over mechanisms. Whether or not existing processes are considered, the aim of this step is to ensure a correct and efficient new design.

The proposed improvement could be in human-to-human, human-to-system, or system-to-system workflows, and might target regulatory, market, or competitive challenges faced by the business. The existing process and the design of the new process for various applications must be synchronized so as not to cause a major outage or process interruption.

Modeling takes the theoretical design and introduces combinations of variables (e.g., changes in rent or materials costs) to determine how the process might operate under different circumstances.

It may also involve running “what-if” analyses (conditions: when, if, else) on the processes: “What if I have 75% of the resources to do the same task?” “What if I want to do the same job for 80% of the current cost?”
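As an illustration only, the sketch below runs a crude what-if comparison with invented figures; a real BPMS simulation would use discrete-event models calibrated with actual process data:

```python
import random

def simulate(n_cases=1000, resource_fraction=1.0, base_service_hours=4.0):
    """Rough, queueing-free approximation: less capacity means longer waits."""
    cycle_times = []
    for _ in range(n_cases):
        service = random.expovariate(1.0 / base_service_hours)
        wait = service * (1.0 / resource_fraction - 1.0)  # crude capacity penalty
        cycle_times.append(service + wait)
    return sum(cycle_times) / len(cycle_times)

baseline = simulate(resource_fraction=1.00)
what_if  = simulate(resource_fraction=0.75)   # "what if I have 75% of resources?"
print(f"average cycle time: {baseline:.1f} h baseline vs {what_if:.1f} h at 75% resources")
```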

Business process execution is broadly about enacting a discovered and modeled business process. A business process is enacted manually, automatically, or with a combination of manual and automated business tasks. Manual business processes are human-driven; automated business processes are software-driven. Business process automation encompasses the methods and software deployed for automating business processes.

Business process automation is performed and orchestrated at the business process layer or the consumer presentation layer of the SOA reference architecture. BPM software suites, such as BPMS, iBPMS, or low-code platforms, are positioned at the business process layer. Emerging robotic process automation software, by contrast, performs business process automation at the presentation layer and is therefore considered non-invasive to, and decoupled from, existing application systems.

One of the ways to automate processes is to develop or purchase an application that executes the required steps of the process; however, in practice, these applications rarely execute all the steps of the process accurately or completely. Another approach is to use a combination of software and human intervention; however, this approach is more complex, making documentation difficult.

In response to these problems, companies have developed software that defines the full business process (as developed in the process design activity) in a computer language that a computer can directly execute. Process models can be run through execution engines that automate the processes directly from the model (e.g., calculating a repayment plan for a loan) or, when a step is too complex to automate, Business Process Modeling Notation (BPMN) provides front-end capability for human input. Compared to either of the previous approaches, directly executing a process definition can be more straightforward and therefore easier to improve. However, automating a process definition requires flexible and comprehensive infrastructure, which typically rules out implementing these systems in a legacy IT environment.
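The sketch below is a deliberately minimal illustration of executing a process directly from its model; the step names and the loan figures are hypothetical, and a real engine would interpret a BPMN model and pause at user tasks for human input:

```python
def calc_repayment(case):
    # Automated step: computed directly from the model, no human involvement.
    case["installment"] = round(case["principal"] * (1 + case["rate"]) / case["months"], 2)

def approve(case):
    # Step needing human judgment: a real BPMS would render a user task form
    # and wait for a person; here the decision is stubbed for the sketch.
    case["approved"] = case["installment"] < 600

MODEL = [calc_repayment, approve]   # the "process model": an ordered list of steps

def run(case):
    for step in MODEL:
        step(case)
    return case

print(run({"principal": 12000, "rate": 0.05, "months": 24}))
```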

Business rules have been used by systems to provide definitions for governing behavior, and a business rule engine can be used to drive process execution and resolution.
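For illustration, a business rule engine can be approximated as an ordered list of condition/outcome pairs; the rules and field names below are invented:

```python
RULES = [
    (lambda case: case["amount"] > 10_000,           "manual_review"),
    (lambda case: case["customer_rating"] == "poor", "manual_review"),
    (lambda case: True,                              "auto_approve"),  # default rule
]

def resolve(case):
    # The first rule whose condition holds decides how the case is routed.
    for condition, outcome in RULES:
        if condition(case):
            return outcome

print(resolve({"amount": 2_500, "customer_rating": "good"}))  # -> auto_approve
```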

Monitoring encompasses the tracking of individual processes, so that information on their state can be easily seen, and statistics on the performance of one or more processes can be provided. An example of this tracking is being able to determine the state of a customer order (e.g. order arrived, awaiting delivery, invoice paid) so that problems in its operation can be identified and corrected.

In addition, this information can be used to work with customers and suppliers to improve their connected processes. Examples are the generation of measures on how quickly a customer order is processed or how many orders were processed in the last month. These measures tend to fit into three categories: cycle time, defect rate and productivity.
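The toy calculation below shows how the three categories of measures might be derived from an event log; the order data and field names are illustrative only:

```python
from datetime import date

orders = [
    {"id": 1, "received": date(2024, 3, 1), "closed": date(2024, 3, 4), "defective": False},
    {"id": 2, "received": date(2024, 3, 2), "closed": date(2024, 3, 9), "defective": True},
    {"id": 3, "received": date(2024, 3, 5), "closed": date(2024, 3, 7), "defective": False},
]

cycle_time   = sum((o["closed"] - o["received"]).days for o in orders) / len(orders)
defect_rate  = sum(o["defective"] for o in orders) / len(orders)
productivity = len(orders) / 31   # orders processed per day over the month

print(f"avg cycle time {cycle_time:.1f} d, defect rate {defect_rate:.0%}, "
      f"productivity {productivity:.2f} orders/day")
```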

The degree of monitoring depends on what information the business wants to evaluate and analyze and how the business wants it monitored, in real-time, near real-time or ad hoc. Here, business activity monitoring (BAM) extends and expands the monitoring tools generally provided by BPMS.

Process mining is a collection of methods and tools related to process monitoring. The aim of process mining is to analyze event logs extracted through process monitoring and to compare them with an a priori process model. Process mining allows process analysts to detect discrepancies between the actual process execution and the a priori model as well as to analyze bottlenecks.
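A minimal conformance-checking sketch, with invented activity names, conveys the idea of comparing logged traces against an a priori model:

```python
# Traces the a-priori model allows (invented activity names).
ALLOWED = {
    ("receive", "check", "ship", "invoice"),
    ("receive", "check", "reject"),
}

# Traces extracted from the event log by process monitoring.
event_log = {
    "case-1": ("receive", "check", "ship", "invoice"),
    "case-2": ("receive", "ship", "invoice"),   # skipped "check": a deviation
}

for case, trace in event_log.items():
    print(case, "conforms" if trace in ALLOWED else "deviates from the model")
```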

Predictive business process monitoring concerns the application of data mining, machine learning, and other forecasting techniques to predict what is going to happen with running instances of a business process, making it possible to forecast future cycle times, compliance issues, and so on. Techniques for predictive business process monitoring include support vector machines, deep learning approaches, and random forests.
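As a hedged sketch of the idea, the example below trains a random forest on synthetic features of running cases to predict remaining cycle time; the feature set and data are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# Features of a running case: activities completed, elapsed hours, order amount.
X = np.column_stack([
    rng.integers(1, 8, n),
    rng.uniform(0, 72, n),
    rng.uniform(100, 5000, n),
])
# Synthetic target: remaining hours shrink as more activities are completed.
y = 80 - 8 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 5, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
running_case = np.array([[3, 24.0, 1200.0]])
print(f"predicted remaining cycle time: {model.predict(running_case)[0]:.1f} h")
```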

Process optimization includes retrieving process performance information from the modeling or monitoring phase; identifying potential or actual bottlenecks and potential opportunities for cost savings or other improvements; and then applying those enhancements in the design of the process. Process mining tools are able to discover critical activities and bottlenecks, creating greater business value.

When a process becomes too complex or inefficient and optimization does not deliver the desired output, a company steering committee, typically chaired by the president or CEO, may recommend re-engineering the entire process cycle. Business process reengineering (BPR) has been used by organizations to attempt to achieve efficiency and productivity at work.

A market has developed for enterprise software leveraging the business process management concepts to organize and automate processes. The recent convergence of this software from distinct pieces such as business rules engine, business process modelling, business activity monitoring and Human Workflow has given birth to integrated Business Process Management Suites.
Forrester Research, Inc. views the BPM suite space through three different lenses: human-centric BPM, integration-centric BPM, and document-centric BPM.

However, integration-centric and document-centric offerings have since matured into separate, standalone markets.

Rapid application development using no-code/low-code principles is becoming an increasingly prevalent feature of BPMS platforms. RAD enables businesses to deploy applications more quickly and more cost-effectively, while also offering improved change and version management. Gartner notes that as businesses embrace these systems, their budgets rely less on the maintenance of existing systems and show more investment in growing and transforming them.

While the steps can be viewed as a cycle, economic or time constraints are likely to limit the process to only a few iterations. This is often the case when an organization uses the approach for short to medium term objectives rather than trying to transform the organizational culture. True iterations are only possible through the collaborative efforts of process participants. In a majority of organizations, complexity requires enabling technology (see below) to support the process participants in these daily process management challenges.

To date, many organizations start a BPM project or program with the objective of optimizing an area that has been identified for improvement.

Currently, the international standards for the task have limited BPM to application in the IT sector, and ISO/IEC 15944 covers the operational aspects of business. However, some corporations with a culture of best practices do use standard operating procedures to regulate their operational processes. Other standards are being worked on to assist in BPM implementation (BPMN, enterprise architecture, Business Motivation Model).

BPM is now considered a critical component of operational intelligence (OI) solutions to deliver real-time, actionable information. This real-time information can be acted upon in a variety of ways: alerts can be sent, or executive decisions can be made using real-time dashboards. OI solutions use real-time information to take automated action based on pre-defined rules so that security measures and/or exception management processes can be initiated.
Because the size and complexity of daily tasks often require technology to model them efficiently, and because such technology became widely available to businesses and their staff, many came to regard BPM as the bridge between information technology (IT) and the business.

There are four critical components of a BPM suite.

BPM also addresses many of the critical IT issues underpinning these business drivers.

Validation of a BPMS is another technical issue that vendors and users must be aware of if regulatory compliance is mandatory. The validation task can be performed either by an authenticated third party or by the users themselves. Either way, validation documentation must be generated. The validation document can usually either be published officially or retained by users.

Cloud computing business process management is the use of BPM tools that are delivered as software as a service (SaaS) over a network. Cloud BPM business logic is deployed on an application server, and the business data resides in cloud storage.

According to Gartner, 20% of all “shadow business processes” are supported by BPM cloud platforms. Gartner refers to all the hidden organizational processes that are supported by IT departments as part of legacy business processes, such as Excel spreadsheets, rule-based routing of emails, routing of phone calls, and so on. These can, of course, also be replaced by other technologies such as workflow and smart-form software.

The benefits of using cloud BPM services include removing the need and cost of maintaining specialized technical skill sets in-house and reducing distractions from an enterprise’s main focus. It offers controlled IT budgeting and enables geographical mobility.

The emerging Internet of things poses a significant challenge in controlling and managing the flow of information through large numbers of devices. To cope with this, a new direction known as BPM Everywhere shows promise as a way of blending traditional process techniques with additional capabilities to automate the handling of all the independent devices.

Design for Six Sigma

Design for Six Sigma (DFSS) is a business process management method related to traditional Six Sigma. It is used in many industries, like finance, marketing, basic engineering, process industries, waste management, and electronics. It is based on the use of statistical tools like linear regression and enables empirical research similar to that performed in other fields, such as social science. While the tools and order used in Six Sigma require a process to be in place and functioning, DFSS has the objective of determining the needs of customers and the business, and driving those needs into the product solution so created. DFSS is relevant for relatively simple items / systems. It is used for product or process design in contrast with process improvement. Measurement is the most important part of most Six Sigma or DFSS tools, but whereas in Six Sigma measurements are made from an existing process, DFSS focuses on gaining a deep insight into customer needs and using these to inform every design decision and trade-off.

There are different options for the implementation of DFSS. Unlike Six Sigma, which is commonly driven via DMAIC (Define – Measure – Analyze – Improve – Control) projects, DFSS has spawned a number of stepwise processes, all in the style of the DMAIC procedure.

DMADV (define, measure, analyze, design, verify) is sometimes synonymously referred to as DFSS, although alternatives such as IDOV (identify, design, optimize, verify) are also used. The traditional DMAIC Six Sigma process, as usually practiced, is focused on evolutionary and continuous improvement of manufacturing or service process development and usually occurs after initial system or product design and development have been largely completed. DMAIC Six Sigma as practiced is usually concerned with solving existing manufacturing or service process problems and removing the defects and variation associated with defects. Manufacturing variations may clearly impact product reliability, so a clear link should exist between reliability engineering and Six Sigma (quality). In contrast, DFSS (or DMADV and IDOV) strives to generate a new process where none existed, or where an existing process is deemed inadequate and in need of replacement. DFSS aims to create a process with the end in mind of optimally building the efficiencies of Six Sigma methodology into the process before implementation; traditional Six Sigma seeks continuous improvement after a process already exists.

DFSS seeks to avoid manufacturing/service process problems by using advanced techniques at the outset (i.e., fire prevention). When combined, these methods capture the true needs of the customer and derive engineering system parameter requirements that increase product and service effectiveness in the eyes of the customer and all other stakeholders. This yields products and services that provide great customer satisfaction and increased market share. These techniques also include tools and processes to predict, model, and simulate the product delivery system (the processes/tools, personnel and organization, training, facilities, and logistics to produce the product/service). In this way, DFSS is closely related to operations research (e.g., solving the knapsack problem) and workflow balancing. DFSS is largely a design activity requiring tools including quality function deployment (QFD), axiomatic design, TRIZ, Design for X, design of experiments (DOE), Taguchi methods, tolerance design, robustification, and response surface methodology for single- or multiple-response optimization. While these tools are sometimes used in the classic DMAIC Six Sigma process, they are uniquely used by DFSS to analyze new and unprecedented products and processes. It is a concurrent analysis directed at manufacturing optimization related to the design.

Response surface methodology and other DFSS tools use statistical (often empirical) models, and therefore practitioners need to be aware that even the best statistical model is an approximation to reality. In practice, both the models and the parameter values are unknown and subject to uncertainty on top of ignorance. An estimated optimum point need not be the optimum in reality, because of errors in the estimates and inadequacies of the model.
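A small illustration of this caveat, using an invented one-factor response: the fitted quadratic surface yields only an estimated optimum, which differs from the true one because of noise and model error:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 15)                              # coded factor levels
y = 50 + 4 * x - 3 * x**2 + rng.normal(0, 1, x.size)    # unknown true response + noise

b2, b1, b0 = np.polyfit(x, y, 2)                        # fitted quadratic surface
x_hat = -b1 / (2 * b2)                                  # stationary point of the *model*
print(f"estimated optimum at x = {x_hat:.2f}; true optimum is x = {4 / 6:.2f}")
```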

Nonetheless, response surface methodology has an effective track record of helping researchers improve products and services: for example, George Box’s original response-surface modeling enabled chemical engineers to improve a process that had been stuck at a saddle point for years.

Proponents of DMAIC, DDICA (Design Develop Initialize Control and Allocate) and Lean techniques might claim that DFSS falls under the general rubric of Six Sigma or Lean Six Sigma (LSS). Both methodologies focus on meeting customer needs and business priorities as the starting-point for analysis.

The tools used for DFSS techniques often vary widely from those used for DMAIC Six Sigma. In particular, DMAIC and DDICA practitioners often use new or existing mechanical drawings and manufacturing process instructions as the originating information to perform their analysis, while DFSS practitioners often use simulations and parametric system design/analysis tools to predict both cost and performance of candidate system architectures. While the two processes may appear similar, in practice the working medium differs enough that DFSS requires different tool sets in order to perform its design tasks. DMAIC, IDOV, and Six Sigma may still be used during depth-first plunges into system architecture analysis and for “back end” Six Sigma processes; DFSS provides the system design processes used in front-end complex system designs. Systems combining back-end and front-end processes are also used. Done well, this yields 3.4 defects per million design opportunities.

Traditional Six Sigma methodology, DMAIC, has become a standard process-optimization tool for the chemical process industries.
However, it has become clear that the promise of Six Sigma, specifically 3.4 defects per million opportunities (DPMO), is simply unachievable after the fact. Consequently, there has been a growing movement to implement Six Sigma design, usually called Design for Six Sigma (DFSS), and DDICA tools. This methodology begins with defining customer needs and leads to the development of robust processes to deliver those needs.

Design for Six Sigma emerged from the Six Sigma and the Define-Measure-Analyze-Improve-Control (DMAIC) quality methodologies, which were originally developed by Motorola to systematically improve processes by eliminating defects. Unlike its traditional Six Sigma/DMAIC predecessors, which are usually focused on solving existing manufacturing issues (i.e., “fire fighting”), DFSS aims at avoiding manufacturing problems by taking a more proactive approach to problem solving and engaging the company’s efforts at an early stage to reduce problems that could occur (i.e., “fire prevention”). The primary goal of DFSS is to achieve a significant reduction in the number of nonconforming units and in production variation. It starts from an understanding of customer expectations, needs, and critical-to-quality issues (CTQs) before a design can be completed. Typically in a DFSS program, only a small portion of the CTQs are reliability-related (CTR), and therefore reliability does not get center-stage attention in DFSS. DFSS rarely looks at the long-term (after manufacturing) issues that might arise in the product (e.g., complex fatigue issues or electrical wear-out, chemical issues, cascade effects of failures, system-level interactions).

Arguments about what makes DFSS different from Six Sigma demonstrate the similarities between DFSS and other established engineering practices such as probabilistic design and design for quality. In general Six Sigma with its DMAIC roadmap focuses on improvement of an existing process or processes. DFSS focuses on the creation of new value with inputs from customers, suppliers and business needs. While traditional Six Sigma may also use those inputs, the focus is again on improvement and not design of some new product or system. It also shows the engineering background of DFSS. However, like other methods developed in engineering, there is no theoretical reason why DFSS cannot be used in areas outside of engineering.

Historically, although the first successful Design for Six Sigma projects in 1989 and 1991 predate establishment of the DMAIC process improvement process, Design for Six Sigma (DFSS) is accepted in part because Six Sigma organisations found that they could not optimise products past three or four Sigma without fundamentally redesigning the product, and because improving a process or product after launch is considered less efficient and effective than designing in quality. ‘Six Sigma’ levels of performance have to be ‘built-in’.

DFSS for software is essentially a non-superficial modification of “classical DFSS”, since the character and nature of software differ from those of other fields of engineering. The methodology describes the detailed process for successfully applying DFSS methods and tools throughout software product design, covering the overall software development life cycle: requirements, architecture, design, implementation, integration, optimization, verification and validation (RADIOV). The methodology explains how to build predictive statistical models for software reliability and robustness and shows how simulation and analysis techniques can be combined with structural design and architecture methods to effectively produce software and information systems at Six Sigma levels.

DFSS in software acts as a glue to blend the classical modelling techniques of software engineering such as object-oriented design or Evolutionary Rapid Development with statistical, predictive models and simulation techniques. The methodology provides Software Engineers with practical tools for measuring and predicting the quality attributes of the software product and also enables them to include software in system reliability models.

Although many tools used in DFSS consulting, such as response surface methodology, transfer functions via linear and non-linear modeling, axiomatic design, and simulation, have their origin in inferential statistics, statistical modeling may overlap with data analytics and mining.

DFSS has nonetheless been used successfully as an end-to-end framework for analytics and data-mining projects, and domain experts have observed that, applied this way, it is broadly similar to CRISP-DM.

DFSS is claimed to be better suited to encapsulating and effectively handling a larger number of uncertainties, including missing and uncertain data, both in the precision of their definition and in their absolute number, for analytics and data-mining tasks. Six Sigma approaches to data mining are popularly known as “DFSS over CRISP” (CRISP-DM being the data-mining framework methodology associated with SPSS).

With DFSS, data-mining projects have been observed to have a considerably shortened development life cycle. This is typically achieved by testing the data against pre-designed templates via a techno-functional approach using multilevel quality function deployment on the data set.

Practitioners claim that progressively more complex KDD templates are created by multiple DOE runs on simulated complex multivariate data, and that the templates, along with their logs, are extensively documented via a decision-tree-based algorithm.

DFSS uses quality function deployment and SIPOC for feature engineering of known independent variables, thereby aiding the techno-functional computation of derived attributes.
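For illustration, derived attributes of this kind might be computed from known independent variables as in the sketch below; the column names and derivations are invented:

```python
import pandas as pd

raw = pd.DataFrame({
    "orders":        [12, 30, 7],
    "complaints":    [1, 4, 0],
    "tenure_months": [6, 24, 3],
})

derived = raw.assign(
    complaint_rate   = raw["complaints"] / raw["orders"],       # derived attribute
    orders_per_month = raw["orders"] / raw["tenure_months"],    # derived attribute
)
print(derived)
```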

Once the predictive model has been computed, DFSS studies can also be used to provide stronger probabilistic estimates of predictive model rank in a real-world scenario.

The DFSS framework has been successfully applied to predictive analytics in the field of HR analytics, an application area traditionally considered very challenging due to the peculiar complexities of predicting human behavior.