Change Control

Change control within quality management systems (QMS) and information technology (IT) systems is a process—either formal or informal—used to ensure that changes to a product or system are introduced in a controlled and coordinated manner. It reduces the possibility that unnecessary changes will be introduced to a system without forethought, introducing faults into the system or undoing changes made by other users of software. The goals of a change control procedure usually include minimal disruption to services, reduction in back-out activities, and cost-effective utilization of resources involved in implementing change.

Change control is used in various industries, including in IT, software development, the pharmaceutical industry, the medical device industry, and other engineering/manufacturing industries. For the IT and software industries, change control is a major aspect of the broader discipline of change management. Typical examples from the computer and network environments are patches to software products, installation of new operating systems, upgrades to network routing tables, or changes to the electrical power systems supporting such infrastructure.

Certain portions of the Information Technology Infrastructure Library cover change control.

There is considerable overlap and confusion between change management, configuration management, and change control; the description below focuses on change control as a procedure in its own right.

Change control can be described as a set of six steps:

The first step is to plan and scope the change by considering its primary and ancillary details. The plan should cover aspects such as identifying the change, its owner(s), how it will be communicated and executed, how success will be verified, the change’s estimated importance and added value, its conformity to business and industry standards, and its target date for completion.

Impact and risk assessment is the next vital step. Will the proposed change cause something to go wrong when executed? Will related systems be affected? Even minor details should be considered during this phase. A risk category should then be assigned to the proposed change: high, moderate, or low risk. A high-risk change requires additional steps such as management approval and stakeholder notification, whereas a low-risk change may require only project manager approval and minimal documentation. If not already addressed in the plan/scope, a backout plan should be requested, particularly for high-risk changes with significant worst-case scenarios.
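As an illustration of how these risk categories might drive process requirements, the short Python sketch below maps each category to the approvals and safeguards it could trigger. The category names, approval lists, and field names are assumptions chosen for this example, not prescriptions from any standard or tool.

# Minimal sketch of risk-based requirements for a proposed change.
# The categories and required steps below are illustrative assumptions.
RISK_REQUIREMENTS = {
    "high": {
        "approvals": ["project manager", "management", "change control board"],
        "stakeholder_notification": True,
        "backout_plan_required": True,
        "documentation": "full",
    },
    "moderate": {
        "approvals": ["project manager", "change control board"],
        "stakeholder_notification": True,
        "backout_plan_required": True,
        "documentation": "standard",
    },
    "low": {
        "approvals": ["project manager"],
        "stakeholder_notification": False,
        "backout_plan_required": False,
        "documentation": "minimal",
    },
}

def requirements_for(risk_category: str) -> dict:
    """Return the process obligations implied by a change's risk category."""
    if risk_category not in RISK_REQUIREMENTS:
        raise ValueError(f"Unknown risk category: {risk_category!r}")
    return RISK_REQUIREMENTS[risk_category]

# Example: a high-risk change triggers management approval and a backout plan.
print(requirements_for("high")["approvals"])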

Whether it is handled by a change controller, change control board, steering committee, or project manager, a review and approval process is typically required. The plan/scope and impact/risk assessments are considered in the context of business goals, requirements, and resources. If, for example, the change request is deemed to address a low-severity, low-impact issue that requires significant resources to correct, the request may be given low priority or shelved altogether. In cases where a high-impact change is requested but without a strong plan, the review/approval entity may request a full business case for further analysis.

If the change control request is approved to move forward, the delivery team will execute the solution through a small-scale development process in test or development environments. This gives the delivery team an opportunity to design and make incremental changes, with unit and/or regression testing. Little testing and validation may occur for low-risk changes, though major changes will require significant testing before implementation. The team will then seek approval and request a time and date to carry out the implementation phase. In rare cases where the solution cannot be tested, special consideration should be given to the change/implementation window.

In most cases, a dedicated implementation team with the technical expertise to move a change along quickly is used to implement the change. The team should implement the change not only according to the approved plan but also according to organizational, industry, and quality management standards. The implementation process may also require additional staff responsibilities outside the implementation team, including stakeholders who may be asked to assist with troubleshooting. Following implementation, the team may also carry out a post-implementation review, which would take place at another stakeholder meeting or during project closing procedures.

The closing process can be one of the more difficult and important phases of change control. Three primary tasks at this end phase include determining that the project is actually complete, evaluating “the project plan in the context of project completion,” and providing tangible proof of project success. If, despite best efforts, something went wrong during the change control process, a post-mortem will need to be run, with the intent of applying lessons learned to future changes.
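The six steps above can also be viewed as a simple lifecycle in which each change request moves through a fixed sequence of states. The sketch below models that progression in Python; the state names and the strictly linear transition rule are assumptions made for illustration, not part of any formal change control standard.

# Illustrative change-request lifecycle based on the six steps described above.
# State names are assumptions for the example.
STATES = [
    "plan_and_scope",      # capture details, owner, target date
    "assess_impact_risk",  # assign a risk category, consider a backout plan
    "review_and_approve",  # change controller / board / steering committee
    "build_and_test",      # small-scale development and testing
    "implement",           # execute per the approved plan and standards
    "close",               # confirm completion, capture lessons learned
]

class ChangeRequest:
    def __init__(self, title: str, owner: str):
        self.title = title
        self.owner = owner
        self.state = STATES[0]

    def advance(self) -> str:
        """Move to the next lifecycle state; refuse to advance past closure."""
        index = STATES.index(self.state)
        if index == len(STATES) - 1:
            raise RuntimeError("Change request is already closed.")
        self.state = STATES[index + 1]
        return self.state

# Example usage: walk a request through the full lifecycle.
cr = ChangeRequest("Upgrade routing tables", owner="network team")
while cr.state != "close":
    print(cr.advance())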

In Good Manufacturing Practice (GMP) regulated industries, change control is a frequently encountered topic, and various industry guidances and commentaries are available to help practitioners understand the concept. As a common practice, the activity is usually directed by one or more standard operating procedures (SOPs). For information technology systems used in clinical trials, additional guidance comes from the U.S. Food and Drug Administration.

Change Management

Change management (sometimes abbreviated as CM) is a collective term for all approaches to prepare, support, and help individuals, teams, and organizations in making organizational change. The most common change drivers include technological evolution, process reviews, crises, changes in consumer habits, pressure from new business entrants, acquisitions, mergers, and organizational restructuring. Change management includes methods that redirect or redefine the use of resources, business processes, budget allocations, or other modes of operation that significantly change a company or organization. Organizational change management (OCM) considers the full organization and what needs to change, while change management may be used solely to refer to how people and teams are affected by such organizational transition. It deals with many different disciplines, from behavioral and social sciences to information technology and business solutions.

In a project-management context, the term “change management” may be used as an alternative to change control processes wherein changes to the scope of a project are formally introduced and approved.

Many change management models and processes have their roots in grief studies. As consultants saw a correlation between grieving from health-related issues and grieving among employees in an organization due to the loss of jobs and departments, many early change models captured the full range of human emotions as employees mourned job-related transitions.

In his work on diffusion of innovations, Everett Rogers posited that change must be understood in the context of time, communication channels, and its impact on all affected participants. Placing people at the core of change thinking was a fundamental contribution to developing the concept of change management. He proposed the descriptive Adopter groups of how people respond to change: Innovators, Early Adopters, Early Majority, Late Majority and Laggards.

McKinsey & Company consultant Julien Phillips published a change management model in 1982 in the journal Human Resource Management, though it took a decade for his change management peers to catch up with him.

Robert Marshak has since credited the Big Six accounting and consulting firms with adopting the work of early organizational change pioneers, such as Daryl Conner and Don Harrison, when they branded their reengineering services as change management in the 1980s, thereby contributing to the legitimization of a whole change management industry.

In his 1993 book, Managing at the Speed of Change, Daryl Conner coined the term ‘burning platform’ based on the 1988 North Sea Piper Alpha oil rig fire. He went on to found Conner Partners in 1994, focusing on the human performance and adoption techniques that would help ensure technology innovations were absorbed and adopted as best as possible. The first State of the Change Management Industry report was published in the Consultants News in February 1995.

Linda Ackerman Anderson states in Beyond Change Management that in the late 1980s and early 1990s, top leaders, growing dissatisfied with the failures of creating and implementing changes in a top-down fashion, created the role of the change leader to take responsibility for the human side of change.

In Australia, change management is now recognised as a formal vocation through the work of Christina Dean with the Australian government in establishing national competency standards and academic programmes from diploma to masters level.

In response to continuing reports of the failure of large-scale top-down plan-driven change programmes, innovative change practitioners have been reporting success with applying Lean and Agile principles to the field of change management.

In 2016, the Association of Change Management Professionals (ACMP) announced a new certification to enhance the profession: the Certified Change Management Professional (CCMP).

Organizational change management employs a structured approach to ensure that changes are implemented smoothly and successfully to achieve lasting benefits.

Globalization and the constant innovation of technology result in a constantly evolving business environment. Phenomena such as social media and mobile adaptability have revolutionized business, and the effect of this is an ever-increasing need for change, and therefore change management. The growth in technology also has a secondary effect of increasing the availability, and therefore the accountability, of knowledge. Easily accessible information has resulted in unprecedented scrutiny from stockholders and the media, and pressure on management. With the business environment experiencing so much change, organizations must learn to become comfortable with change as well. The ability to manage and adapt to organizational change is therefore an essential skill required in the workplace today. Yet major and rapid organizational change is profoundly difficult because the structure, culture, and routines of organizations often reflect a persistent and difficult-to-remove “imprint” of past periods, which is resistant to radical change even as the current environment of the organization changes rapidly.

Due to the growth of technology, modern organizational change is largely motivated by external innovations rather than internal factors. When these developments occur, the organizations that adapt quickest create a competitive advantage for themselves, while the companies that refuse to change get left behind. This can result in drastic profit and/or market share losses. Organizational change directly affects all departments and employees. The entire company must learn how to handle changes to the organization. The effectiveness of change management can have a strong positive or negative impact on employee morale.

There are several models of change management:

Dr. John P. Kotter, the Konosuke Matsushita Professor of Leadership, Emeritus, at the Harvard Business School, invented the 8-Step Process for Leading Change. It consists of eight stages:

The Change Management Foundation is shaped like a pyramid with project management managing technical aspects and people implementing change at the base and leadership setting the direction at the top. The Change Management Model consists of four stages:

The Plan-Do-Check-Act Cycle, created by W. Edwards Deming, is a management method for the control and continuous improvement of business processes and for choosing which changes to implement. When determining which of the latest techniques or innovations to adopt, there are four major factors to be considered:

Although there are many types of organizational change, the critical aspect is a company’s ability to win the buy-in of its employees on the change. Effectively managing organizational change is a four-step process:

As a multi-disciplinary practice that has evolved as a result of scholarly research, organizational change management should begin with a systematic diagnosis of the current situation in order to determine both the need for change and the capability to change. The objectives, content, and process of change should all be specified as part of a change management plan. Change management processes should include creative marketing to enable communication between changing audiences, as well as deep social understanding about leadership styles and group dynamics. As a visible track on transformation projects, organizational change management aligns groups’ expectations, integrates teams, and manages employee-training. It makes use of performance metrics, such as financial results, operational efficiency, leadership commitment, communication effectiveness, and the perceived need for change in order to design appropriate strategies, resolve troubled change projects, and avoid change failures.

Successful change management is more likely to occur if the following are included:

Change management faces the fundamental difficulties of integration, navigation, and human factors. It must also take into account the human aspect, where emotions and how they are handled play a significant role in implementing change successfully.

Traditionally, organizational development (OD) departments overlooked the role of infrastructure and the possibility of carrying out change through technology. Now, managers almost exclusively focus on the structural and technical components of change. Alignment and integration between strategic, social, and technical components requires collaboration between people with different skill-sets.

Managing change over time, referred to as navigation, requires continuous adaptation. It requires managing projects over time against a changing context, from inter-organizational factors to marketplace volatility. It also requires a balance in bureaucratic organizations between top-down and bottom-up management, ensuring employee empowerment and flexibility.

One of the major factors which hinders the change management process is people’s natural tendency for inertia. Just as in Newton’s first law of motion, people are resistant to change in organizations because it can be uncomfortable. The notion of doing things this way, because ‘this is the way we have always done them’, can be particularly hard to overcome. Furthermore, in cases where a company has seen declining fortunes, for a manager or executive to view themselves as a key part of the problem can be very humbling. This issue can be exacerbated in countries where “saving face” plays a large role in inter-personal relations.

To assist with this, a number of models have been developed to help identify an organization’s or individual’s readiness for change and then recommend the steps through which to move. A common example is ADKAR, an acronym that stands for Awareness, Desire, Knowledge, Ability, and Reinforcement. This model was developed by researcher and entrepreneur Jeff Hiatt in 1996 and first published in a white paper entitled The Perfect Change in 1999. Hiatt explained that the process of becoming ready for change is sequential, starting from the current level of each individual, and none of the five steps can be avoided: “they cannot be skipped or reordered”.
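As a rough illustration of the sequential nature Hiatt describes, the sketch below walks an individual’s assessment through the five ADKAR stages in order and reports the first one that is not yet satisfied. The scoring scheme, threshold, and function name are assumptions made for this example, not part of the published model.

# Illustrative walk through the ADKAR stages in their fixed order.
# Scores, the threshold, and the sample data are assumptions for this example.
from typing import Optional

ADKAR_STAGES = ["Awareness", "Desire", "Knowledge", "Ability", "Reinforcement"]

def first_gap(scores: dict, threshold: int = 3) -> Optional[str]:
    """Return the earliest stage whose score falls below the threshold.

    Because the stages are sequential and cannot be skipped or reordered,
    change work should focus on the earliest unmet stage.
    """
    for stage in ADKAR_STAGES:
        if scores.get(stage, 0) < threshold:
            return stage
    return None  # all stages satisfied

# Example: strong awareness and desire, but knowledge is still lacking.
assessment = {"Awareness": 5, "Desire": 4, "Knowledge": 2, "Ability": 1, "Reinforcement": 1}
print(first_gap(assessment))  # -> Knowledge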

As change management becomes more necessary in the business cycle of organizations, it is beginning to be taught as its own academic discipline at universities. There are a growing number of universities with research units dedicated to the study of organizational change.

Business Process Management

Business process management (BPM) is a discipline in operations management in which people use various methods to discover, model, analyze, measure, improve, optimize, and automate business processes. Any combination of methods used to manage a company’s business processes is BPM. Processes can be structured and repeatable or unstructured and variable. Though not required, enabling technologies are often used with BPM.

It can be differentiated from program management in that program management is concerned with managing a group of inter-dependent projects. From another viewpoint, process management includes program management. In project management, process management is the use of a repeatable process to improve the outcome of the project.

Key distinctions between process management and project management are repeatability and predictability. If the structure and sequence of work is unique, it is a project. In business process management, the sequence of work can vary from instance to instance: there are gateways, conditions, business rules, and so on. The key is predictability: no matter how many forks in the road, we know all of them in advance, and we understand the conditions under which the process takes one route or another. If this condition is met, we are dealing with a process.
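To make the predictability point concrete, the following sketch models a tiny order-handling process in which every possible branch is known in advance and chosen by explicit conditions. The process steps, field names, and thresholds are invented purely for this illustration.

# A tiny, fully enumerable process: every gateway and its condition is known
# in advance, so each instance follows one of a predictable set of routes.
# Order fields and thresholds are assumptions for the example.
def route_order(order: dict) -> list:
    steps = ["receive_order"]
    if order["amount"] > 10_000:           # gateway 1: high-value approval
        steps.append("manager_approval")
    if order["customer_type"] == "new":    # gateway 2: credit check for new customers
        steps.append("credit_check")
    steps += ["fulfil_order", "send_invoice"]
    return steps

print(route_order({"amount": 12_500, "customer_type": "new"}))
# -> ['receive_order', 'manager_approval', 'credit_check', 'fulfil_order', 'send_invoice']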

As an approach, BPM sees processes as important assets of an organization that must be understood, managed, and developed to deliver value-added products and services to clients or customers. This approach closely resembles other total quality management or continual improvement process methodologies.

ISO 9000 promotes the process approach to managing an organization.

…promotes the adoption of a process approach when developing, implementing and improving the effectiveness of a quality management system, to enhance customer satisfaction by meeting customer requirements.

BPM proponents also claim that this approach can be supported, or enabled, through technology. As such, many BPM articles and scholars frequently discuss BPM from one of two viewpoints: people and/or technology.

BPMInstitute defined Business process management as:

The Workflow Management Coalition, BPM.com and several other sources use the following definition:

The Association of Business Process Management Professionals defines BPM as:

Gartner defines business process management as:

It is common to confuse BPM with a BPM suite (BPMS). BPM is a professional discipline done by people, whereas a BPMS is a technological suite of tools designed to help the BPM professionals accomplish their goals. BPM should also not be confused with an application or solution developed to support a particular process. Suites and solutions represent ways of automating business processes, but automation is only one aspect of BPM.

The concept of business process may be as traditional as concepts of tasks, department, production, and outputs, arising from job shop scheduling problems in the early 20th century. The management and improvement approach as of 2010, with formal definitions and technical modeling, has been around since the early 1990s (see business process modeling). Note that the term “business process” is sometimes used by IT practitioners as synonymous with the management of middleware processes or with integrating application software tasks.

Although BPM initially focused on the automation of business processes with the use of information technology, it has since been extended to integrate human-driven processes in which human interaction takes place in series or in parallel with the use of technology. For example, workflow management systems can assign individual steps requiring human intuition or judgment to the relevant humans, and other tasks in a workflow to a relevant automated system.

More recent variations such as “human interaction management” are concerned with the interaction between human workers performing a task.

As of 2010, technology has allowed the coupling of BPM with other methodologies, such as Six Sigma. Some BPM tools such as SIPOCs, process flows, RACIs, CTQs and histograms allow users to:

This brings with it the benefit of being able to simulate changes to business processes based on real-world data (not just on assumed knowledge). Also, the coupling of BPM to industry methodologies allows users to continually streamline and optimize the process to ensure that it is tuned to its market need.

As of 2012, research on BPM has paid increasing attention to the compliance of business processes. Although a key aspect of business processes is flexibility, as business processes continuously need to adapt to changes in the environment, compliance with business strategy, policies, and government regulations should also be ensured. The compliance aspect in BPM is highly important for governmental organizations; as of 2010, BPM approaches in a governmental context largely focus on operational processes and knowledge representation. There have been many technical studies on operational business processes in the public and private sectors, but researchers rarely take legal compliance activities into account—for instance, the legal implementation processes in public-administration bodies.

Business process management activities can be arbitrarily grouped into categories such as design, modeling, execution, monitoring, and optimization.

Process design encompasses both the identification of existing processes and the design of “to-be” processes. Areas of focus include representation of the process flow, the factors within it, alerts and notifications, escalations, standard operating procedures, service level agreements, and task hand-over mechanisms. Whether or not existing processes are considered, the aim of this step is to ensure a correct and efficient new design.

The proposed improvement could be in human-to-human, human-to-system, or system-to-system workflows, and might target regulatory, market, or competitive challenges faced by the business. Existing processes and the design of a new process for various applications must be synchronized so as not to cause a major outage or process interruption.

Modeling takes the theoretical design and introduces combinations of variables (e.g., changes in rent or materials costs, which determine how the process might operate under different circumstances).

It may also involve running “what-if” analyses on the processes, for example: “What if I have only 75% of the resources to do the same task?” or “What if I want to do the same job for 80% of the current cost?”
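A what-if analysis of this kind can be approximated with a very simple capacity model. The sketch below estimates how throughput and cost per item might shift when resources are scaled; the linear-scaling assumption and all figures are invented for illustration.

# Toy what-if model: scale available resources and observe the effect on
# throughput and unit cost. All figures and the linear assumption are invented.
def what_if(base_throughput: float, base_cost: float, resource_factor: float) -> dict:
    """Assume throughput scales linearly with resources and total cost stays flat."""
    throughput = base_throughput * resource_factor
    cost_per_item = base_cost / throughput if throughput else float("inf")
    return {"throughput_per_day": throughput, "cost_per_item": round(cost_per_item, 2)}

baseline = what_if(base_throughput=200, base_cost=5000, resource_factor=1.0)
reduced = what_if(base_throughput=200, base_cost=5000, resource_factor=0.75)
print(baseline)  # {'throughput_per_day': 200.0, 'cost_per_item': 25.0}
print(reduced)   # {'throughput_per_day': 150.0, 'cost_per_item': 33.33}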

Business process execution is broadly about enacting a discovered and modeled business process. Enacting a business process is done manually or automatically or with a combination of manual and automated business tasks. Manual business processes are human-driven. Automated business processes are software-driven. Business process automation encompasses methods and software deployed for automating business processes.

Business process automation is performed and orchestrated at the business process layer or the consumer presentation layer of an SOA reference architecture. BPM software suites such as BPMS, iBPMS, and low-code platforms are positioned at the business process layer, while emerging robotic process automation software performs business process automation at the presentation layer and is therefore considered non-invasive to, and decoupled from, existing application systems.

One of the ways to automate processes is to develop or purchase an application that executes the required steps of the process; however, in practice, these applications rarely execute all the steps of the process accurately or completely. Another approach is to use a combination of software and human intervention; however this approach is more complex, making the documentation process difficult.

In response to these problems, companies have developed software that defines the full business process (as developed in the process design activity) in a computer language that a computer can directly execute. Process models can be run through execution engines that automate the processes directly from the model (e.g., calculating a repayment plan for a loan) or, when a step is too complex to automate, Business Process Modeling Notation (BPMN) provides front-end capability for human input. Compared to either of the previous approaches, directly executing a process definition can be more straightforward and therefore easier to improve. However, automating a process definition requires flexible and comprehensive infrastructure, which typically rules out implementing these systems in a legacy IT environment.

Business rules have been used by systems to provide definitions for governing behavior, and a business rule engine can be used to drive process execution and resolution.
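The idea of rules driving execution can be sketched very simply: each rule pairs a condition with an action, and an engine evaluates the rules against a process instance to decide what happens next. The rules, field names, and thresholds below are invented for this example and do not reflect any particular rule engine product.

# Minimal rule-driven step selection. Rules, conditions, and actions are
# invented for illustration; real business rule engines are far richer.
RULES = [
    {"name": "auto_approve_small", "condition": lambda c: c["amount"] <= 1_000, "action": "auto_approve"},
    {"name": "escalate_large", "condition": lambda c: c["amount"] > 10_000, "action": "escalate_to_manager"},
    {"name": "default_review", "condition": lambda c: True, "action": "manual_review"},
]

def next_action(case: dict) -> str:
    """Return the action of the first rule whose condition matches the case."""
    for rule in RULES:
        if rule["condition"](case):
            return rule["action"]
    raise RuntimeError("No rule matched")  # unreachable while the default rule exists

print(next_action({"amount": 500}))     # -> auto_approve
print(next_action({"amount": 50_000}))  # -> escalate_to_manager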

Monitoring encompasses the tracking of individual processes, so that information on their state can be easily seen, and statistics on the performance of one or more processes can be provided. An example of this tracking is being able to determine the state of a customer order (e.g. order arrived, awaiting delivery, invoice paid) so that problems in its operation can be identified and corrected.

In addition, this information can be used to work with customers and suppliers to improve their connected processes. Examples are the generation of measures on how quickly a customer order is processed or how many orders were processed in the last month. These measures tend to fit into three categories: cycle time, defect rate and productivity.
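These three categories of measures can be derived directly from process event records. The sketch below computes them from a small, made-up set of completed orders; the record layout and figures are assumptions for the example.

# Compute cycle time, defect rate, and productivity from completed-order
# records. The record fields and sample data are invented for illustration.
from datetime import datetime

orders = [
    {"received": datetime(2024, 3, 1), "shipped": datetime(2024, 3, 4), "defective": False},
    {"received": datetime(2024, 3, 2), "shipped": datetime(2024, 3, 8), "defective": True},
    {"received": datetime(2024, 3, 3), "shipped": datetime(2024, 3, 5), "defective": False},
]

cycle_times = [(o["shipped"] - o["received"]).days for o in orders]
avg_cycle_time = sum(cycle_times) / len(cycle_times)             # days per order
defect_rate = sum(o["defective"] for o in orders) / len(orders)  # share of defective orders
productivity = len(orders) / 31                                  # orders per day over a 31-day month

print(f"average cycle time: {avg_cycle_time:.1f} days")
print(f"defect rate: {defect_rate:.0%}")
print(f"productivity: {productivity:.2f} orders/day")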

The degree of monitoring depends on what information the business wants to evaluate and analyze and how the business wants it monitored, in real-time, near real-time or ad hoc. Here, business activity monitoring (BAM) extends and expands the monitoring tools generally provided by BPMS.

Process mining is a collection of methods and tools related to process monitoring. The aim of process mining is to analyze event logs extracted through process monitoring and to compare them with an a priori process model. Process mining allows process analysts to detect discrepancies between the actual process execution and the a priori model as well as to analyze bottlenecks.
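A very small conformance check of the kind process mining performs can be expressed as comparing the transitions observed in event-log traces against the transitions allowed by the a priori model. The model and traces below are invented for the illustration; real process mining tools work on far richer logs and models.

# Toy conformance check: flag log traces that use transitions the a priori
# process model does not allow. The model and traces are invented examples.
ALLOWED = {
    ("receive", "check"), ("check", "approve"), ("check", "reject"),
    ("approve", "ship"), ("ship", "invoice"),
}

def deviations(trace: list) -> list:
    """Return the transitions in the trace that are absent from the model."""
    steps = list(zip(trace, trace[1:]))
    return [step for step in steps if step not in ALLOWED]

event_log = [
    ["receive", "check", "approve", "ship", "invoice"],  # conforming trace
    ["receive", "approve", "ship", "invoice"],           # skipped the check step
]
for trace in event_log:
    print(trace, "->", deviations(trace) or "conforms")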

Predictive business process monitoring concerns the application of data mining, machine learning, and other forecasting techniques to predict what is going to happen with running instances of a business process, making it possible to forecast future cycle time, compliance issues, and the like. Techniques for predictive business process monitoring include support vector machines, deep learning approaches, and random forests.
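As a hedged sketch of how such a prediction might look in practice, the example below trains a random forest on made-up features of finished process instances and predicts the remaining cycle time of a running one. It assumes scikit-learn is installed; the features, data, and target values are fabricated purely for illustration.

# Illustrative predictive process monitoring with a random forest regressor.
# Assumes scikit-learn is available; all data and features are invented.
from sklearn.ensemble import RandomForestRegressor

# Features per historical instance: [steps completed, elapsed hours, order amount]
X_train = [[2, 5, 1200], [4, 20, 300], [3, 12, 8000], [5, 30, 450], [2, 8, 2500]]
y_train = [40, 10, 25, 4, 35]  # remaining cycle time in hours (made up)

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

running_instance = [[3, 10, 5000]]  # a currently running case
predicted_remaining = model.predict(running_instance)[0]
print(f"predicted remaining cycle time: {predicted_remaining:.1f} hours")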

Process optimization includes retrieving process performance information from modeling or monitoring phase; identifying the potential or actual bottlenecks and the potential opportunities for cost savings or other improvements; and then, applying those enhancements in the design of the process. Process mining tools are able to discover critical activities and bottlenecks, creating greater business value.
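Bottleneck identification of this kind can be illustrated by aggregating how long cases spend in each activity and selecting the worst offender. The activity names and durations below are invented for the example.

# Find the activity where cases spend the most time on average.
# Activity names and durations (in hours) are invented for illustration.
from collections import defaultdict

events = [
    ("case1", "check", 2), ("case1", "approve", 30), ("case1", "ship", 4),
    ("case2", "check", 3), ("case2", "approve", 26), ("case2", "ship", 5),
]

totals, counts = defaultdict(float), defaultdict(int)
for _case, activity, hours in events:
    totals[activity] += hours
    counts[activity] += 1

averages = {activity: totals[activity] / counts[activity] for activity in totals}
bottleneck = max(averages, key=averages.get)
print(averages)                   # {'check': 2.5, 'approve': 28.0, 'ship': 4.5}
print("bottleneck:", bottleneck)  # -> approve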

When the process becomes too complex or inefficient, and optimization does not deliver the desired output, a company steering committee chaired by the president/CEO will usually recommend re-engineering the entire process cycle. Business process reengineering (BPR) has been used by organizations to attempt to achieve efficiency and productivity at work.

A market has developed for enterprise software that leverages business process management concepts to organize and automate processes. The recent convergence of this software from distinct pieces such as business rules engines, business process modelling, business activity monitoring, and human workflow has given birth to integrated business process management suites. Forrester Research, Inc. views the BPM suite space through three different lenses:

However, integration-centric and document-centric offerings have since matured into separate, standalone markets.

Rapid application development using no-code/low-code principles is becoming an ever more prevalent feature of BPMS platforms. RAD enables businesses to deploy applications more quickly and more cost-effectively, while also offering improved change and version management. Gartner notes that as businesses embrace these systems, their budgets rely less on the maintenance of existing systems and show more investment in growing and transforming them.

While the steps can be viewed as a cycle, economic or time constraints are likely to limit the process to only a few iterations. This is often the case when an organization uses the approach for short to medium term objectives rather than trying to transform the organizational culture. True iterations are only possible through the collaborative efforts of process participants. In a majority of organizations, complexity requires enabling technology (see below) to support the process participants in these daily process management challenges.

Many organizations start a BPM project or program with the objective of optimizing an area that has been identified for improvement.

Currently, the international standards for the task have largely limited BPM to application in the IT sector, and ISO/IEC 15944 covers the operational aspects of the business. However, some corporations with a culture of best practices do use standard operating procedures to regulate their operational processes. Other standards are currently being worked on to assist in BPM implementation (BPMN, enterprise architecture, Business Motivation Model).

BPM is now considered a critical component of operational intelligence (OI) solutions to deliver real-time, actionable information. This real-time information can be acted upon in a variety of ways: alerts can be sent, or executive decisions can be made using real-time dashboards. OI solutions use real-time information to take automated action based on pre-defined rules so that security measures and/or exception management processes can be initiated. Because the size and complexity of daily tasks often requires the use of technology to model efficiently, and because such technology has become widely available to businesses and their staff, BPM has come to be seen by many as the bridge between information technology (IT) and the business.

There are four critical components of a BPM Suite:

BPM also addresses many of the critical IT issues underpinning these business drivers, including:

Validation of BPMS is another technical issue that vendors and users must be aware of, if regulatory compliance is mandatory. The validation task could be performed either by an authenticated third party or by the users themselves. Either way, validation documentation must be generated. The validation document usually can either be published officially or retained by users.

Cloud computing business process management is the use of BPM tools that are delivered as software as a service (SaaS) over a network. Cloud BPM business logic is deployed on an application server and the business data resides in cloud storage.

According to Gartner, 20% of all “shadow business processes” are supported by BPM cloud platforms. Gartner uses this term for the hidden organizational processes that are supported by IT departments as part of legacy business processes, such as Excel spreadsheets, rule-based email routing, and phone-call routing. These can, of course, also be replaced by other technologies such as workflow and smart-form software.

The benefits of using cloud BPM services include removing the need and cost of maintaining specialized technical skill sets in-house and reducing distractions from an enterprise’s main focus. They offer controlled IT budgeting and enable geographical mobility.

The emerging Internet of things poses a significant challenge to controlling and managing the flow of information through large numbers of devices. To cope with this, a new direction known as BPM Everywhere shows promise as a way of blending traditional process techniques with additional capabilities for automating the handling of all the independent devices.

Agile Software Development

Agile software development comprises various approaches to software development under which requirements and solutions evolve through the collaborative effort of self-organizing and cross-functional teams and their customer(s)/end user(s). It advocates adaptive planning, evolutionary development, early delivery, and continual improvement, and it encourages rapid and flexible response to change.

The term agile (sometimes written Agile) was popularized, in this context, by the Manifesto for Agile Software Development. The values and principles espoused in this manifesto were derived from and underpin a broad range of software development frameworks, including Scrum and Kanban.

While there is much anecdotal evidence that adopting agile practices and values improves the agility of software professionals, teams and organizations, some empirical studies have disputed that evidence.

Iterative and incremental development methods can be traced back as early as 1957, with evolutionary project management and adaptive software development emerging in the early 1970s.

During the 1990s, a number of lightweight software development methods evolved in reaction to the prevailing heavyweight methods that critics described as overly regulated, planned, and micro-managed. These included: rapid application development (RAD), from 1991; the unified process (UP) and dynamic systems development method (DSDM), both from 1994; Scrum, from 1995; Crystal Clear and extreme programming (XP), both from 1996; and feature-driven development, from 1997. Although these all originated before the publication of the Agile Manifesto, they are now collectively referred to as agile software development methods. At the same time, similar changes were underway in manufacturing and aerospace.

In 2001, seventeen software developers met at a resort in Snowbird, Utah, to discuss these lightweight development methods: Kent Beck, Ward Cunningham, Dave Thomas, Jeff Sutherland, Ken Schwaber, Jim Highsmith, Alistair Cockburn, Robert C. Martin, Mike Beedle, Arie van Bennekum, Martin Fowler, James Grenning, Andrew Hunt, Ron Jeffries, Jon Kern, Brian Marick, and Steve Mellor. Together they published the Manifesto for Agile Software Development.

In 2005, a group headed by Cockburn and Highsmith wrote an addendum of project management principles, the PM Declaration of Interdependence, to guide software project management according to agile software development methods.

In 2009, a group working with Martin wrote an extension of software development principles, the Software Craftsmanship Manifesto, to guide agile software development according to professional conduct and mastery.

In 2011, the Agile Alliance created the Guide to Agile Practices (renamed the Agile Glossary in 2016), an evolving open-source compendium of the working definitions of agile practices, terms, and elements, along with interpretations and experience guidelines from the worldwide community of agile practitioners.

Based on their combined experience of developing software and helping others do that, the seventeen signatories to the manifesto proclaimed that they value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is to say, the items on the left are valued more than the items on the right.

As Scott Ambler elucidated:

Some of the authors formed the Agile Alliance, a non-profit organization that promotes software development according to the manifesto’s values and principles. Introducing the manifesto on behalf of the Agile Alliance, Jim Highsmith said,

The Agile movement is not anti-methodology, in fact many of us want to restore credibility to the word methodology. We want to restore a balance. We embrace modeling, but not in order to file some diagram in a dusty corporate repository. We embrace documentation, but not hundreds of pages of never-maintained and rarely-used tomes. We plan, but recognize the limits of planning in a turbulent environment. Those who would brand proponents of XP or SCRUM or any of the other Agile Methodologies as “hackers” are ignorant of both the methodologies and the original definition of the term hacker.

The Manifesto for Agile Software Development is based on twelve principles:

Most agile development methods break product development work into small increments that minimize the amount of up-front planning and design. Iterations, or sprints, are short time frames (timeboxes) that typically last from one to four weeks. Each iteration involves a cross-functional team working in all functions: planning, analysis, design, coding, unit testing, and acceptance testing. At the end of the iteration a working product is demonstrated to stakeholders. This minimizes overall risk and allows the product to adapt to changes quickly. An iteration might not add enough functionality to warrant a market release, but the goal is to have an available release (with minimal bugs) at the end of each iteration. Multiple iterations might be required to release a product or new features. Working software is the primary measure of progress.

The principle of co-location is that co-workers on the same team should be situated together to better establish their identity as a team and to improve communication. This enables face-to-face interaction, ideally in front of a whiteboard, which reduces the cycle time typically taken when questions and answers are mediated through phone, persistent chat, wiki, or email.

No matter which development method is followed, every team should include a customer representative (“Product Owner” in Scrum). This person is agreed by stakeholders to act on their behalf and makes a personal commitment to being available for developers to answer questions throughout the iteration. At the end of each iteration, stakeholders and the customer representative review progress and re-evaluate priorities with a view to optimizing the return on investment (ROI) and ensuring alignment with customer needs and company goals.

In agile software development, an information radiator is a (normally large) physical display located prominently near the development team, where passers-by can see it. It presents an up-to-date summary of the product development status. A build light indicator may also be used to inform a team about the current status of their product development.

A common characteristic in agile software development is the daily stand-up (also known as the daily scrum). In a brief session, team members report to each other what they did the previous day toward their team’s iteration goal, what they intend to do today toward the goal, and any roadblocks or impediments they can see to the goal.

Specific tools and techniques, such as continuous integration, automated unit testing, pair programming, test-driven development, design patterns, behavior-driven development, domain-driven design, code refactoring and other techniques are often used to improve quality and enhance product development agility. This is predicated on designing and building quality in from the beginning and being able to demonstrate software for customers at any point, or at least at the end of every iteration.

Compared to traditional software engineering, agile software development mainly targets complex systems and product development with dynamic, non-deterministic and non-linear characteristics. Accurate estimates, stable plans, and predictions are often hard to get in early stages, and confidence in them is likely to be low. Agile practitioners will seek to reduce the leap-of-faith that is needed before any evidence of value can be obtained. Requirements and design are held to be emergent. Big up-front specifications would probably cause a lot of waste in such cases, i.e., are not economically sound. These basic arguments and previous industry experiences, learned from years of successes and failures, have helped shape agile development’s favor of adaptive, iterative and evolutionary development.

Development methods exist on a continuum from adaptive to predictive. Agile software development methods lie on the adaptive side of this continuum. One key of adaptive development methods is a rolling wave approach to schedule planning, which identifies milestones but leaves flexibility in the path to reach them, and also allows for the milestones themselves to change.

Adaptive methods focus on adapting quickly to changing realities. When the needs of a project change, an adaptive team changes as well. An adaptive team has difficulty describing exactly what will happen in the future. The further away a date is, the more vague an adaptive method is about what will happen on that date. An adaptive team cannot report exactly what tasks they will do next week, but only which features they plan for next month. When asked about a release six months from now, an adaptive team might be able to report only the mission statement for the release, or a statement of expected value vs. cost.

Predictive methods, in contrast, focus on analysing and planning the future in detail and cater for known risks. In the extremes, a predictive team can report exactly what features and tasks are planned for the entire length of the development process. Predictive methods rely on effective early phase analysis and if this goes very wrong, the project may have difficulty changing direction. Predictive teams often institute a change control board to ensure they consider only the most valuable changes.

Risk analysis can be used to choose between adaptive (agile or value-driven) and predictive (plan-driven) methods. Barry Boehm and Richard Turner suggest that each side of the continuum has its own home ground, as follows:

One of the differences between agile software development methods and waterfall is the approach to quality and testing. In the waterfall model, there is always a separate testing phase after a build phase; however, in agile software development testing is completed in the same iteration as programming.

Another difference is that traditional “waterfall” software development moves a project through various Software Development Lifecycle (SDLC) phases. One phase is completed in its entirety before moving on to the next phase.

Because testing is done in every iteration—which develops a small piece of the software—users can frequently use those new pieces of software and validate the value. After the users know the real value of the updated piece of software, they can make better decisions about the software’s future. Having a value retrospective and software re-planning session in each iteration—Scrum typically has iterations of just two weeks—helps the team continuously adapt its plans so as to maximize the value it delivers. This follows a pattern similar to the PDCA cycle, as the work is planned, done, checked (in the review and retrospective), and any changes agreed are acted upon.

This iterative approach supports a product rather than a project mindset and provides greater flexibility throughout the development process, whereas on projects the requirements are defined and locked down from the very beginning, making them difficult to change later. Iterative product development allows the software to evolve in response to changes in the business environment or market requirements.

Because of the short iteration style of agile software development, it also has strong connections with the lean startup concept.

In a letter to IEEE Computer, Steven Rakitin expressed cynicism about agile software development, calling it “yet another attempt to undermine the discipline of software engineering” and translating “working software over comprehensive documentation” as “we want to spend all our time coding. Remember, real programmers don’t write documentation.”

This is disputed by proponents of agile software development, who state that developers should write documentation if that is the best way to achieve the relevant goals, but that there are often better ways to achieve those goals than writing static documentation.
Scott Ambler states that documentation should be “just barely good enough” (JBGE), that too much or comprehensive documentation would usually cause waste, and developers rarely trust detailed documentation because it’s usually out of sync with code, while too little documentation may also cause problems for maintenance, communication, learning and knowledge sharing. Alistair Cockburn wrote of the Crystal Clear method:

Crystal considers development a series of co-operative games, and intends that the documentation is enough to help the next win at the next game. The work products for Crystal include use cases, risk list, iteration plan, core domain models, and design notes to inform on choices…however there are no templates for these documents and descriptions are necessarily vague, but the objective is clear, just enough documentation for the next game. I always tend to characterize this to my team as: what would you want to know if you joined the team tomorrow.

Agile software development methods support a broad range of the software development life cycle. Some focus on the practices (e.g., XP, pragmatic programming, agile modeling), while some focus on managing the flow of work (e.g., Scrum, Kanban). Some support activities for requirements specification and development (e.g., FDD), while some seek to cover the full development life cycle (e.g., DSDM, RUP).

Notable agile software development frameworks include:

Agile software development is supported by a number of concrete practices, covering areas like requirements, design, modeling, coding, testing, planning, risk management, process, quality, etc. Some notable agile software development practices include:

In the literature, different terms refer to the notion of method adaptation, including ‘method tailoring’, ‘method fragment adaptation’ and ‘situational method engineering’. Method tailoring is defined as:

A process or capability in which human agents determine a system development approach for a specific project situation through responsive changes in, and dynamic interplays between contexts, intentions, and method fragments.

Situation-appropriateness should be considered as a distinguishing characteristic between agile methods and more plan-driven software development methods, with agile methods allowing product development teams to adapt working practices according to the needs of individual products. Potentially, most agile methods could be suitable for method tailoring, such as DSDM tailored in a CMM context and XP tailored with the Rule Description Practices (RDP) technique. Not all agile proponents agree, however, with Schwaber noting “that is how we got into trouble in the first place, thinking that the problem was not having a perfect methodology. Efforts [should] center on the changes [needed] in the enterprise”. Bas Vodde reinforced this viewpoint, suggesting that unlike traditional, large methodologies that require you to pick and choose elements, Scrum provides the basics on top of which you add additional elements to localise and contextualise its use. Practitioners seldom use system development methods, or agile methods specifically, by the book, often choosing to omit or tailor some of the practices of a method in order to create an in-house method.

In practice, methods can be tailored using various tools. Generic process modeling languages such as Unified Modeling Language can be used to tailor software development methods. However, dedicated tools for method engineering such as the Essence Theory of Software Engineering of SEMAT also exist.

Agile software development has been widely seen as highly suited to certain types of environments, including small teams of experts working on greenfield projects, and the challenges and limitations encountered in the adoption of agile software development methods in a large organization with legacy infrastructure are well-documented and understood.

In response, a range of strategies and patterns has evolved for overcoming challenges with large-scale development efforts (>20 developers) or distributed (non-colocated) development teams, amongst other challenges; and there are now several recognised frameworks that seek to mitigate or avoid these challenges.

There are many conflicting viewpoints on whether all of these are effective or indeed fit the definition of agile development, and this remains an active and ongoing area of research.

When agile software development is applied in a distributed setting (with teams dispersed across multiple business locations), it is commonly referred to as distributed agile development. The goal is to leverage the unique benefits offered by each approach. Distributed development allows organizations to build software by strategically setting up teams in different parts of the globe, virtually building software round-the-clock (more commonly referred to as follow-the-sun model). On the other hand, agile development provides increased transparency, continuous feedback and more flexibility when responding to changes.

Agile software development methods were initially seen as best suited to non-critical product development and were therefore excluded from use in regulated domains such as the medical device, pharmaceutical, financial, nuclear, automotive, and avionics sectors. However, in the last several years, there have been several initiatives to adapt agile methods for these domains.

There are numerous standards that may apply in regulated domains, including ISO 26262, ISO 9000, ISO 9001, and ISO/IEC 15504.
A number of key concerns are of particular importance in regulated domains:

Although agile software development methods can be used with any programming paradigm or language in practice, they were originally closely associated with object-oriented environments such as Smalltalk and Lisp and later Java. The initial adopters of agile methods were usually small to medium-sized teams working on unprecedented systems with requirements that were difficult to finalize and likely to change as the system was being developed. This section describes common problems that organizations encounter when they try to adopt agile software development methods as well as various techniques to measure the quality and performance of agile teams.

The best agile practitioners have always emphasized thorough engineering principles. As a result, there are a number of best practices and tools for measuring the performance of agile software development and teams.

The Agility measurement index, amongst others, rates developments against five dimensions of product development (duration, risk, novelty, effort, and interaction). Other techniques are based on measurable goals and one study suggests that velocity can be used as a metric of agility. There are also agile self-assessments to determine whether a team is using agile software development practices (Nokia test, Karlskrona test, 42 points test).
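Velocity, mentioned above as a possible agility metric, is commonly computed as the amount of work (for example, story points) a team completes per iteration. The sketch below averages completed points over recent iterations; the figures and the use of story points are assumptions made for this example.

# Illustrative velocity calculation: average story points completed per
# iteration over the most recent iterations. Figures are invented examples.
completed_points = [21, 18, 25, 23]  # points finished in each of the last four iterations

def velocity(points_per_iteration: list, window: int = 3) -> float:
    """Average the most recent `window` iterations to smooth out noise."""
    recent = points_per_iteration[-window:]
    return sum(recent) / len(recent)

print(f"velocity: {velocity(completed_points):.1f} points per iteration")
# A team might use this figure to forecast how many iterations a backlog of,
# say, 120 points would take: 120 / velocity.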

One of the early studies reporting gains in quality, productivity, and business satisfaction from using agile software development methods was a survey conducted by Shine Technologies from November 2002 to January 2003.

A similar survey, the State of Agile, has been conducted every year since 2006 with thousands of participants from around the software development community. This tracks trends in the benefits of agility, lessons learned, and good practices. Each survey has reported increasing numbers saying that agile software development helps them deliver software faster, improves their ability to manage changing customer priorities, and increases their productivity. Surveys have also consistently shown better results with agile product development methods compared to classical project management. On balance, there are reports that some feel that agile development methods are still too young to enable extensive academic research of their success.

Organizations and teams implementing agile software development often face difficulties transitioning from more traditional methods such as waterfall development, such as teams having an agile process forced on them. These are often termed agile anti-patterns or more commonly agile smells. Below are some common examples:

A goal of agile software development is to focus more on producing working software and less on documentation. This is in contrast to waterfall models where the process is often highly controlled and minor changes to the system require significant revision of supporting documentation. However, this does not justify completely doing without any analysis or design at all. Failure to pay attention to design can cause a team to proceed rapidly at first but then to have significant rework required as they attempt to scale up the system. One of the key features of agile software development is that it is iterative. When done correctly design emerges as the system is developed and commonalities and opportunities for re-use are discovered.

In agile software development, stories (similar to use case descriptions) are typically used to define requirements and an iteration is a short period of time during which the team commits to specific goals. Adding stories to an iteration in progress is detrimental to a good flow of work. These should be added to the product backlog and prioritized for a subsequent iteration or in rare cases the iteration could be cancelled.

This does not mean that a story cannot expand. Teams must deal with new information, which may produce additional tasks for a story. If the new information prevents the story from being completed during the iteration, then it should be carried over to a subsequent iteration. However, it should be prioritized against all remaining stories, as the new information may have changed the story’s original priority.

Agile software development is often implemented as a grassroots effort in organizations by software development teams trying to optimize their development processes and ensure consistency in the software development life cycle. By not having sponsor support, teams may face difficulties and resistance from business partners, other development teams and management. Additionally, they may suffer without appropriate funding and resources. This increases the likelihood of failure.

A survey performed by VersionOne found that respondents cited insufficient training as the most significant cause of failed agile implementations. Teams have fallen into the trap of assuming that the reduced processes of agile software development, compared to other methodologies such as waterfall, mean that there are no actual rules for agile software development.

The product owner is responsible for representing the business in the development activity and is often the most demanding role.

A common mistake is to have the product owner role filled by someone from the development team. This requires the team to make its own decisions on prioritization without real feedback from the business. They try to solve business issues internally or delay work as they reach outside the team for direction. This often leads to distraction and a breakdown in collaboration.

Agile software development requires teams to meet product commitments, which means they should focus only on work for that product. However, team members who appear to have spare capacity are often expected to take on other work, which makes it difficult for them to help complete the work to which their team had committed.

Teams may fall into the trap of spending too much time preparing or planning. This is a common trap for teams less familiar with agile software development where the teams feel obliged to have a complete understanding and specification of all stories. Teams should be prepared to move forward only with those stories in which they have confidence, then during the iteration continue to discover and prepare work for subsequent iterations (often referred to as backlog refinement or grooming).

A daily standup should be a focused, timely meeting where all team members disseminate information. If problem-solving occurs, it often can only involve certain team members and is potentially not the best use of the entire team’s time. If during the daily standup the team starts diving into problem-solving, it should be set aside until a sub-team can discuss it, usually immediately after the standup completes.

One of the intended benefits of agile software development is to empower the team to make choices, as they are closest to the problem. Additionally, they should make choices as close to implementation as possible, to use more timely information in the decision. If team members are assigned tasks by others or too early in the process, the benefits of localized and timely decision making can be lost.

Being assigned work also constrains team members into certain roles (for example, team member A must always do the database work), which limits opportunities for cross-training. Team members themselves can choose to take on tasks that stretch their abilities and provide cross-training opportunities.

Another common pitfall is for a scrum master to act as a contributor. While not prohibited by the Scrum methodology, the scrum master needs to ensure they have the capacity to act in the role of scrum master first, rather than working on development tasks. A scrum master's role is to facilitate the process rather than create the product.

A scrum master who also multitasks may incur too many context switches to be productive. Additionally, because a scrum master is responsible for ensuring roadblocks are removed so that the team can make forward progress, the benefit gained by individual tasks moving forward may not outweigh the cost of roadblocks that are deferred due to lack of capacity.

Due to the iterative nature of agile development, multiple rounds of testing are often needed. Automated testing helps reduce the impact of repeated unit, integration, and regression tests and frees developers and testers to focus on higher value work.

Test automation also supports continued refactoring required by iterative software development. Allowing a developer to quickly run tests to confirm refactoring has not modified the functionality of the application may reduce the workload and increase confidence that cleanup efforts have not introduced new defects.
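As a minimal sketch of this idea (pytest is assumed here purely as the test runner, and apply_discount is a hypothetical function under refactoring), an automated regression test pins down existing behaviour so a refactoring can be verified in seconds:

```python
# test_pricing.py -- illustrative only; `apply_discount` is a hypothetical
# function being refactored, and pytest is assumed to be installed.
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Current implementation; the refactored version must preserve this behaviour."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_typical_discount():
    assert apply_discount(200.0, 25) == 150.0

def test_zero_discount_is_identity():
    assert apply_discount(99.99, 0) == 99.99

def test_invalid_percent_rejected():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```

Running the suite (for example with `pytest test_pricing.py`) after each refactoring step gives quick confirmation that the cleanup has not changed observable behaviour.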

Focusing on delivering new functionality may result in increased technical debt. The team must allow themselves time for defect remediation and refactoring. Technical debt hinders planning abilities by increasing the amount of unscheduled work as production defects distract the team from further progress.

As the system evolves, it is important to refactor, because the entropy of the system naturally increases. Over time, the lack of constant maintenance causes increasing defects and development costs.

A common misconception is that agile software development allows continuous change; however, an iteration backlog is an agreement of what work can be completed during an iteration. Having too much work-in-progress (WIP) results in inefficiencies such as context-switching and queueing. The team must avoid feeling pressured into taking on additional work.

Agile software development fixes time (iteration duration), quality, and ideally resources in advance (though maintaining fixed resources may be difficult if developers are often pulled away from tasks to handle production incidents), while the scope remains variable. The customer or product owner often pushes for a fixed scope for an iteration. However, teams should be reluctant to commit to locked time, resources and scope (commonly known as the project management triangle). Efforts to add scope to the fixed time and resources of agile software development may result in decreased quality.
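A minimal sketch of the trade-off (the fits_in_iteration helper, the use of story points, and the numbers below are illustrative assumptions): with time and resources fixed, added scope either displaces committed work or silently degrades quality, so a simple capacity check makes the constraint explicit.

```python
def fits_in_iteration(committed_points: int, new_points: int, velocity: int) -> bool:
    """Return True only if the extra scope fits the team's fixed capacity.

    `velocity` stands in for fixed time and resources: the story points the
    team has historically completed per iteration.
    """
    return committed_points + new_points <= velocity

# Usage: with a velocity of 30 points, a request to add 8 more points to an
# iteration already committed at 28 points should go to the backlog instead.
print(fits_in_iteration(committed_points=28, new_points=8, velocity=30))  # False
```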

Due to the focused pace and continuous nature of agile practices, there is a heightened risk of burnout among members of the delivery team.

The term agile management is applied to an iterative, incremental method of managing the design and build activities of engineering, information technology and other business areas that aims to provide new product or service development in a highly flexible and interactive manner, based on the principles expressed in the Manifesto for Agile Software Development.

Agile X techniques may also be called extreme project management. It is a variant of iterative life cycle where deliverables are submitted in stages. The main difference between agile and iterative development is that agile methods complete small portions of the deliverables in each delivery cycle (iteration), while iterative methods evolve the entire set of deliverables over time, completing them near the end of the project. Both iterative and agile methods were developed as a reaction to various obstacles that developed in more sequential forms of project organization. For example, as technology projects grow in complexity, end users tend to have difficulty defining the long-term requirements without being able to view progressive prototypes. Projects that develop in iterations can constantly gather feedback to help refine those requirements.

Agile management also offers a simple framework promoting communication and reflection on past work amongst team members. Teams that previously used traditional waterfall planning and have adopted agile development typically go through a transformation phase, often with the help of agile coaches who guide them through the transition. There are typically two styles of agile coaching: push-based and pull-based agile coaching. Agile management approaches have also been employed and adapted to the business and government sectors. For example, within the federal government of the United States, the United States Agency for International Development (USAID) is employing a collaborative project management approach that focuses on incorporating collaborating, learning and adapting (CLA) strategies to iterate and adapt programming.

Agile methods are mentioned in the Guide to the Project Management Body of Knowledge (PMBOK Guide) under the Project Lifecycle definition:

Adaptive project life cycle, a project life cycle, also known as change-driven or agile methods, that is intended to facilitate change and require a high degree of ongoing stakeholder involvement. Adaptive life cycles are also iterative and incremental, but differ in that iterations are very rapid (usually 2-4 weeks in length) and are fixed in time and resources.

According to Jean-Loup Richet (Research Fellow at ESSEC Institute for Strategic Innovation & Services) “this approach can be leveraged effectively for non-software products and for project management in general, especially in areas of innovation and uncertainty.” The end result is a product or project that best meets current customer needs and is delivered with minimal costs, waste, and time, enabling companies to achieve bottom line gains earlier than via traditional approaches.

Agile software development methods have been extensively used for development of software products and some of them use certain characteristics of software, such as object technologies. However, these techniques can be applied to the development of non-software products, such as computers, motor vehicles, medical devices, food, clothing, and music. Agile software development methods have been used in non-development IT infrastructure deployments and migrations. Some of the wider principles of agile software development have also found application in general management (e.g., strategy, governance, risk, finance) under the terms business agility or agile business management.

Under an agile business management model, agile software development techniques, practices, principles and values are expressed across five domains.

Agile software development paradigms can be used in other areas of life such as raising children. Its success in child development might be founded on some basic management principles; communication, adaptation, and awareness. In a TED Talk, Bruce Feiler shared how he applied basic agile paradigms to household management and raising children.

Agile practices can be inefficient in large organizations and certain types of development. Many organizations believe that agile software development methodologies are too extreme and adopt a hybrid approach that mixes elements of agile software development and plan-driven approaches. Some methods, such as dynamic systems development method (DSDM), attempt this in a disciplined way, without sacrificing fundamental principles.

The increasing adoption of agile practices has also been criticized as a management fad that simply describes existing good practices under new jargon, promotes a one-size-fits-all mindset towards development strategies, and wrongly emphasizes method over results.

Alistair Cockburn organized a celebration of the 10th anniversary of the Manifesto for Agile Software Development in Snowbird, Utah on 12 February 2011, gathering some 30+ people who had been involved at the original meeting and since. A list of about 20 elephants in the room ('undiscussable' agile topics/issues) was collected, including aspects such as the alliances, failures and limitations of agile software development practices and context (possible causes: commercial interests, decontextualization, no obvious way to make progress based on failure, limited objective evidence, cognitive biases and reasoning fallacies), politics and culture. As Philippe Kruchten wrote:

The agile movement is in some ways a bit like a teenager: very self-conscious, checking constantly its appearance in a mirror, accepting few criticisms, only interested in being with its peers, rejecting en bloc all wisdom from the past, just because it is from the past, adopting fads and new jargon, at times cocky and arrogant. But I have no doubts that it will mature further, become more open to the outside world, more reflective, and therefore, more effective.

Technology Management

Technology management is a set of management disciplines that allows organizations to manage their technological fundamentals to create competitive advantage. Typical concepts used in technology management include technology strategy, technology forecasting, technology road-mapping, and technology portfolio management.

The role of the technology management function in an organization is to understand the value of certain technology for the organization. Continuous development of technology is valuable as long as there is value for the customer, and therefore the technology management function in an organization should be able to argue when to invest in technology development and when to withdraw.

Technology management can also be defined as the integrated planning, design, optimization, operation and control of technological products, processes and services; a broader definition is the management of the use of technology for human advantage.

The Association of Technology, Management, and Applied Engineering defines technology management as the field concerned with the supervision of personnel across the technical spectrum and a wide variety of complex technological systems. Technology management programs typically include instruction in production and operations management, project management, computer applications, quality control, safety and health issues, statistics, and general management principles.

Perhaps the most authoritative input to our understanding of technology is the diffusion of innovations theory developed in the first half of the twentieth century. It suggests that all innovations follow a similar diffusion pattern, best known today in the form of an "s" curve, though originally based upon the concept of a standard distribution of adopters. In broad terms, the "s" curve suggests four phases of a technology life cycle: emerging, growth, mature and aging.

These four phases are coupled to increasing levels of acceptance of an innovation or, in our case, a new technology. In recent times, an inverse curve, corresponding to a declining cost per unit, has been postulated for many technologies. This may not prove to be universally true, though for information technology, where much of the cost is incurred in the initial phase, it has been a reasonable expectation.
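As an illustrative sketch (the logistic function and the phase thresholds below are assumptions chosen for clarity, not part of the theory's formal statement), the cumulative-adoption "s" curve can be generated from a logistic function and loosely divided into the four life-cycle phases:

```python
import math

def adoption(t: float, k: float = 1.0, midpoint: float = 0.0) -> float:
    """Cumulative adoption share at time t, modelled as a logistic 's' curve."""
    return 1.0 / (1.0 + math.exp(-k * (t - midpoint)))

def phase(share: float) -> str:
    """Map cumulative adoption to a rough life-cycle phase (thresholds are illustrative)."""
    if share < 0.10:
        return "emerging"
    if share < 0.50:
        return "growth"
    if share < 0.90:
        return "mature"
    return "aging"

# Usage: sample the curve at a few points in time.
for t in (-6, -2, 0, 2, 6):
    share = adoption(t, k=1.0, midpoint=0.0)
    print(f"t={t:+d}  adoption={share:.2f}  phase={phase(share)}")
```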

The second major contribution to this area is the Carnegie Mellon Capability Maturity Model. This model proposes that a series of progressive capabilities can be quantified through a set of threshold tests. These tests determine repeatability, definition, management and optimization. The model suggests that any organization has to master one level before being able to proceed to the next.
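A minimal sketch of that ordering constraint (the level names follow the commonly cited CMM staging; the can_advance check is an illustrative assumption, not part of the model's formal definition):

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """Ordered CMM-style maturity levels; an organization holds exactly one."""
    INITIAL = 1
    REPEATABLE = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZING = 5

def can_advance(current: MaturityLevel, target: MaturityLevel) -> bool:
    """An organization must master each level before proceeding to the next."""
    return target == current + 1

# Usage: jumping from REPEATABLE straight to MANAGED is not allowed.
print(can_advance(MaturityLevel.REPEATABLE, MaturityLevel.DEFINED))  # True
print(can_advance(MaturityLevel.REPEATABLE, MaturityLevel.MANAGED))  # False
```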

The third significant contribution comes from Gartner, the research service: the hype cycle, which suggests that our modern approach to marketing technology results in the technology being over-hyped in the early stages of growth. Taken together, these fundamental concepts provide a foundation for formalizing the approach to managing technology.

Mobile device management (MDM) is the administrative area dealing with deploying, securing, monitoring, integrating and managing mobile devices, such as smartphones, tablets and laptops, in the workplace and other areas. The intent of MDM is to optimize the functionality and security of mobile devices within the enterprise, while simultaneously protecting the corporate network. MDM is usually implemented with the use of a third party product that has management features for particular vendors of mobile devices.

Modern mobile device management products support tablets as well as Windows 10 and macOS computers. The practice of using MDM to control PCs is also known as unified endpoint management.
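As an illustrative sketch only (the policy fields and the is_compliant check below are hypothetical and do not correspond to any particular MDM vendor's API or schema), an MDM policy can be thought of as a declarative set of requirements evaluated against each enrolled device:

```python
# Hypothetical device-compliance check; field names are illustrative and do not
# reflect any specific MDM product.
POLICY = {
    "require_passcode": True,
    "require_disk_encryption": True,
    "min_os_version": (14, 0),
    "allow_jailbroken": False,
}

def is_compliant(device: dict) -> bool:
    """Return True if an enrolled device satisfies every policy requirement."""
    if POLICY["require_passcode"] and not device.get("passcode_set", False):
        return False
    if POLICY["require_disk_encryption"] and not device.get("encrypted", False):
        return False
    if tuple(device.get("os_version", (0, 0))) < POLICY["min_os_version"]:
        return False
    if not POLICY["allow_jailbroken"] and device.get("jailbroken", False):
        return False
    return True

# Usage: a non-compliant device might be quarantined or denied access to
# corporate resources such as email.
device = {"passcode_set": True, "encrypted": True,
          "os_version": (15, 2), "jailbroken": False}
print(is_compliant(device))  # True
```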

The Association of Technology, Management, and Applied Engineering (ATMAE), accredits selected collegiate programs in technology management. An instructor or graduate of a technology management program may choose to become a Certified Technology Manager (CTM) by sitting for a rigorous exam administered by ATMAE covering production planning & control, safety, quality, and management/supervision.

ATMAE program accreditation is recognized by the Council for Higher Education Accreditation (CHEA) for accrediting technology management programs. As of 2011, CHEA recognizes ATMAE in the U.S. for accrediting associate, baccalaureate, and master's degree programs in technology, applied technology, engineering technology, and technology-related disciplines delivered by nationally or regionally accredited institutions in the United States.