Lean Thinking

Lean thinking is a business methodology that aims to provide a new way to think about how to organize human activities to deliver more benefits to society and value to individuals while eliminating waste. The term “lean thinking” was coined by James P. Womack and Daniel T. Jones to capture the essence of their in-depth study of Toyota’s fabled Toyota Production System. Lean thinking is a way of thinking about an activity and seeing the waste inadvertently generated by the way the process is organized. It uses the concepts of:

  • Value
  • Value streams
  • Flow

as they apply in the business world. Lean thinking has since evolved in two different directions:

  • Lean thinking converts, who keep seeking dynamic gains rather than static efficiencies. For this group of thinkers, lean thinking continuously evolves as they strive to better understand the possibilities of the way opened up by Toyota and grasp that the aim of continuous improvement is continuous improvement. Lean thinking as such is a movement of practitioners and writers who experiment and learn in different industries and conditions, applying lean thinking to any new activity.
  • Lean manufacturing adepts who have interpreted the term “lean” as a form of operational excellence and have turned to company programs aimed at taking costs out of processes. Lean activities are used to improve processes without ever challenging the underlying thinking, with powerful low-hanging fruit results but little hope of transforming the enterprise as a whole. This “corporate lean” approach is fundamentally opposed to the ideals of lean thinking, but has been taken up by a great number of large businesses seeking to cut their costs without challenging their fundamental management assumptions.

Lean thinking would challenge line managers to look differently at their own jobs by focusing on:

  • The workplace: Going and seeing first hand work conditions in practice, right now, and finding out the facts for oneself rather than relying on reports and boardroom meetings. The workplace is also where real people make real value, and going to see is a mark of respect and an opportunity to support employees in adding value through their ideas and initiative rather than merely making value through prescribed work. The management revolution brought by lean thinking can be summed up by describing jobs as Job = Work + Kaizen.
  • Value through built-in quality: Understanding that customer satisfaction is paramount and is built-in at every step of the enterprise’s process, from building in satisfying features (such as peace of mind) to correctly building in quality at every production step. Built-in quality means to stop at every doubtful part and to train yourself and others not to pass on defective work, not to do defective work and not to accept defective work by stopping the process and reacting immediately whenever things go wrong.
  • Value streams through understanding “takt” time: By calculating the ratio of open production time to averaged customer demand one can have a clear idea of the capacity needed to offer a steady flow of products. This “takt” rhythm, be it a minute for cars, two months for software projects or two years for a new book leads to creating stable value streams where stable teams work on a stable set of products with stable equipment rather than optimize the use of specific machines or processes. Takt time thinking leads to completely different capacity reasoning than traditional costing and is the key to far more frugal processes.
  • Flow through reducing batch sizes: Every traditional business, whether in production or services, is addicted to batch. The idea is that once work is set up one way, we’d better get on and quickly make as many pieces of work as we can to keep the unit cost down. Lean thinking looks at this differently in trying to optimize the flow of work in order to satisfy real demand now, not imaginary demand next month. By working strenuously on reducing change-over time and difficulty, it is possible to approach the lean thinking ideal of single piece flow. In doing so, one reduces dramatically the general cost of the business by eliminating the need for warehouses, transports, systems, subcontractor use and so on.
  • Pull to visualize takt time through the flow: pulling work from upstream at takt time through visual devices such as Kanban cards is the essential piece that enables lean thinkers to visualize the gaps between the ideal and the actual at the workplace at any time. Pull is what creates a creative tension in the workplace by both edging closer to single-piece-work and by highlighting problems one at a time as they occur so complex situations can be resolved piecemeal. Pull is the basic technique to “lean” the company and, by and large, without pull there is no lean thinking.
  • Seeking perfection through kaizen: The old time sensei used to teach that the aim of lean thinking was not to apply lean tools to every process, but to develop the kaizen spirit in every employee. Perfection is not sought through better, more clever systems or go-it-alone heroes but through a commitment to improve things together step-by-small-step. Kaizen literally means change for the better and Kaizen spirit is about seeking a hundred 1% improvements from everyone every day everywhere rather than one 100% leap forward. The practice of kaizen is what anchors deep lean thinking in people’s minds and which, ultimately, leads to complete transformation. Practising kaizen together builds self-confidence and the collective confidence that we can face our larger challenges and solve our problems together.
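The takt-time calculation described above (the ratio of open production time to averaged customer demand) can be sketched in a few lines. This is a generic illustration with hypothetical numbers, not drawn from any particular tool:

```python
def takt_time(open_production_time_min: float, customer_demand_units: float) -> float:
    """Takt time = available production time / average customer demand."""
    return open_production_time_min / customer_demand_units

# A hypothetical line: one 8-hour shift (480 min) minus 30 min of breaks,
# against an average demand of 225 units per day.
takt = takt_time(480 - 30, 225)
print(takt)  # 2.0 minutes per unit
```

A takt of 2.0 minutes means the value stream should complete one unit every two minutes to match demand, no faster and no slower.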

Advantages

Worker Satisfaction

Implementing lean principles in your company requires input and participation from your production staff. They are often best placed to see where waste and inefficiency occur. Not only do they serve as a resource for you, but employees also usually respond positively to sincere efforts to involve them in improvement processes. When they see their suggestions and ideas incorporated, a sense of ownership and satisfaction about their contribution is more likely to follow.

Eliminates Waste

Lean principles aim to minimize all forms of waste, from sources as varied as material defects to worker ergonomics. Many sources of waste are easy to identify and correct, such as a machine that is out of adjustment, producing a high volume of defects. Other forms of waste include environmental conditions that impede worker efficiency. Better lighting may help a worker read production instructions; moving a file cabinet might eliminate wasted time for a clerk.

Just in Time

JIT is a strategy that suggests large inventories are wasteful of company resources. Business equity tied up in inventories of raw and finished goods interferes with cash flow. Money is also saved through reduced warehousing needs. The perfect JIT scenario would have the raw materials purchased and delivered at the moment production needs them, and the finished product is sold and delivered the moment it comes off the line. While this scenario may be impossible, lean philosophy suggests making improvements toward the ideal.

Competitive Advantage

Beyond simply reducing costs and improving efficiency, lean production techniques introduce systems and develop skills in your staff that support the workplace changes new sales create. Space saved on warehousing may be used to add new product lines. The same is true of time savings: your staff can absorb new work and react quickly to changes in client demand. Producing work quickly, in short iterations, without waste, and delivered on time enhances your advantage over your competition.

Disadvantages

Low Margin for Error

JIT principles work best with stable system components. Delivery times for raw and finished goods are known, and the elements of production can be scheduled accordingly. Being overly aggressive with JIT scheduling leaves you vulnerable to systemic bottlenecks. Supplier delivery issues may cut off your raw materials, interrupting your production flow. Maintenance emergencies can reduce your production throughput. Any constraint not accounted for in your JIT planning potentially jeopardizes the entire system. Margin for error and system waste may be difficult to balance.

New Inefficiencies

Lean techniques can be overused. When tracking of productivity and waste starts to cut into the time available for production, the solution becomes the problem. When lean principles are first applied, you can expect larger returns than you will later down the road. It is tempting to push those early expectations, but you must examine the value of each improvement. If you refine throughput to 1,000 parts an hour in one section that you can supply with only 500 parts from a previous stage, you haven't improved your result.
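The 1,000-versus-500 parts example above is the classic bottleneck effect: a serial line's throughput is capped by its slowest stage, so a local improvement away from the constraint changes nothing. A minimal sketch (stage rates are hypothetical):

```python
def line_throughput(stage_rates_per_hour):
    """A serial production line can produce no faster than its slowest stage."""
    return min(stage_rates_per_hour)

before = line_throughput([500, 500, 700])    # parts/hour at each stage
after = line_throughput([1000, 500, 700])    # first stage "improved" to 1,000
print(before, after)  # 500 500 -- the local improvement changed nothing
```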

Worker Frustration

When a certain level of refinement is met, using lean methods to squeeze more economy from production can discourage workers, reversing positive motivation and undermining your leadership. Trends of backsliding in previous improvements may indicate worker resentment. Striking a balance between stasis and continuous improvement is a challenge in any lean environment. A small business may be more prone to reaching such a refinement because of its less complex nature. Be aware of how incorporated changes affect your staff to gauge how effective further pushes will be.

Malcolm Baldrige National Quality Award (MBNQA)

The Baldrige Excellence Framework has three parts:

(1) The criteria for performance excellence

(2) Core values and concepts

(3) Scoring guidelines.

The framework serves two main purposes:

(1) To help organizations assess their improvement efforts, diagnose their overall performance management system, and identify their strengths and opportunities for improvement

(2) To identify Baldrige Award recipients that will serve as role models for other organizations.

The Malcolm Baldrige National Quality Award (MBNQA) is an award established by the U.S. Congress in 1987 to raise awareness of quality management and recognize U.S. companies that have implemented successful quality management systems. The award is the nation’s highest presidential honor for performance excellence.

Purpose of the Malcolm Baldrige Award

  • Raise awareness about the importance of performance excellence.
  • Recognize companies that show performance excellence and pass on this information to other organizations to tailor it for their own needs.
  • Motivate U.S. companies and organizations to improve their quality standards and strive for excellence.
  • Help companies and organizations embody the competitive spirit and drive the U.S. economy forward.

Three MBNQA awards can be given annually in each of six categories:

  • Manufacturing
  • Service Company
  • Small Business
  • Education
  • Healthcare
  • Non-profit

The criteria for performance excellence are based on a set of core values:

  • Systems perspective
  • Visionary leadership
  • Customer-focused excellence
  • Valuing people
  • Organizational learning and agility
  • Focus on success
  • Managing for innovation
  • Management by fact
  • Societal responsibility
  • Ethics and transparency
  • Delivering value and results

The questions that make up the criteria represent seven aspects of organizational management and performance. Organizations that apply for the MBNQA are judged by an independent board of examiners, and recipients are selected based on achievement and improvement in these seven areas, known as the Baldrige Criteria for Performance Excellence:

  • Leadership: How upper management leads the organization, and how the organization leads within the community.
  • Strategy: How the organization establishes and plans to implement strategic directions.
  • Customers: How the organization builds and maintains strong, lasting relationships with customers.
  • Measurement, analysis, and knowledge management: How the organization uses data to support key processes and manage performance.
  • Workforce: How the organization empowers and involves its workforce.
  • Operations: How the organization designs, manages, and improves key processes.
  • Results: How the organization performs in terms of customer satisfaction, finances, human resources, supplier and partner performance, operations, governance and social responsibility, and how the organization compares to its competitors.

The three sector-specific versions of the Baldrige framework are revised every two years:

  • Baldrige Excellence Framework (Business/Nonprofit)
  • Baldrige Excellence Framework (Education)
  • Baldrige Excellence Framework (Health Care)

Sigma features, Enablers, Goals, DMAIC/DMADV

Sigma features

Inventory management plays two critical roles in Lean Six Sigma: first, managing raw materials and semi-finished goods within the lean manufacturing process; second, controlling finished-goods inventory held in a warehouse by manufacturers. In both cases, you should keep just enough stock to meet demand while minimizing inventory carrying costs.

As a component of lean manufacturing, stock levels of raw materials and semi-finished products need to be controlled and managed. Similar to Japanese kaizen and Just-in-Time principles, just the right amount of stock should be held to keep production schedules going. This will reduce waste and optimize the production workflow.

For finished goods, excess and obsolete inventory are major business issues. These often involve deadstock and seasonality. A typical solution addresses the excess inventory itself: you might sell it below cost or donate it, for example. But this does not address the root cause. Lean inventory management instead aims to:

  • Identify the value that a company will get from Lean Inventory Management.
  • Optimize the flow of inventory through the business by removing obstacles in the way. This comes from the Japanese 5S Lean principle (Sort, Straighten, Sweep, Standardize, Sustain).
  • Move inventory only when requested by the customer. This is adapted from the Kanban Lean principle.
  • Be flexible and adapt to change. This is influenced by the Kaizen Lean principle.
  • Continuously refine your inventory management processes to improve quality, cycle time, efficiency and cost. This is derived from the DMAIC Six Sigma methodology.
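As a rough illustration of the Kanban-style pull principle in the list above, a common textbook heuristic sizes the number of kanban cards from demand during replenishment lead time plus a safety factor. The formula and all numbers below are generic illustrations, not prescriptions from this text:

```python
import math

def kanban_count(daily_demand, lead_time_days, container_size, safety_factor=0.1):
    """Kanban cards needed ~= demand during lead time (plus safety) / container size."""
    return math.ceil(daily_demand * lead_time_days * (1 + safety_factor) / container_size)

# Hypothetical part: 200 units/day demand, 2-day replenishment, 50 units per bin.
print(kanban_count(200, 2, 50))  # 9 cards
```

Capping the number of cards caps work-in-process: no container moves until a downstream card requests it, which is the pull behavior described above.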

Principles

  • Demand management. You should only move inventory upon an order by a customer.
  • Cost and waste reduction. But not to the extent of negatively affecting the customer.
  • Process standardization. Standardizing, for example, on transportation and business processes.
  • Industry standardization. Standardizing on product parts and components.
  • Cultural change. Everyone along the supply chain must work as a team. Similarly, this echoes principles from Just-in-Time manufacturing.
  • Cross-organization collaboration. Teams that cut across the organization can help to understand value better, providing a holistic view similar to that of the customer.

Enablers and Goals

DMAIC/DMADV

DMAIC

Define: the problem, the goal, and the reason the issue needs to be resolved.

Measure: the current state as a baseline and use it as the starting point for improvement.

Analyze: the root cause; identify it with data-driven tools and validate why the issue is happening.

Improve: identify creative solutions that eliminate the major root causes, fixing the problem and preventing similar issues in the future.

Control: maintain the improvements and sustain the success of the new process.

DMADV

DMADV is the acronym for the framework of Design for Six Sigma (DFSS), which is used when developing a brand-new service or product within a business. The acronym stands for Define, Measure, Analyze, Design, and Verify.

Define: the goal of the new product or service, with realistic and measurable targets, and why it is needed.

Measure: identify the factors that are critically important to quality; this should include any parameters and risks, as well as the production process and product capability.

Analyze: develop design alternatives, work through different combinations and outcomes, and select the best components.

Design: develop a detailed prototype. From this, a more refined version is developed, and errors found may make it necessary to modify the current version.

Verify: the final step, where the newly designed product is tested in the real world. Many production runs might be necessary to confirm that the quality is the highest it can be.

DMAIC and DMADV do have a number of similarities that are worth noting. They both use statistical tools and facts in order to find solutions to common quality-related problems and focus on reaching the business and financial goals of an organization. DMAIC and DMADV are implemented by Green Belts, Black Belts, and Master Black Belts and are used to reduce defects to fewer than 3.4 per million opportunities, the Six Sigma level. Their solutions are data intensive and based only on hard facts.
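The "3.4 per million" figure comes from the DPMO (defects per million opportunities) metric. A minimal sketch of the calculation, with made-up counts:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical inspection sample: 25 defects found across 1,000 units,
# each unit offering 10 opportunities for a defect.
print(dpmo(25, 1000, 10))  # 2500.0 -- far above the Six Sigma target of 3.4
```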

The two most widely used Six Sigma methodologies are DMAIC and DMADV.  Both methods are designed so a business process will be more efficient and effective. While both of these methodologies share some important characteristics, they are not interchangeable and were developed for use in differing business processes.  Before comparing these two approaches in more detail, let’s review what the acronyms stand for.

  • DMAIC: Define, Measure, Analyze, Improve, Control
  • DMADV: Define, Measure, Analyze, Design, Verify

Despite the shared first three letters of their names, there are some notable differences between them. The main difference exists in the way the final two steps of the process are handled. With DMADV, the Design and Verify steps deal with redesigning a process to match customer needs, as opposed to the Improve and Control steps that focus on determining ways to readjust and control the process. DMAIC typically defines a business process and how applicable it is; DMADV defines the needs of the customer as they relate to a service or product.

With regards to measurement, DMAIC measures current performance of a process while DMADV measures customer specifications and needs. Control systems are established with DMAIC in order to keep check on the business’ future performance, while with DMADV, a suggested business model must undergo simulation tests to verify efficacy.

DMAIC concentrates on making improvements to a business process in order to reduce or eliminate defects; DMADV develops an appropriate business model destined to meet the customers’ requirements.

TAGUCHI’s Quality Engineering

The Taguchi method of quality control is an approach to engineering that emphasizes the roles of research and development (R&D), product design and development in reducing the occurrence of defects and failures in manufactured goods.

This method, developed by Japanese engineer and statistician Genichi Taguchi, considers design to be more important than the manufacturing process in quality control, aiming to eliminate variances in production before they can occur.

Taguchi methods are statistical methods, sometimes called robust design methods, developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering, biotechnology, marketing and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi’s development of designs for studying variation, but have criticized the inefficiency of some of Taguchi’s proposals.

Taguchi’s work includes three principal contributions to statistics:

  • A specific loss function
  • The philosophy of off-line quality control
  • Innovations in the design of experiments

The Taguchi method gauges quality as a calculation of loss to society associated with a product. In particular, loss in a product is defined by variations and deviations in its function as well as detrimental side effects that result from the product.

Loss from variation in function is a comparison of how much each unit of the product differs in the way it operates. The greater that variance, the more significant the loss in function and quality. This could be represented as a monetary figure denoting how usage has been impacted by defects in the product.

Taguchi’s use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher’s methods in the design of experiments, Taguchi interpreted Fisher’s methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher’s work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as the region where we deny that losses exist. As we diverge from nominal, losses grow until the point where losses are too great to deny and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

  • Larger the better (for example, agricultural yield);
  • Smaller the better (for example, carbon dioxide emissions); and
  • On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons:

  • It is the first “symmetric” term in the Taylor series expansion of real analytic loss-functions.
  • Total loss is measured by the variance. For uncorrelated random variables, variance is additive, so the total loss is an additive measurement of cost.
  • The squared-error loss function is widely used in statistics, following Gauss’s use of the squared-error loss function in justifying the method of least squares.
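The squared-error loss above is usually written L(y) = k(y - m)^2, where m is the nominal target and k scales deviation into cost; averaged over production it becomes k(sigma^2 + (mu - m)^2), which is why both an off-target mean and variation around it cost money. A small sketch with made-up numbers:

```python
def taguchi_loss(y, target, k):
    """Quadratic (Taguchi) loss for a single unit: k * (y - target)^2."""
    return k * (y - target) ** 2

def expected_loss(mean, std_dev, target, k):
    """Average loss per unit: k * (variance + squared distance from target)."""
    return k * (std_dev ** 2 + (mean - target) ** 2)

# A hypothetical shaft: nominal diameter 10.0 mm, k = $4 per mm^2 of deviation.
print(taguchi_loss(10.5, 10.0, 4.0))        # 1.0 dollar for this unit
print(expected_loss(10.1, 0.2, 10.0, 4.0))  # about 0.2 dollars per unit on average
```

Note that, unlike the conventional "pass/fail at the specification limit" view, this loss is nonzero for any deviation from nominal, however small.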

Taguchi’s rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation of the final product quality is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

  • System design
  • Parameter (measure) design
  • Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi’s radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Robust parameter designs consider controllable and uncontrollable noise variables; they seek to exploit relationships and optimize settings that minimize the effects of the noise variables.

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.

Management of interactions

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for “control factors” or factors in the “inner array”. By combining an inner array of control factors with an outer array of “noise factors”, Taguchi’s approach provides “full information” on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.
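As a concrete example of the orthogonality being discussed, the smallest two-level Taguchi array, L4(2^3), assigns three factors to four runs so that every pair of columns contains each level combination exactly once. The check below is a generic illustration, not drawn from the text:

```python
from itertools import combinations, product

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 1 and 2).
L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

# Orthogonality: each pair of columns shows every (level, level) combination once.
for c1, c2 in combinations(range(3), 2):
    pairs = [(run[c1], run[c2]) for run in L4]
    assert sorted(pairs) == sorted(product((1, 2), repeat=2))
print("L4 is orthogonal")
```

This array is also saturated: three factors consume all three degrees of freedom of the four runs, leaving none for estimating interactions, which is precisely the criticism described above.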

Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a “confirmation experiment” offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the “likelihood” of control factor-by-control factor interactions is greatly reduced, since “energy” is “additive”.

Inefficiencies of Taguchi’s designs

Interactions are part of the real world. In Taguchi’s arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the “sequential assembly” of designs: In the RSM approach, a screening design is followed by a “follow-up design” that resolves only the confounded interactions judged worth resolution. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi’s designs.

Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss to society, techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.

Ishikawa Fish Bone, Applications in Organizations

Ishikawa diagrams (also called fishbone diagrams, herringbone diagrams, cause-and-effect diagrams, or Fishikawa) are causal diagrams created by Kaoru Ishikawa that show the potential causes of a specific event.

The fishbone diagram or Ishikawa diagram is a cause-and-effect diagram that helps managers to track down the reasons for imperfections, variations, defects, or failures.

The diagram looks just like a fish’s skeleton with the problem at its head and the causes for the problem feeding into the spine. Once all the causes that underlie the problem have been identified, managers can start looking for solutions to ensure that the problem doesn’t become a recurring one.

The fishbone diagram can also be used in product development. A product that solves a real problem is likely to be popular, provided people care about the problem you're trying to solve. The fishbone diagram strives to pinpoint everything that's wrong with current market offerings so that you can develop an innovation that doesn't have these problems.

Common uses of the Ishikawa diagram are product design and quality defect prevention to identify potential factors causing an overall effect. Each cause or reason for imperfection is a source of variation. Causes are usually grouped into major categories to identify and classify these sources of variation.

Advantages

  • Highly visual brainstorming tool which can spark further examples of root causes
  • Quickly identify if the root cause is found multiple times in the same or different causal tree
  • Allows one to see all causes simultaneously
  • Good visualization for presenting issues to stakeholders

Disadvantages

  • Complex defects might yield a lot of causes which might become visually cluttering
  • Interrelationships between causes are not easily identifiable

Variation = Imperfection

When it comes to quality and efficiency, variation is your enemy. Whatever your business is, you don’t want to leave anything up to chance. From the moment your client contacts you, a predictable process should be followed with its aim being complete customer satisfaction. Variation in the process will mean variation in the product.

Fishbone diagrams help you to determine the variables that may enter the equation. They allow you to make your plans so that you know how to deal with them in such a way that the quality of your final product is still up to standard and without significant variation.

Applications

  • To analyze and find the root cause of a complicated problem
  • When there are many possible causes for a problem
  • If the traditional way of approaching the problem (trial and error, trying all possible causes, and so on) is very time consuming
  • The problem is very complicated and the project team cannot identify the root cause

Construct a Fishbone diagram

Here are the various tasks involved in constructing a Fishbone diagram:

  1. Define the problem
  2. Brainstorm
  3. Identify causes

Define the problem

The first step is fairly simple and straightforward. You have to define the problem for which the root cause has to be identified. Usually the project manager or technical architect (we will refer to this role as the leader throughout the rest of the article) decides which problem to brainstorm. The leader has to choose problems that are critical, that need a permanent fix, and that are worth brainstorming with the team. The leader can moderate the whole process.

After the problem is identified, the leader can start constructing the Fishbone diagram. Using a sheet of paper, the leader writes the problem in a square box on the right side of the page and draws a straight line from the left to the problem box, with an arrow pointing towards the box. The problem box now becomes the fish head, and its bones are laid out in further steps. At the end of the first step, the diagram is simply a fish head with a horizontal spine.

Brainstorm

People have difficulty understanding how to structure the thought process around a large problem domain. Sometimes it is useful to focus on logically related items of the problem domain and to represent them in the Fishbone diagram, which will convey the problem solving methodology. There are quite a few tools available that can help us in this regard, including:

  • Affinity Chart
    Organizes facts, opinions, ideas, and issues into a natural grouping. This grouping is in turn used as an aid in diagnosing complex problems.
  • Brainstorming
    Gathers ideas from people who are potential contributors. This process is discussed further in the following sections.
  • Check sheet
    Acts as a simple data recording device that helps to delineate important items and characteristics to direct attention to them and verify that they are evaluated.
  • Flow charts
    Organizes information about a process in a graphical manner and makes it clear who is impacted at every stage.

No single methodology is applicable to all problem domains. Based on experience and study, you can identify and maintain the methodology best suited to each problem domain. In the example given later in this article, we use brainstorming as the problem solving methodology.

Categorize

When you apply the Fishbone technique to business problems, the possible causes are usually classified into six categories:

  • Method
  • Man
  • Management
  • Measurement
  • Material
  • Machine

Though the above are a few important problem categories, during the brainstorming session the team is encouraged to come up with all possible categories. The categories listed give the team direction in finding possible causes. Some of them may or may not be applicable to software, or to Domino in particular. Let’s look briefly at each category.

  • Method
    Methods are ways of doing things, or the procedures followed to accomplish a task. A typical cause under the Method category is not following instructions, or the instructions themselves being wrong.
  • Man
    People can be responsible for the problem. The problem may have been caused by people who are inexperienced, who cannot answer prompted questions, and so on.
  • Management
    Management refers to project management; poor management decisions, such as upgrading two components simultaneously rather than deploying changes serially, may cause technical problems.
  • Measurement
    Measurement refers to metrics that are derived from a project. Problems may occur if measurements are wrong or the measurement technique used is not relevant.
  • Material
    Material basically refers to a physical thing. A bad diskette is one typical example. Software can’t always handle errors caused by bad material (for instance, a bad backup tape), so while material may be the least likely cause, it is a possible cause.
  • Machine
    A machine in software usually refers to the hardware, and there are many possibilities that a problem is due to the machine, such as performance issues.

Other possible categories include policies, procedure, technology, and so on.

After identifying a problem, the leader initiates a discussion with the project team to gather information about the possible causes, finally arriving at the root cause. The team can either analyze each of the above categories for possible causes or look into other categories (not listed above).
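The relationship between the problem, its categories, and the causes gathered in discussion can be sketched as a simple data structure. This is a minimal illustration, not from the article; the problem statement, categories, and causes are invented examples:

```python
# Represent a Fishbone diagram as a nested dictionary: the problem is
# the fish head, each category is a major bone, and each cause is a
# smaller bone attached to its category. All names are illustrative.

fishbone = {
    "problem": "Server response time degraded",
    "categories": {
        "Method": ["Instructions not followed", "Wrong procedure documented"],
        "Man": ["Inexperienced operator"],
        "Machine": ["Undersized hardware", "Disk nearing capacity"],
        "Material": ["Corrupt backup tape"],
    },
}

def print_fishbone(diagram):
    """Print the problem (fish head) and each category with its causes (bones)."""
    print(f"Problem: {diagram['problem']}")
    for category, causes in diagram["categories"].items():
        print(f"  {category}")
        for cause in causes:
            print(f"    - {cause}")

print_fishbone(fishbone)
```

During a brainstorming session, the team would append new causes to the appropriate category list as they are raised.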

Identify causes

While brainstorming, the team should strive toward identifying major causes (categories) first, which can be further discussed, and then secondary causes for each major cause can be identified and discussed. This helps the team to concentrate on one major cause at a time and to refine further for possible secondary causes.

After the major causes (usually four to six) are identified, you can connect them as fishbones in the Fishbone diagram. They are represented as slanted lines with the arrow pointing towards the backbone of the fish. See Figure 2 later in this article.

Sometimes it is difficult to arrive at a few major causes. The team may come up with a lot of causes, which makes brainstorming more difficult. In this case, the leader can ask each team member to rate every possible cause from 1 to 10 (10 being the most likely cause). After everyone on the team has rated the causes, the leader totals the ratings for each cause and ranks the causes accordingly. From the list, the top four to six causes are identified as major causes and connected as bones in the diagram.
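The rating-and-ranking step described above can be sketched in a few lines. The causes and individual scores below are invented purely for illustration:

```python
# Each key is a candidate cause; each list holds one rating (1-10,
# 10 = most likely) per team member. Totals are ranked and the top
# four causes become the major bones of the Fishbone diagram.

ratings = {
    "Wrong procedure":     [9, 8, 10, 7],
    "Inexperienced staff": [6, 7, 5, 6],
    "Faulty hardware":     [3, 2, 4, 3],
    "Bad metrics":         [5, 4, 6, 5],
    "Poor scheduling":     [8, 9, 7, 8],
}

# Total each cause's ratings, then sort from highest to lowest.
totals = {cause: sum(scores) for cause, scores in ratings.items()}
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

major_causes = [cause for cause, _ in ranked[:4]]
print(major_causes)
```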

The diagram looks a little like the skeleton of a fish, hence the name Fishbone. After the major causes of the problem are identified, each one of them is discussed in further detail with the team to find out the secondary causes. If needed, the secondary causes are further discussed to obtain the next level of possible causes. Each of the major causes is laid as a fishbone in the diagram and the secondary causes as “bonelets.”

The diagram now has a comprehensive list of possible causes for the problem, though the list may not be exhaustive or complete. However, the team has enough information to begin discussing the individual causes and to analyze their relevance to the problem. The team can use analytical, statistical, and graphical tools to assist in evaluating each of the causes. The Pareto principle (explained in part two of this article series) is also used to find the elements that cause major problems and to list them as major causes in the Fishbone diagram. Software metrics that are obtained during application support can also be used here for further assistance.

Evaluate, decide, and take action

It may be very difficult to reach consensus on one possible root cause in a large team, in which case the majority view is taken into consideration. The major causes can also be ranked in order, with the most likely cause at the top of the list.

After the evaluation process is complete, the action plan has to be decided. If one root cause is identified, an action plan is derived to rectify it. Sometimes it may be difficult to arrive at a single root cause; there may be a few possible root causes. In that case, an action plan has to be drawn up for each of the possible root causes.

After the action plan is ready, the leader can designate an individual or team to work on the plan and rectify the problem permanently. If there are a few possible root causes, all the action plans are executed, and the most likely root cause is identified and fixed.

Characteristics of Quality, Quality Assurance

Generally, it can be said that a product is of satisfactory quality if it satisfies the consumer or user. The consumer will buy a product or service only if it suits his or her requirements.

Therefore, consumers’ requirements are first assessed by the marketing department, and quality decisions are then taken on the basis of the information collected.

Eight dimensions of product quality management can be used at a strategic level to analyze quality characteristics. The concept was defined by David A. Garvin, formerly C. Roland Christensen Professor of Business Administration at Harvard Business School (died 30 April 2017). Garvin was posthumously honored with the prestigious award for ‘Outstanding Contribution to the Case Method’ on 4 March 2018.

Some of the dimensions are mutually reinforcing, whereas others are not; improvement in one may come at the expense of others. Understanding the trade-offs customers desire among these dimensions can help build a competitive advantage.

  • Performance: Performance refers to a product’s primary operating characteristics. This dimension of quality involves measurable attributes; brands can usually be ranked objectively on individual aspects of performance.
  • Features: Features are additional characteristics that enhance the appeal of the product or service to the user.
  • Reliability: Reliability is the likelihood that a product will not fail within a specific time period. This is a key element for users who need the product to work without fail.
  • Conformance: Conformance is the precision with which the product or service meets the specified standards.
  • Durability: Durability measures the length of a product’s life. When the product can be repaired, estimating durability is more complicated. The item will be used until it is no longer economical to operate it. This happens when the repair rate and the associated costs increase significantly.
  • Serviceability: Serviceability is the speed with which the product can be put into service when it breaks down, as well as the competence and the behavior of the service person.
  • Aesthetics: Aesthetics is the subjective dimension indicating the kind of response a user has to a product. It represents the individual’s personal preference.
  • Perceived Quality: Perceived Quality is the quality attributed to a good or service based on indirect measures.

Quality Characteristics:

A quality characteristic is any element that makes a product or item fit for use. Quality characteristics also provide a means by which fitness for use can be translated into the technologists’ language for managing quality. Quality characteristics are classified into categories called ‘parameters’ of fitness for use.

Two such major parameters are known as:

(i) Quality of design

(ii) Quality of conformance

The quality of design is concerned with satisfying consumers through deliberate variation in the quality of products, popularly called “grades”. In contrast, the quality of conformance is the extent to which products, items, and services conform to the intent of the design.

Process capability, inspection, and process control are involved in achieving this conformance, so that the goods produced meet the pre-decided specifications.

Quality Assurance

Quality assurance (QA) is a way of preventing mistakes and defects in manufactured products and avoiding problems when delivering products or services to customers. ISO 9000 defines it as “part of quality management focused on providing confidence that quality requirements will be fulfilled”. This defect prevention in quality assurance differs subtly from defect detection and rejection in quality control, and has been referred to as a shift left since it focuses on quality earlier in the process (i.e., to the left of a linear process diagram read from left to right).

The terms “quality assurance” and “quality control” are often used interchangeably to refer to ways of ensuring the quality of a service or product. For instance, the term “assurance” is often used as follows: Implementation of inspection and structured testing as a measure of quality assurance in a television set software project at Philips Semiconductors is described. The term “control”, however, is used to describe the fifth phase of the Define, Measure, Analyze, Improve, Control (DMAIC) model. DMAIC is a data-driven quality strategy used to improve processes.

Quality assurance comprises administrative and procedural activities implemented in a quality system so that requirements and goals for a product, service or activity will be fulfilled. It is the systematic measurement, comparison with a standard, monitoring of processes and an associated feedback loop that confers error prevention. This can be contrasted with quality control, which is focused on process output.

Quality assurance includes two principles: “fit for purpose” (the product should be suitable for the intended purpose) and “right first time” (mistakes should be eliminated). QA includes management of the quality of raw materials, assemblies, products and components, services related to production, and management, production and inspection processes. The two principles also manifest against the background of developing (engineering) a novel technical product: the task of engineering is to make it work once, while the task of quality assurance is to make it work all the time.

Quality Assurance Process steps:

  • Plan: The organization should plan and establish process-related objectives and determine the processes required to deliver a high-quality end product.
  • Do: Develop and test the processes, and also “do” changes in the processes.
  • Check: Monitor the processes, modify them, and check whether they meet the predetermined objectives.
  • Act: Implement the actions necessary to achieve improvements in the processes.
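The four steps above form a repeating cycle rather than a one-off sequence. The sketch below, which is not from the article, models that loop with an invented metric: each pass through the cycle is assumed to halve the defect rate until the planned objective is met:

```python
# A minimal illustration of the Plan-Do-Check-Act cycle as a loop.
# The starting defect rate, the objective, and the improvement per
# cycle are all hypothetical values chosen for demonstration.

defect_rate = 8.0   # percent; hypothetical starting point
objective = 1.0     # Plan: the predetermined quality objective

cycle = 0
while defect_rate >= objective:   # Check: compare against the objective
    cycle += 1
    defect_rate /= 2              # Do + Act: change and improve the process
    print(f"Cycle {cycle}: defect rate now {defect_rate:.2f}%")
```

The point of the sketch is only that the cycle repeats until the Check step shows the objective has been reached.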

Concepts of Productivity, Modes of calculating productivity

Productivity is an overall measure of the ability to produce a good or service. More specifically, productivity is the measure of how specified resources are managed to accomplish timely objectives as stated in terms of quantity and quality. Productivity may also be defined as an index that measures output (goods and services) relative to the input (labor, materials, energy, etc., used to produce the output). As such, it can be expressed as:

Productivity = Output / Input

Hence, there are two major ways to increase productivity: increase the numerator (output) or decrease the denominator (input). Of course, a similar effect would be seen if both input and output increased, but output increased faster than input; or if input and output decreased, but input decreased faster than output.

Organizations have many options for use of this formula: labor productivity, machine productivity, capital productivity, energy productivity, and so on. A productivity ratio may be computed for a single operation, a department, a facility, an organization, or even an entire country.

Productivity is an objective concept. As an objective concept it can be measured, ideally against a universal standard. As such, organizations can monitor productivity for strategic reasons such as corporate planning, organization improvement, or comparison to competitors. It can also be used for tactical reasons such as project control or controlling performance to budget.

Productivity is also a scientific concept, and hence can be logically defined and empirically observed. It can also be measured in quantitative terms, which qualifies it as a variable. Therefore, it can be defined and measured in absolute or relative terms. However, an absolute definition of productivity is not very useful; it is much more useful as a concept dealing with relative productivity or as a productivity factor.

Productivity is useful as a relative measure of actual output of production compared to the actual input of resources, measured across time or against common entities. As output increases for a level of input, or as the amount of input decreases for a constant level of output, an increase in productivity occurs. Therefore, a “productivity measure” describes how well the resources of an organization are being used to produce output.

Productivity is often confused with efficiency. Efficiency is generally seen as the ratio of the time needed to perform a task to some predetermined standard time. However, doing unnecessary work efficiently is not exactly being productive. It would be more correct to interpret productivity as a measure of effectiveness (doing the right thing efficiently), which is outcome-oriented rather than output-oriented.

Productivity is usually expressed in one of three forms: partial factor productivity, multifactor productivity, and total productivity.

Productivity refers to the physical relationship between the quantity produced (output) and the quantity of resources used in the course of production (input).

“It is the ratio between the output of goods and services and the input of resources consumed in the process of production.”

Productivity is the ratio between output of wealth and input of resources used in production processes.

Productivity = Measure of output / Measure of Input

Total Productivity:

Pt = Qt / (L+C+R+M)

where Pt: Total productivity

L = Labour input

C = Capital input

R = Raw material and purchased parts input

M = Other miscellaneous goods and services input factors

Qt = Total output
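The total-productivity formula above, Pt = Qt / (L + C + R + M), can be computed directly; a partial-factor measure such as labour productivity uses only one input in the denominator. All figures below are invented for illustration (say, in thousands of dollars):

```python
# Total productivity vs. a partial-factor (labour) productivity ratio,
# using the symbols defined in the text. Values are hypothetical.

Qt = 500.0   # total output
L = 120.0    # labour input
C = 150.0    # capital input
R = 80.0     # raw material and purchased parts input
M = 50.0     # other miscellaneous goods and services input

total_productivity = Qt / (L + C + R + M)   # Pt = Qt / (L + C + R + M)
labor_productivity = Qt / L                 # a partial-factor measure

print(f"Total productivity:  {total_productivity:.2f}")
print(f"Labour productivity: {labor_productivity:.2f}")
```

Raising Qt with the denominator held constant, or lowering any input with Qt held constant, increases both ratios, matching the two ways of increasing productivity listed below.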

Productivity can be increased by:

  • Generating more outputs from same level of inputs.
  • Producing same level of outputs with reduced level of inputs.
  • A combination of both.

Importance of Productivity:

The concept of productivity is of great significance for undeveloped and developing countries. In both cases there are limited resources that should be used to get the maximum output; that is, there should be a tendency to perform a job in cheaper, safer, and quicker ways.

The aim should be optimum use of resources, so as to provide maximum satisfaction with minimum effort and expenditure. Productivity analysis and measures indicate the stages and situations where improvement in the use of inputs is possible to increase the output.

Productivity indicators can be used for different purposes, such as comparing the performance of various organizations, assessing the contribution of different input factors, bargaining with trade unions, and so on.

Factors Affecting Productivity:

Productivity is the outcome of several factors. These factors are so interrelated that it is difficult to identify the effect of any one factor on productivity.

These factors may broadly be divided as follows:

  1. Human:

Human nature and human behaviour are the most significant determinants of productivity.

Human factors may further be classified into two categories as given below:

(a) Ability to work: Productivity of an organization depends upon the competence and calibre of its people both workers and managers. Ability to work is governed by education, training, experience, aptitude, etc. of the employees.

(b) Willingness to work: Motivation and morale of people is the second important group of human factors that determine productivity. Wage incentive schemes, labour participation in management, communication system, informal group relations, promotion policy, union management relations, quality of leadership, etc., are the main factors governing employees’ willingness to work. Working conditions like working hours, sanitation, ventilation, schools, clubs, libraries, subsidized canteen, company transport, etc., also influence the motivation and morale of employees.

  2. Technological:

Technological factors exercise significant influence on the level of productivity.

(a) Size and capacity of plant

(b) Product design and standardization

(c) Timely supply of materials and fuel

(d) Rationalization and automation measures

(e) Repairs and maintenance

(f) Production planning and control

(g) Plant layout and location

(h) Materials handling system

(i) Inspection and quality control

(j) Machinery and equipment used

(k) Research and development

(l) Inventory control

(m) Reduction and utilization of waste and scrap, etc.

  3. Managerial:

The competence and attitudes of managers have an important bearing on productivity. In many organizations, productivity is low despite latest technology and trained manpower. This is due to inefficient and indifferent management. Competent and dedicated managers can obtain extraordinary results from ordinary people.

Job performance of employees depends on their ability and willingness to work. Management is the catalyst to create both. Advanced technology requires knowledge workers who in turn work productively under professionally qualified managers. No ideology can win a greater output with less effort. It is only through sound management that optimum utilization of human and technical resources can be secured.

  4. Natural:

Natural factors such as physical, geological, geographical and climatic conditions exert considerable influence on productivity, particularly in extractive industries. For example, productivity of labour in extreme climates (too cold or too hot) tends to be comparatively low. Natural resources like water, fuel and minerals influence productivity.

  5. Sociological:

Social customs, traditions and institutions influence attitudes towards work and job. For instance, bias on the basis of caste, religion, etc., inhibited the growth of modern industry in some countries. The joint family system affected incentive to work hard in India. Close ties with land and native place hampered stability and discipline among industrial labour.

  6. Political:

Law and order, stability of Government, harmony between States, etc. are essential for high productivity in industries. Taxation policies of the Government influence willingness to work, capital formation, modernization and expansion of plants, etc. Industrial policy affects the size, and capacity of plants. Tariff policies influence competition. Elimination of sick and inefficient units helps to improve productivity.

  7. Economic:

Size of the market, banking and credit facilities, transport and communication systems, etc. are important factors influencing productivity.

Productivity is an economics term which refers to the ratio of product to what is required to produce the product. Productivity is outcome of several interrelated factors. All the factors which are related to input and output components of a production process are likely to affect productivity.

So there are many factors, both internal and external, that can influence productivity. Knowing the internal and external factors that affect the productivity of an industrial organization gives industrial engineers the intelligence they need to sort out the low performance of resources and make strategic plans for the future.

The best thing about internal factors is that you can control many of them. External factors are all those things that are beyond your control. Dealing with all these factors requires different people and a variety of techniques and methods.

Edward Deming Quality Philosophies

Dr. William Edwards Deming (October 14, 1900 – December 20, 1993) was an American engineer, statistician, professor, author, lecturer, and management consultant. Educated initially as an electrical engineer and later specializing in mathematical physics, he helped develop the sampling techniques still used by the U.S. Department of the Census and the Bureau of Labor Statistics.

In his book The New Economics for Industry, Government, and Education, Deming championed the work of Walter Shewhart, including statistical process control, operational definitions, and what Deming called the “Shewhart Cycle,” which had evolved into Plan-Do-Check-Act (PDCA). Deming is best known for his work in Japan after WWII, particularly his work with the leaders of Japanese industry. That work began in July and August 1950, in Tokyo and at the Hakone Convention Center, when Deming delivered speeches on what he called “Statistical Product Quality Administration”. Many in Japan credit Deming as one of the inspirations for what has become known as the Japanese post-war economic miracle of 1950 to 1960, when Japan rose from the ashes of war on the road to becoming the second-largest economy in the world through processes partially influenced by the ideas Deming taught:

  • Better design of products to improve service
  • Higher level of uniform product quality
  • Improvement of product testing in the workplace and in research centers
  • Greater sales through side [global] markets

Deming also formulated 14 points for management; the headings that follow summarize them.

Create a constant purpose toward improvement.

  • Plan for quality in the long term.
  • Resist reacting with short-term solutions.
  • Don’t just do the same things better; find better things to do.
  • Predict and prepare for future challenges, and always have the goal of getting better.

Adopt the new philosophy.

  • Embrace quality throughout the organization.
  • Put your customers’ needs first, rather than reacting to competitive pressure, and design products and services to meet those needs.
  • Be prepared for a major change in the way business is done. It’s about leading, not simply managing.
  • Create your quality vision, and implement it.

Stop depending on inspections

  • Inspections are costly and unreliable, and they don’t improve quality; they merely find a lack of quality.
  • Build quality into the process from start to finish.
  • Don’t just find what you did wrong; eliminate the “wrongs” altogether.
  • Use statistical control methods, not physical inspections alone, to prove that the process is working.

Use a single supplier for any one item.

  • Quality relies on consistency: The less variation you have in the input, the less variation you’ll have in the output.
  • Look at suppliers as your partners in quality. Encourage them to spend time improving their own quality; they shouldn’t compete for your business based on price alone.
  • Analyze the total cost to you, not just the initial cost of the product.
  • Use quality statistics to ensure that suppliers meet your quality standards.

Improve constantly and forever.

  • Continuously improve your systems and processes. Deming promoted the Plan-Do-Check-Act approach to process analysis and improvement.
  • Emphasize training and education so everyone can do their jobs better.
  • Use kaizen as a model to reduce waste and to improve productivity, effectiveness, and safety.

Use training on the job.

  • Train for consistency to help reduce variation.
  • Build a foundation of common knowledge.
  • Allow workers to understand their roles in the “big picture.”
  • Encourage staff to learn from one another, and provide a culture and environment for effective teamwork.

Implement leadership

  • Expect your supervisors and managers to understand their workers and the processes they use.
  • Don’t simply supervise; provide support and resources so that each staff member can do his or her best. Be a coach instead of a policeman.
  • Figure out what each person actually needs to do his or her best.
  • Emphasize the importance of participative management and transformational leadership.
  • Find ways to reach full potential, and don’t just focus on meeting targets and quotas.

Eliminate fear

  • Allow people to perform at their best by ensuring that they’re not afraid to express ideas or concerns.
  • Let everyone know that the goal is to achieve high quality by doing more things right and that you’re not interested in blaming people when mistakes happen.
  • Make workers feel valued, and encourage them to look for better ways to do things.
  • Ensure that your leaders are approachable and that they work with teams to act in the company’s best interests.
  • Use open and honest communication to remove fear from the organization.

Break down barriers between departments.

  • Build the “internal customer” concept: recognize that each department or function serves other departments that use its output.
  • Build a shared vision.
  • Use cross-functional teamwork to build understanding and reduce adversarial relationships.
  • Focus on collaboration and consensus instead of compromise.

Get rid of unclear slogans.

  • Let people know exactly what you want; don’t make them guess. “Excellence in service” is short and memorable, but what does it mean? How is it achieved? The message is clearer in a slogan like “You can do better if you try.”
  • Don’t let words and nice-sounding phrases replace effective leadership. Outline your expectations, and then praise people face-to-face for doing good work.

Eliminate management by objectives

  • Look at how the process is carried out, not just numerical targets. Deming said that production targets encourage high output and low quality.
  • Provide support and resources so that production levels and quality are high and achievable.
  • Measure the process rather than the people behind the process.

Remove barriers to pride of workmanship.

  • Allow everyone to take pride in their work without being rated or compared.
  • Treat workers the same, and don’t make them compete with other workers for monetary or other rewards. Over time, the quality system will naturally raise the level of everyone’s work to an equally high level.

Implement education and self-improvement.

  • Improve the current skills of workers.
  • Encourage people to learn new skills to prepare for future changes and challenges.
  • Build skills to make your workforce more adaptable to change, and better able to find and achieve improvements.

Make “transformation” everyone’s job.

  • Improve your overall organization by having each person take a step toward quality.
  • Analyze each small step, and understand how it fits into the larger picture.
  • Use effective change management principles to introduce the new philosophy and ideas in Deming’s 14 points.

J Juran Quality Philosophies

Joseph Moses Juran (December 24, 1904 – February 28, 2008) was a Romanian-American engineer and management consultant. He was an evangelist for quality and quality management, having written several books on those subjects.

Pareto principle

In 1941, Juran stumbled across the work of Vilfredo Pareto and began to apply the Pareto principle to quality issues (for example, 80% of a problem is caused by 20% of the causes). This is also known as “the vital few and the trivial many.” In later years, Juran preferred “the vital few and the useful many” to signal that the remaining 80% of the causes should not be totally ignored.

For example, he argued that most defects are the result of a small percentage of all possible causes, according to The Economist. Likewise, 20% of a team’s members will produce 80% of a project’s successful results, and 20% of a business’s customers will create 80% of the profit.

Juran felt organizations, armed with that knowledge, would focus less on meaningless minutiae and more on identifying the vital 20%. That means eliminating the 20% of mistakes causing the majority of defects, rewarding the 20% of employees responsible for 80% of the success, and serving the 20% of loyal customers that drive sales. In a way, the Pareto principle puts numbers to the idea that in business, as in life, things are not evenly distributed. Pareto was studying land ownership in Italy, but Juran saw that the principle applied to business as well.
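Identifying the vital few is a straightforward calculation: sort the causes by how many defects they account for, then accumulate until 80% of the total is explained. The defect counts below are invented for illustration:

```python
# A sketch of a Pareto analysis: find the smallest set of causes
# (the "vital few") that together account for 80% of all defects.
# Cause names and counts are hypothetical.

defects = {
    "Config error": 120,
    "Bad input data": 90,
    "Hardware fault": 20,
    "Operator mistake": 15,
    "Network glitch": 5,
}

total = sum(defects.values())
vital_few, cumulative = [], 0

# Walk the causes from largest to smallest contribution.
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    vital_few.append(cause)
    cumulative += count
    if cumulative / total >= 0.8:   # stop once 80% of defects are explained
        break

print(vital_few)
```

Here two of the five causes explain more than 80% of the defects, which is exactly the kind of skew the Pareto principle predicts.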

Juran Trilogy

“Goal setting has traditionally been based on past performance. This practice has tended to perpetuate the sins of the past.”

In his focus on people and how they work in processes, Juran took a different approach than others working in the growing quality improvement field. In doing so, he completely changed how companies looked at reducing inefficiencies.

Juran found the hidden costs in how companies tended to deal with defects. In the early 20th century, that often meant dealing with the issue after it had occurred rather than focusing time and money on making quality improvements to keep defects from happening.

He developed the Juran Trilogy, which involved three principal areas:

Quality planning: This involves identifying your customers, determining their needs and developing products that respond to their needs.

Quality improvement: Develop a process to create the product and then optimize that process.

Quality control: Create a process that can operate under minimal inspection.

Quality Planning

Quality planning is the activity of developing the products and processes required to meet customers’ needs. It involves:

  • Establish quality goals
  • Identify the customers (those who will be impacted by the efforts to meet the goals)
  • Determine the customers’ needs
  • Develop product features that respond to customers’ needs
  • Develop processes that can produce those product features
  • Establish process controls, and transfer the resulting plans to the operating forces

Quality Improvement

This process is the means of raising quality performance to unprecedented levels (breakthrough). This involves:

  • Establish the quality improvement infrastructure
  • Identify the improvement projects
  • For each project establish a project team with clear responsibility
  • Provide the resource, motivation, and training needed by the team

Quality Control

This process consists of the following steps:

  • Evaluate actual quality performance
  • Compare actual performance to quality goals
  • Act on the difference

Juran’s 10 steps to quality improvement

  • Build awareness of the need and opportunity to improve
  • Set goals for that improvement
  • Create plans to reach the goals
  • Provide training
  • Conduct projects to solve problems
  • Report on progress
  • Give recognition for success
  • Communicate results
  • Keep score
  • Maintain momentum

Philip Crosby's Quality Philosophies

Philip Bayard “Phil” Crosby (June 18, 1926 – August 18, 2001) was a businessman and author who contributed to management theory and quality management practices.

Crosby initiated the Zero Defects program at the Martin Company. As the quality control manager of the Pershing missile program, Crosby was credited with a 25 percent reduction in the overall rejection rate and a 30 percent reduction in scrap costs.

The Absolutes of Quality Management

Crosby defined Four Absolutes of Quality Management:

  • The First Absolute: The definition of quality is conformance to requirements
  • The Second Absolute: The system of quality is prevention
  • The Third Absolute: The performance standard is zero defects
  • The Fourth Absolute: The measurement of quality is the price of non-conformance
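
The fourth absolute makes quality measurable: tally what it costs to do things wrong (scrap, rework, warranty claims, and the like) and report that figure to management. As a minimal sketch of such a tally (the cost categories and figures are invented for illustration):

```python
# Hypothetical example of measuring quality as the price of
# non-conformance (PONC): the sum of what it costs to do things wrong.
nonconformance_costs = {
    "scrap": 12_000.0,              # material thrown away
    "rework": 8_500.0,              # labour spent fixing defects
    "warranty_claims": 4_200.0,     # cost of defects that reached customers
    "expedited_shipping": 1_300.0,  # rushing replacements to customers
}

ponc = sum(nonconformance_costs.values())
revenue = 500_000.0
print(f"PONC: ${ponc:,.0f} ({ponc / revenue:.1%} of revenue)")
# -> PONC: $26,000 (5.2% of revenue)
```

Expressing the total as a share of revenue is what turns quality into a management tool: it states the cost of poor quality in the same terms as every other business figure.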

Fourteen Steps to Quality Improvement

  1. Management Commitment

Make it clear that management is committed to quality.

  2. Quality Improvement Teams

Form Quality Improvement Teams with senior representatives from each department.

  3. Measure Processes

Measure processes to determine where current and potential quality problems lie.

  4. Cost of Quality

Evaluate the cost of quality and explain its use as a management tool.

  5. Quality Awareness

Raise the quality awareness and personal concern of all employees.

  6. Correct Problems

Take actions to correct problems identified through previous steps.

  7. Monitor Progress

Establish progress monitoring for the improvement process.

  8. Train Supervisors

Train supervisors to actively carry out their part of the quality improvement program.

  9. Zero Defects Day

Hold a Zero Defects Day to reaffirm management commitment.

  10. Establish Improvement Goals

Encourage individuals to establish improvement goals for themselves and their groups.

  11. Remove Fear

Encourage employees to tell management about obstacles to improving quality.

  12. Recognize

Recognize and appreciate those who participate.

  13. Quality Councils

Establish Quality Councils to communicate on a regular basis.

  14. Repeat the Cycle

Do it all over again to emphasize that the quality improvement process never ends.

Zero Defects

Crosby’s Zero Defects is a performance method and standard which holds that people should commit themselves to closely monitoring details and avoiding errors. By doing so, they move closer to the zero defects goal. According to Crosby, zero defects was not just a manufacturing principle but an all-pervading philosophy that ought to influence every decision we make. It reinforces the managerial notions that defects are unacceptable and that everyone should do things right the first time.

The Quality Vaccine

Crosby described this vaccine as the medicine organizations need to prevent poor quality. It consists of five ingredients:

  • Integrity: Quality must be taken seriously throughout the entire organization, from the highest levels to the lowest. The company’s future will be judged by the quality it delivers.
  • Systems: The right measures and systems are necessary for quality costs, performance, education, improvement, review, and customer satisfaction.
  • Communication: Communication is a very important factor in an organization. It is required to communicate the specifications, requirements, and improvement opportunities of the organization. Listening to customers and operatives intently and incorporating feedback will give the organization an edge over the competition.
  • Operations: A culture of improvement should be the norm in any organization, and its processes should be solid.
  • Policies: The policies that are implemented should be consistent and clear throughout the organization.