Sigma features, Enablers, Goals, DMAIC/DMADV

Sigma features

Inventory management plays two critical roles in Lean Six Sigma: first, managing raw materials and semi-finished goods in the lean manufacturing process; second, controlling finished-goods inventory held in a warehouse by manufacturers. In both cases, the aim is to hold just enough stock to meet demand while minimizing inventory carrying costs.

As a component of lean manufacturing, stock levels of raw materials and semi-finished products need to be controlled and managed. Similar to Japanese kaizen and Just-in-Time principles, just the right amount of stock should be held to keep production schedules going. This will reduce waste and optimize the production workflow.

For finished goods, excess and obsolete inventory are major business issues, often involving deadstock and seasonality. A typical solution addresses the excess inventory itself: you might sell it below cost or donate it, for example. But this does not address the root cause.

Lean Six Sigma inventory management builds on the following features:

  • Identify the value that a company will get from Lean Inventory Management.
  • Optimize the flow of inventory through the business by removing obstacles in the way. This comes from the Japanese 5S Lean principle (Sort, Straighten, Sweep, Standardize, Sustain).
  • Move inventory only when requested by the customer. This is adapted from the Kanban Lean principle.
  • Be flexible and adapt to change. This is influenced by the Kaizen Lean principle.
  • Continuously refine your inventory management processes to improve quality, cycle time, efficiency and cost. This is derived from the DMAIC Six Sigma methodology.

Principles

  • Demand management. You should only move inventory upon an order by a customer.
  • Cost and waste reduction. But not to the extent of negatively affecting the customer.
  • Process standardization. Standardizing, for example, on transportation and business processes.
  • Industry standardization. Standardizing on product parts and components.
  • Cultural change. Everyone along the supply chain must work as a team. Similarly, this echoes principles from Just-in-Time manufacturing.
  • Cross-organization collaboration. Teams that cut across the organization can help to understand value better, providing a holistic view similar to that of the customer.

Enablers, Goals

DMAIC/DMADV

DMAIC

Define: the problem, the goal, and the reason the issue needs to be resolved.

Measure: the current state as a baseline and use it as a starting point for improvement.

Analyze: identify the root cause with data-driven tools and validate why the issue is happening.

Improve: identify creative solutions that eliminate the major root causes, so the problem is fixed and similar issues are prevented in the future.

Control: maintain the improvements and sustain their success over time.

DMADV

DMADV is the acronym for the framework of Design for Six Sigma (DFSS), which is used when developing a brand-new service or product within a business. The acronym stands for Define, Measure, Analyze, Design, and Verify.

Define: state the goal of the new product or service, why it is needed, and set realistic and measurable targets.

Measure: determine which factors are critically important; this should include any parameters, including risks, as well as the production process and product capability.

Analyze: develop design alternatives, work through different combinations and outcomes, and select the best-performing components.

Design: develop a detailed prototype, then refine it into a more detailed version; errors found along the way may require modifying the current design.

Verify: in this final step, the newly designed product is tested in real-world conditions to see whether it works as intended. Several production runs may be necessary to confirm that quality is as high as it can be.

DMAIC and DMADV do have a number of similarities that are worth noting. They both use statistical tools and facts in order to find solutions to common quality-related problems, and both focus on reaching the business and financial goals of an organization. DMAIC and DMADV are implemented by Green Belts, Black Belts and Master Black Belts, and both are used to reduce defects to fewer than 3.4 per million opportunities, or Six Sigma. Their solutions are data intensive and based only on hard facts.

The two most widely used Six Sigma methodologies are DMAIC and DMADV. Both methods are designed to make a business process more efficient and effective. While both of these methodologies share some important characteristics, they are not interchangeable and were developed for use in differing business processes. Before comparing these two approaches in more detail, let’s review what the acronyms stand for.

  • DMAIC: Define, Measure, Analyze, Improve, Control
  • DMADV: Define, Measure, Analyze, Design, Verify

Despite the shared first three letters of their names, there are some notable differences between them. The main difference exists in the way the final two steps of the process are handled. With DMADV, the Design and Verify steps deal with redesigning a process to match customer needs, as opposed to the Improve and Control steps that focus on determining ways to readjust and control the process. DMAIC typically defines a business process and how applicable it is; DMADV defines the needs of the customer as they relate to a service or product.

With regards to measurement, DMAIC measures current performance of a process while DMADV measures customer specifications and needs. Control systems are established with DMAIC in order to keep check on the business’ future performance, while with DMADV, a suggested business model must undergo simulation tests to verify efficacy.

DMAIC concentrates on making improvements to a business process in order to reduce or eliminate defects; DMADV develops an appropriate business model destined to meet the customers’ requirements.

TAGUCHI’s Quality Engineering

The Taguchi method of quality control is an approach to engineering that emphasizes the roles of research and development (R&D), product design and development in reducing the occurrence of defects and failures in manufactured goods.

This method, developed by Japanese engineer and statistician Genichi Taguchi, considers design to be more important than the manufacturing process in quality control, aiming to eliminate variances in production before they can occur.

Taguchi methods are statistical methods, sometimes called robust design methods, developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering, biotechnology, marketing and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi’s development of designs for studying variation, but have criticized the inefficiency of some of Taguchi’s proposals.

Taguchi’s work includes three principal contributions to statistics:

  • A specific loss function
  • The philosophy of off-line quality control
  • Innovations in the design of experiments

The Taguchi method gauges quality as a calculation of loss to society associated with a product. In particular, loss in a product is defined by variations and deviations in its function as well as detrimental side effects that result from the product.

Loss from variation in function is a comparison of how much each unit of the product differs in the way it operates. The greater that variance, the more significant the loss in function and quality. This could be represented as a monetary figure denoting how usage has been impacted by defects in the product.

Taguchi’s use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher’s methods in the design of experiments, Taguchi interpreted Fisher’s methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher’s work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as where we deny that losses exist. As we diverge from nominal, losses grow until the point where losses are too great to deny and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

  • Larger the better (for example, agricultural yield);
  • Smaller the better (for example, carbon dioxide emissions); and
  • On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons:

  • It is the first “symmetric” term in the Taylor series expansion of real analytic loss-functions.
  • Total loss is measured by the variance. For uncorrelated random variables, since variance is additive, the total loss is an additive measurement of cost.
  • The squared-error loss function is widely used in statistics, following Gauss’s use of the squared-error loss function in justifying the method of least squares.
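As a rough illustration (not taken from Taguchi's own texts), the on-target case can be sketched with the quadratic loss L(y) = k * (y - m)^2, where m is the nominal value and k is a cost constant; all the numbers below are invented for demonstration:

```python
# Hypothetical sketch of Taguchi's squared-error (quadratic) loss.
# m (target) and k (cost constant) are assumed values, not from the text.

def taguchi_loss(y, target, k):
    """Loss to society for a unit with measured value y."""
    return k * (y - target) ** 2

# Example: a shaft with nominal diameter 10.0 mm; k is chosen so a unit at
# the specification limit (10.5 mm) incurs a $4 loss: k = 4 / 0.5**2 = 16.
k = 16.0
print(taguchi_loss(10.0, 10.0, k))   # on target: 0.0
print(taguchi_loss(10.25, 10.0, k))  # 1.0
print(taguchi_loss(10.5, 10.0, k))   # 4.0
```

Note how loss grows smoothly with distance from nominal, rather than jumping from zero to the rework cost at the specification limit.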

Taguchi’s rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation of the final product quality is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

  • System design
  • Parameter (measure) design
  • Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi’s radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Robust parameter designs consider controllable and uncontrollable noise variables; they seek to exploit relationships and optimize settings that minimize the effects of the noise variables.
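A toy simulation can convey the idea of robustification. The response function and all settings below are invented; the point is only that a control-factor setting is chosen to minimize the output variance induced by a noise variable:

```python
import random

# Hypothetical sketch of robust parameter design: for each candidate setting
# of a control factor, simulate a noise variable and keep the setting whose
# output varies least. The response function is made up for illustration.

random.seed(0)

def output(control, noise):
    # Invented response: sensitivity to noise depends on the control setting,
    # and vanishes entirely at control = 2.0.
    return 10.0 + (control - 2.0) * noise

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

settings = [1.0, 2.0, 3.0]
noise_samples = [random.gauss(0.0, 1.0) for _ in range(1000)]

robust = min(settings, key=lambda c: variance([output(c, n) for n in noise_samples]))
print(robust)  # 2.0, the setting at which noise has no effect on the output
```

Real robust designs use orthogonal arrays rather than brute-force simulation, but the objective, picking parameter values that damp the effect of noise, is the same.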

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.

Management of interactions

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for “control factors” or factors in the “inner array”. By combining an inner array of control factors with an outer array of “noise factors”, Taguchi’s approach provides “full information” on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.

Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a “confirmation experiment” offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the “likelihood” of control factor-by-control factor interactions is greatly reduced, since “energy” is “additive”.

Inefficiencies of Taguchi’s designs

Interactions are part of the real world. In Taguchi’s arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the “sequential assembly” of designs: In the RSM approach, a screening design is followed by a “follow-up design” that resolves only the confounded interactions judged worth resolution. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi’s designs.

Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss to society, techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.

Ishikawa Fish Bone, Applications in Organizations

Ishikawa diagrams (also called fishbone diagrams, herringbone diagrams, cause-and-effect diagrams, or Fishikawa) are causal diagrams created by Kaoru Ishikawa that show the potential causes of a specific event.

The fishbone diagram or Ishikawa diagram is a cause-and-effect diagram that helps managers to track down the reasons for imperfections, variations, defects, or failures.

The diagram looks just like a fish’s skeleton with the problem at its head and the causes for the problem feeding into the spine. Once all the causes that underlie the problem have been identified, managers can start looking for solutions to ensure that the problem doesn’t become a recurring one.

The fishbone diagram can also be used in product development. Having a problem-solving product will ensure that your new development is popular, provided people care about the problem you’re trying to solve. The fishbone diagram strives to pinpoint everything that’s wrong with current market offerings so that you can develop an innovation that doesn’t have these problems.

Common uses of the Ishikawa diagram are product design and quality defect prevention to identify potential factors causing an overall effect. Each cause or reason for imperfection is a source of variation. Causes are usually grouped into major categories to identify and classify these sources of variation.

Advantages

  • Highly visual brainstorming tool which can spark further examples of root causes
  • Quickly identify if the root cause is found multiple times in the same or different causal tree
  • Allows one to see all causes simultaneously
  • Good visualization for presenting issues to stakeholders

Disadvantages

  • Complex defects might yield a lot of causes which might become visually cluttering
  • Interrelationships between causes are not easily identifiable

Variation = Imperfection

When it comes to quality and efficiency, variation is your enemy. Whatever your business is, you don’t want to leave anything up to chance. From the moment your client contacts you, a predictable process should be followed with its aim being complete customer satisfaction. Variation in the process will mean variation in the product.

Fishbone diagrams help you to determine the variables that may enter the equation. They allow you to make your plans so that you know how to deal with them in such a way that the quality of your final product is still up to standard and without significant variation.

Applications

  • To analyze and find the root cause of a complicated problem
  • When there are many possible causes for a problem
  • If the traditional way of approaching the problem (trial and error, trying all possible causes, and so on) is very time consuming
  • The problem is very complicated and the project team cannot identify the root cause

Construct a Fishbone diagram

Here are the various tasks involved in constructing a Fishbone diagram:

  1. Define the problem
  2. Brainstorm
  3. Identify causes

Define the problem

The first step is fairly simple and straightforward. You have to define the problem for which the root cause has to be identified. Usually the project manager or technical architect (we will refer to this role as the leader throughout the rest of the article) decides which problem to brainstorm. The leader has to choose problems that are critical, that need a permanent fix, and that are worth brainstorming with the team. The leader can also moderate the whole process.

After the problem is identified, the leader can start constructing the Fishbone diagram. Using a sheet of paper, she defines the problem in a square box on the right side of the page. She draws a straight line from the left to the problem box with an arrow pointing towards the box. The problem box now becomes the fish head, and its bones are laid in the further steps. At the end of the first step, the Fishbone diagram is simply an arrow pointing into the problem box.

Brainstorm

People have difficulty understanding how to structure the thought process around a large problem domain. Sometimes it is useful to focus on logically related items of the problem domain and to represent them in the Fishbone diagram, which will convey the problem solving methodology. There are quite a few tools available that can help us in this regard, including:

  • Affinity Chart
    Organizes facts, opinions, ideas, and issues into a natural grouping. This grouping is in turn used as an aid in diagnosing complex problems.
  • Brainstorming
    Gathers ideas from people who are potential contributors. This process is discussed further in the following sections.
  • Check sheet
    Acts as a simple data recording device that helps to delineate important items and characteristics to direct attention to them and verify that they are evaluated.
  • Flow charts
    Organizes information about a process in a graphical manner and makes it clear who is impacted at every stage.

No single methodology is applicable to all problem domains. Based on experience and study, you can identify, thoroughly analyze, and maintain the methodology and the related problem domains. In the example given later in this article, we use brainstorming as the problem solving methodology.

Categorize

When you apply the Fishbone technique to business problems, the possible causes are usually classified into six categories:

  • Method
  • Man
  • Management
  • Measurement
  • Material
  • Machine

Though the above are a few important problem categories, during the brainstorming session, the team is encouraged to come up with all possible categories. The above categories give the team direction to help find the possible causes. Some of the categories listed above may or may not be applicable to software or to Domino in particular. Let’s look briefly at each category.

  • Method: Methods are ways of doing things or the procedures followed to accomplish a task. A typical cause under the Method category is that instructions were not followed or the instructions themselves were wrong.
  • Man: People may be responsible for the problem; it may have been caused by people who are inexperienced, who cannot answer prompted questions, and so on.
  • Management: Management refers to project management; poor management decisions, such as upgrading two components simultaneously rather than deploying changes serially, may cause technical problems.
  • Measurement: Measurement refers to metrics derived from a project. Problems may occur if measurements are wrong or the measurement technique used is not relevant.
  • Material: Material refers to a physical thing; a bad diskette is one typical example. Software can’t always handle errors caused by bad material, for instance a bad backup tape, so while material may be the least likely cause, it is a possible cause.
  • Machine: A machine in software usually refers to the hardware, and there are many ways a problem can be due to the machine, such as performance issues.

Other possible categories include policies, procedure, technology, and so on.

After identifying a problem, the leader initiates a discussion with the project team to gather information about the possible causes, finally arriving at the root cause. The team can either analyze each of the above categories for possible causes or look into other categories (not listed above).
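As a toy illustration (not from the article), the content of a fishbone diagram can be represented as a mapping from the categories above to brainstormed causes; the problem statement and every cause below are invented:

```python
# Hypothetical fishbone content: the problem (fish head) plus causes grouped
# under the six categories discussed above. All entries are made-up examples.

fishbone = {
    "problem": "Nightly build fails intermittently",
    "causes": {
        "Method": ["deployment steps run in the wrong order"],
        "Man": ["new team member unfamiliar with the build script"],
        "Management": ["two components upgraded simultaneously"],
        "Measurement": ["flaky test incorrectly counted as a pass"],
        "Material": ["corrupt cached build artifact"],
        "Machine": ["build server low on disk space"],
    },
}

# Print the diagram as an indented outline: head first, then each bone.
print(fishbone["problem"])
for category, causes in fishbone["causes"].items():
    for cause in causes:
        print(f"  {category}: {cause}")
```

Secondary causes ("bonelets") could be represented by nesting lists another level deep under each cause.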

Identify causes

While brainstorming, the team should strive toward identifying major causes (categories) first, which can be further discussed, and then secondary causes for each major cause can be identified and discussed. This helps the team to concentrate on one major cause at a time and to refine further for possible secondary causes.

After the major causes (usually four to six) are identified, you can connect them as fishbones in the Fishbone diagram. They are represented as slanted lines with the arrow pointing towards the backbone of the fish. See Figure 2 later in this article.

Sometimes it is difficult to arrive at a few major causes. The team may come up with a lot of causes, which makes brainstorming more difficult. In this case, the leader can assign 10 points to each team member for each possible cause, and let them assign the rating (from 1 to 10, 10 being most likely cause) to each cause. After everyone on the team has rated the causes, the project manager totals each of the causes and ranks them based on their ratings. From the list, the top four to six causes are identified as major causes and connected as bones in the diagram.
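The rating step above can be sketched in a few lines: each team member rates every candidate cause from 1 to 10, the ratings are totalled, and the causes are ranked by total. The member names, causes, and scores below are all invented:

```python
from collections import Counter

# Hypothetical ratings: member -> {cause: rating from 1 (unlikely) to 10 (most likely)}.
ratings = {
    "alice": {"unclear specs": 9, "manual deployment": 7, "bad test data": 4},
    "bob":   {"unclear specs": 8, "manual deployment": 9, "bad test data": 3},
    "carol": {"unclear specs": 10, "manual deployment": 6, "bad test data": 5},
}

# Total each cause's score across all team members.
totals = Counter()
for member_ratings in ratings.values():
    totals.update(member_ratings)

# Rank causes by total score, highest first; the top few become major bones.
major_causes = [cause for cause, score in totals.most_common()]
print(major_causes)  # ['unclear specs', 'manual deployment', 'bad test data']
```

From a longer list, only the top four to six entries of `major_causes` would be drawn as bones.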

The diagram looks a little like the skeleton of a fish, hence the name Fishbone. After the major causes of the problem are identified, each one of them is discussed in further detail with the team to find out the secondary causes. If needed, the secondary causes are further discussed to obtain the next level of possible causes. Each of the major causes is laid as a fishbone in the diagram and the secondary causes as “bonelets.”

The diagram now has a comprehensive list of possible causes for the problem, though the list may not be exhaustive or complete. However, the team has enough information to begin discussing the individual causes and to analyze their relevance to the problem. The team can use analytical, statistical, and graphical tools to assist in evaluating each of the causes. The Pareto principle (explained in part two of this article series) is also used to find the elements that cause major problems and to list them as major causes in the Fishbone diagram. Software metrics that are obtained during application support can also be used here for further assistance.

Evaluate, decide, and take action

It may be very difficult for a large team to reach consensus on a single root cause; in that case, the majority view is taken into consideration. Also, the major causes can be ranked in order, with the most likely cause at the top of the list.

After the evaluation process is complete, the action plan has to be decided. If one root cause is identified, an action plan has to be derived to rectify it. Sometimes it may be difficult to arrive at a single root cause; there may be a few possible root causes. In this case, an action plan has to be drawn up for each of the possible root causes.

After the action plan is ready, the leader can designate an individual or team to work on the plan and to rectify the problem permanently. If there are a few possible root causes, all the action plans are to be executed, and the most likely root cause is identified and fixed.

Characteristics of Quality, Quality Assurance

Generally, a product can be said to be of satisfactory quality if it satisfies the consumer or user. The consumer will buy a product or service only if it suits their requirements.

Therefore, consumers’ requirements are first assessed by the marketing department, and the quality decision is then taken on the basis of the information collected.

Eight dimensions of product quality management can be used at a strategic level to analyze quality characteristics. The concept was defined by David A. Garvin, formerly C. Roland Christensen Professor of Business Administration at Harvard Business School (died 30 April 2017). Garvin was posthumously honored with the prestigious award for ‘Outstanding Contribution to the Case Method’ on 4 March 2018.

Some of the dimensions are mutually reinforcing, whereas others are not: improvement in one may come at the expense of others. Understanding the trade-offs desired by customers among these dimensions can help build a competitive advantage.

  • Performance: Performance refers to a product’s primary operating characteristics. This dimension of quality involves measurable attributes; brands can usually be ranked objectively on individual aspects of performance.
  • Features: Features are additional characteristics that enhance the appeal of the product or service to the user.
  • Reliability: Reliability is the likelihood that a product will not fail within a specific time period. This is a key element for users who need the product to work without fail.
  • Conformance: Conformance is the precision with which the product or service meets the specified standards.
  • Durability: Durability measures the length of a product’s life. When the product can be repaired, estimating durability is more complicated. The item will be used until it is no longer economical to operate it. This happens when the repair rate and the associated costs increase significantly.
  • Serviceability: Serviceability is the speed with which the product can be put into service when it breaks down, as well as the competence and the behavior of the service person.
  • Aesthetics: Aesthetics is the subjective dimension indicating the kind of response a user has to a product. It represents the individual’s personal preference.
  • Perceived Quality: Perceived Quality is the quality attributed to a good or service based on indirect measures.

Quality Characteristics:

A quality characteristic is an element that makes a product or item fit for use. Quality characteristics also provide the means by which fitness for use is translated into the technologists’ language for managing quality. Quality characteristics are classified into categories called ‘parameters’ of fitness for use.

Two such major parameters are known as:

(i) Quality of design

(ii) Quality of conformance

The quality of design concerns consumer satisfaction with the intentional variation in product quality, popularly called “grades”. In contrast, the quality of conformance is the extent to which products, items, and services conform to the intent of the design.

Process capability, inspection, and process control are involved in achieving this conformance, so that the products and goods produced meet the pre-decided specifications.

Quality Assurance

Quality assurance (QA) is a way of preventing mistakes and defects in manufactured products and avoiding problems when delivering products or services to customers; which ISO 9000 defines as “part of quality management focused on providing confidence that quality requirements will be fulfilled”. This defect prevention in quality assurance differs subtly from defect detection and rejection in quality control and has been referred to as a shift left since it focuses on quality earlier in the process (i.e., to the left of a linear process diagram reading left to right).

The terms “quality assurance” and “quality control” are often used interchangeably to refer to ways of ensuring the quality of a service or product. For instance, the implementation of inspection and structured testing as a measure of quality assurance in a television set software project at Philips Semiconductors has been described in these terms. The term “control”, however, is used to describe the fifth phase of the Define, Measure, Analyze, Improve, Control (DMAIC) model. DMAIC is a data-driven quality strategy used to improve processes.

Quality assurance comprises administrative and procedural activities implemented in a quality system so that requirements and goals for a product, service or activity will be fulfilled. It is the systematic measurement, comparison with a standard, monitoring of processes and an associated feedback loop that confers error prevention. This can be contrasted with quality control, which is focused on process output.

Quality assurance includes two principles: “Fit for purpose” (the product should be suitable for the intended purpose); and “right first time” (mistakes should be eliminated). QA includes management of the quality of raw materials, assemblies, products and components, services related to production, and management, production and inspection processes. The two principles also manifest before the background of developing (engineering) a novel technical product: The task of engineering is to make it work once, while the task of quality assurance is to make it work all the time.

Quality Assurance Process steps:

  • Plan: The organization should plan and establish process-related objectives and determine the processes required to deliver a high-quality end product.
  • Do: Develop and test the processes, and implement changes to them.
  • Check: Monitor the processes, modify them as needed, and check whether they meet the predetermined objectives.
  • Act: A quality assurance tester should implement the actions necessary to achieve improvements in the processes.

Concepts of Productivity, Modes of calculating productivity

Productivity is an overall measure of the ability to produce a good or service. More specifically, productivity is the measure of how specified resources are managed to accomplish timely objectives as stated in terms of quantity and quality. Productivity may also be defined as an index that measures output (goods and services) relative to the input (labor, materials, energy, etc., used to produce the output). As such, it can be expressed as:

Productivity = Output / Input

Hence, there are two major ways to increase productivity: increase the numerator (output) or decrease the denominator (input). Of course, a similar effect would be seen if both input and output increased, but output increased faster than input; or if input and output decreased, but input decreased faster than output.

Organizations have many options for applying this formula: labor productivity, machine productivity, capital productivity, energy productivity, and so on. A productivity ratio may be computed for a single operation, a department, a facility, an organization, or even an entire country.

Productivity is an objective concept. As an objective concept it can be measured, ideally against a universal standard. As such, organizations can monitor productivity for strategic reasons such as corporate planning, organization improvement, or comparison to competitors. It can also be used for tactical reasons such as project control or controlling performance to budget.

Productivity is also a scientific concept, and hence can be logically defined and empirically observed. It can also be measured in quantitative terms, which qualifies it as a variable. Therefore, it can be defined and measured in absolute or relative terms. However, an absolute definition of productivity is not very useful; it is much more useful as a concept dealing with relative productivity or as a productivity factor.

Productivity is useful as a relative measure of actual output of production compared to the actual input of resources, measured across time or against common entities. As output increases for a level of input, or as the amount of input decreases for a constant level of output, an increase in productivity occurs. Therefore, a “productivity measure” describes how well the resources of an organization are being used to produce output.

Productivity is often confused with efficiency. Efficiency is generally seen as the ratio of the time needed to perform a task to some predetermined standard time. However, doing unnecessary work efficiently is not exactly being productive. It would be more correct to interpret productivity as a measure of effectiveness (doing the right thing efficiently), which is outcome-oriented rather than output-oriented.

Productivity is usually expressed in one of three forms: partial factor productivity, multifactor productivity, and total productivity.

Productivity refers to the physical relationship between the quantity produced (output) and the quantity of resources used in the course of production (input).

“It is the ratio between the output of goods and services and the input of resources consumed in the process of production.”

Productivity is the ratio between output of wealth and input of resources used in production processes.

Productivity = Measure of output / Measure of input

Total Productivity:

Pt = Qt / (L + C + R + M)

where:

Pt = Total productivity

Qt = Total output

L = Labour input

C = Capital input

R = Raw material and purchased parts input

M = Other miscellaneous goods and services input factors
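The total and partial productivity ratios above can be sketched in code. The figures below are hypothetical and assume all inputs and outputs are expressed in the same monetary unit.

```python
# Sketch of total and partial (single-factor) productivity.
# All figures are hypothetical, in a common monetary unit.

def total_productivity(output, labour, capital, materials, misc):
    """Pt = Qt / (L + C + R + M)"""
    return output / (labour + capital + materials + misc)

def partial_productivity(output, single_input):
    """Partial factor productivity, e.g. labour productivity = output / labour input."""
    return output / single_input

qt = 1_000_000                                    # total output, Qt
l, c, r, m = 250_000, 300_000, 350_000, 100_000   # L, C, R, M input factors

pt = total_productivity(qt, l, c, r, m)   # 1_000_000 / 1_000_000 = 1.0
labour_p = partial_productivity(qt, l)    # 1_000_000 / 250_000 = 4.0
print(pt, labour_p)
```

Raising output with the same inputs, or holding output while cutting any denominator term, raises the corresponding ratio, which is exactly the two levers described above.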

Productivity can be increased by:

  • Generating more outputs from same level of inputs.
  • Producing same level of outputs with reduced level of inputs.
  • A combination of both.

Importance of Productivity:

The concept of productivity is of great significance for underdeveloped and developing countries. In both cases there are limited resources that should be used to get the maximum output, i.e. there should be a tendency to perform a job by cheaper, safer and quicker ways.

The aim should be optimum use of resources so as to provide maximum satisfaction with minimum effort and expenditure. Productivity analysis and measures indicate the stages and situations where improvement in the working of inputs is possible to increase the output.

The productivity indicators can be used for different purposes, viz. comparison of performance across organizations, assessment of the contribution of different input factors, bargaining with trade unions, etc.

Factors Affecting:

Productivity is the outcome of several factors. These factors are so interrelated that it is difficult to identify the effect of any one factor on productivity.

These factors may broadly be divided as follows:

  1. Human:

Human nature and human behaviour are the most significant determinants of productivity.

Human factors may further be classified into two categories as given below:

(a) Ability to work: Productivity of an organization depends upon the competence and calibre of its people both workers and managers. Ability to work is governed by education, training, experience, aptitude, etc. of the employees.

(b) Willingness to work: Motivation and morale of people is the second important group of human factors that determine productivity. Wage incentive schemes, labour participation in management, communication system, informal group relations, promotion policy, union management relations, quality of leadership, etc., are the main factors governing employees’ willingness to work. Working conditions like working hours, sanitation, ventilation, schools, clubs, libraries, subsidized canteen, company transport, etc., also influence the motivation and morale of employees.

  2. Technological:

Technological factors exercise significant influence on the level of productivity.

(a) Size and capacity of plant

(b) Product design and standardization

(c) Timely supply of materials and fuel

(d) Rationalization and automation measures

(e) Repairs and maintenance

(f) Production planning and control

(g) Plant layout and location

(h) Materials handling system

(i) Inspection and quality control

(j) Machinery and equipment used

(k) Research and development

(l) Inventory control

(m) Reduction and utilization of waste and scrap, etc.

  3. Managerial:

The competence and attitudes of managers have an important bearing on productivity. In many organizations, productivity is low despite latest technology and trained manpower. This is due to inefficient and indifferent management. Competent and dedicated managers can obtain extraordinary results from ordinary people.

Job performance of employees depends on their ability and willingness to work, and management is the catalyst that creates both. Advanced technology requires knowledge workers, who in turn work productively under professionally qualified managers. No ideology can win greater output with less effort; it is only through sound management that optimum utilization of human and technical resources can be secured.

  4. Natural:

Natural factors such as physical, geological, geographical and climatic conditions exert considerable influence on productivity, particularly in extractive industries. For example, productivity of labour in extreme climates (too cold or too hot) tends to be comparatively low. Natural resources like water, fuel and minerals influence productivity.

  5. Sociological:

Social customs, traditions and institutions influence attitudes towards work and job. For instance, bias on the basis of caste, religion, etc., inhibited the growth of modern industry in some countries. The joint family system affected incentive to work hard in India. Close ties with land and native place hampered stability and discipline among industrial labour.

  6. Political:

Law and order, stability of Government, harmony between States, etc. are essential for high productivity in industries. Taxation policies of the Government influence willingness to work, capital formation, modernization and expansion of plants, etc. Industrial policy affects the size and capacity of plants. Tariff policies influence competition. Elimination of sick and inefficient units helps to improve productivity.

  7. Economic:

Size of the market, banking and credit facilities, transport and communication systems, etc. are important factors influencing productivity.

Productivity is an economics term which refers to the ratio of product to what is required to produce the product. Productivity is outcome of several interrelated factors. All the factors which are related to input and output components of a production process are likely to affect productivity.

Many factors, internal and external, can influence productivity. Knowing the internal and external factors that affect the productivity of an industrial organization gives industrial engineers the insight they need to diagnose poor resource performance and make strategic plans for the future.

The best thing about internal factors is that you can control many of them, while external factors are beyond your control. Dealing with all these factors requires different people and a variety of techniques and methods.

Edward Deming Quality Philosophies

Dr. William Edwards Deming (October 14, 1900 – December 20, 1993) was an American engineer, statistician, professor, author, lecturer, and management consultant. Educated initially as an electrical engineer and later specializing in mathematical physics, he helped develop the sampling techniques still used by the U.S. Department of the Census and the Bureau of Labor Statistics.

In his book The New Economics for Industry, Government, and Education, Deming championed the work of Walter Shewhart, including statistical process control, operational definitions, and what Deming called the “Shewhart Cycle,” which had evolved into Plan-Do-Check-Act (PDCA). Deming is best known for his work in Japan after WWII, particularly his work with the leaders of Japanese industry. That work began in July and August 1950, in Tokyo and at the Hakone Convention Center, when Deming delivered speeches on what he called “Statistical Product Quality Administration”. Many in Japan credit Deming as one of the inspirations for what has become known as the Japanese post-war economic miracle of 1950 to 1960, when Japan rose from the ashes of war on the road to becoming the second-largest economy in the world through processes partially influenced by the ideas Deming taught:

  • Better design of products to improve service
  • Higher level of uniform product quality
  • Improvement of product testing in the workplace and in research centers
  • Greater sales through side [global] markets

Create a constant purpose toward improvement.

  • Plan for quality in the long term.
  • Resist reacting with short-term solutions.
  • Don’t just do the same things better; find better things to do.
  • Predict and prepare for future challenges, and always have the goal of getting better.

Adopt the new philosophy.

  • Embrace quality throughout the organization.
  • Put your customers’ needs first, rather than reacting to competitive pressure, and design products and services to meet those needs.
  • Be prepared for a major change in the way business is done. It’s about leading, not simply managing.
  • Create your quality vision, and implement it.

Stop depending on inspections

  • Inspections are costly and unreliable; they don’t improve quality, they merely find a lack of quality.
  • Build quality into the process from start to finish.
  • Don’t just find what you did wrong; eliminate the “wrongs” altogether.
  • Use statistical control methods, not physical inspections alone, to prove that the process is working.

Use a single supplier for any one item.

  • Quality relies on consistency: The less variation you have in the input, the less variation you’ll have in the output.
  • Look at suppliers as your partners in quality. Encourage them to spend time improving their own quality; they shouldn’t compete for your business based on price alone.
  • Analyze the total cost to you, not just the initial cost of the product.
  • Use quality statistics to ensure that suppliers meet your quality standards.

Improve constantly and forever.

  • Continuously improve your systems and processes. Deming promoted the Plan-Do-Check-Act approach to process analysis and improvement.
  • Emphasize training and education so everyone can do their jobs better.
  • Use kaizen as a model to reduce waste and to improve productivity, effectiveness, and safety.

Use training on the job.

  • Train for consistency to help reduce variation.
  • Build a foundation of common knowledge.
  • Allow workers to understand their roles in the “big picture.”
  • Encourage staff to learn from one another, and provide a culture and environment for effective teamwork.

Implement leadership

  • Expect your supervisors and managers to understand their workers and the processes they use.
  • Don’t simply supervise; provide support and resources so that each staff member can do his or her best. Be a coach instead of a policeman.
  • Figure out what each person actually needs to do his or her best.
  • Emphasize the importance of participative management and transformational leadership.
  • Find ways to reach full potential, and don’t just focus on meeting targets and quotas.

Eliminate fear

  • Allow people to perform at their best by ensuring that they’re not afraid to express ideas or concerns.
  • Let everyone know that the goal is to achieve high quality by doing more things right and that you’re not interested in blaming people when mistakes happen.
  • Make workers feel valued, and encourage them to look for better ways to do things.
  • Ensure that your leaders are approachable and that they work with teams to act in the company’s best interests.
  • Use open and honest communication to remove fear from the organization.

Break down barriers between departments.

  • Build the “internal customer” concept: recognize that each department or function serves other departments that use its output.
  • Build a shared vision.
  • Use cross-functional teamwork to build understanding and reduce adversarial relationships.
  • Focus on collaboration and consensus instead of compromise.

Get rid of unclear slogans.

  • Let people know exactly what you want; don’t make them guess. “Excellence in service” is short and memorable, but what does it mean? How is it achieved? The message is clearer in a slogan like “You can do better if you try.”
  • Don’t let words and nice-sounding phrases replace effective leadership. Outline your expectations, and then praise people face-to-face for doing good work.

Eliminate management by objectives

  • Look at how the process is carried out, not just numerical targets. Deming said that production targets encourage high output and low quality.
  • Provide support and resources so that production levels and quality are high and achievable.
  • Measure the process rather than the people behind the process.

Remove barriers to pride of workmanship.

  • Allow everyone to take pride in their work without being rated or compared.
  • Treat workers the same, and don’t make them compete with other workers for monetary or other rewards. Over time, the quality system will naturally raise the level of everyone’s work to an equally high level.

Implement education and self-improvement.

  • Improve the current skills of workers.
  • Encourage people to learn new skills to prepare for future changes and challenges.
  • Build skills to make your workforce more adaptable to change, and better able to find and achieve improvements.

Make “transformation” everyone’s job.

  • Improve your overall organization by having each person take a step toward quality.
  • Analyze each small step, and understand how it fits into the larger picture.
  • Use effective change management principles to introduce the new philosophy and ideas in Deming’s 14 points.

J Juran Quality Philosophies

Joseph Moses Juran (December 24, 1904 – February 28, 2008) was a Romanian-American engineer and management consultant. He was an evangelist for quality and quality management, having written several books on those subjects.

Pareto principle

In 1941, Juran stumbled across the work of Vilfredo Pareto and began to apply the Pareto principle to quality issues (for example, 80% of a problem is caused by 20% of the causes). This is also known as “the vital few and the trivial many.” In later years, Juran preferred “the vital few and the useful many” to signal that the remaining 80% of the causes should not be totally ignored.

For example, he argued that most defects are the result of a small percentage of all the causes of defects, according to The Economist. Similarly, 20% of a team’s members will produce 80% of a project’s successful results, and 20% of a business’s customers will create 80% of the profit.

Juran felt organizations, armed with that knowledge, would focus less on meaningless minutiae and more on identifying the 20%. That means eliminating the 20% of mistakes causing the majority of defects, rewarding the 20% of employees causing 80% of the success and serving the 20% of loyal customers that drive sales. In a way, Pareto’s Principle puts numbers to the idea that in business, as in life, things are not evenly distributed. Pareto was studying land ownership in Italy. But Juran saw that it applied to business, as well.
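Identifying the 20% can be sketched programmatically: rank causes by defect count and accumulate their share until it reaches roughly 80%. The cause names, counts, and the 80% threshold below are all illustrative assumptions.

```python
# Hypothetical Pareto analysis: find the "vital few" causes that account
# for ~80% of defects. Cause names and counts are made up.

def vital_few(defects_by_cause, threshold=0.80):
    """Return causes, in descending order of count, whose cumulative
    share of total defects first reaches the threshold."""
    total = sum(defects_by_cause.values())
    ranked = sorted(defects_by_cause.items(), key=lambda kv: kv[1], reverse=True)
    selected, cumulative = [], 0
    for cause, count in ranked:
        selected.append(cause)
        cumulative += count
        if cumulative / total >= threshold:
            break
    return selected

defects = {"solder bridge": 120, "missing part": 45, "scratch": 20,
           "misalignment": 10, "wrong label": 5}
# Out of 200 defects, the top two causes already cover 165 (82.5%).
print(vital_few(defects))  # ['solder bridge', 'missing part']
```

The same ranking works for customers by profit or employees by contribution; only the dictionary changes.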

Juran Trilogy

“Goal setting has traditionally been based on past performance. This practice has tended to perpetuate the sins of the past.”

In his focus on people and how they work in processes, Juran took a different approach than others working in the growing quality improvement field. In doing so, he completely changed how companies looked at reducing inefficiencies.

Juran found the hidden costs in how companies tended to deal with defects. In the early 20th century, that often meant dealing with the issue after it had occurred rather than focusing time and money on making quality improvements to keep defects from happening.

He developed the Juran Trilogy, which involved three principal areas:

Quality planning: This involves identifying your customers, determining their needs and developing products that respond to their needs.

Quality improvement: Develop a process to create the product and then optimize that process.

Quality control: Create a process that can operate under minimal inspection.

Quality Planning

Quality Planning is the activity of developing the products and processes required to meet customers’ needs. It involves:

  • Establish quality goals
  • Identify the customers: those who will be impacted by the efforts to meet the goals
  • Determine the customers’ needs
  • Develop product features that respond to customers’ needs
  • Develop processes that can produce those product features
  • Establish process controls, and transfer the resulting plans to the operating forces

Quality Improvement

This process is the means of raising quality performance to unprecedented levels (breakthrough). This involves:

  • Establish the quality improvement infrastructure
  • Identify the improvement projects
  • For each project establish a project team with clear responsibility
  • Provide the resource, motivation, and training needed by the team

Quality Control

This process consists of the following steps:

  • Evaluate actual quality performance
  • Compare actual performance to quality goals
  • Act on the difference

Juran’s 10 Steps to Quality Improvement

  • Build awareness of the need and opportunity to improve
  • Set goals for that improvement
  • Create plans to reach the goals
  • Provide training
  • Conduct projects to solve problems
  • Report on progress
  • Give recognition for success
  • Communicate results
  • Keep score
  • Maintain momentum

P Crosby’s Quality Philosophies

Philip Bayard “Phil” Crosby (June 18, 1926 – August 18, 2001) was a businessman and author who contributed to management theory and quality management practices.

Crosby initiated the Zero Defects program at the Martin Company. As the quality control manager of the Pershing missile program, Crosby was credited with a 25 percent reduction in the overall rejection rate and a 30 percent reduction in scrap costs.

The Absolutes of Quality Management

Crosby defined Four Absolutes of Quality Management, which are

  • The First Absolute: The definition of quality is conformance to requirements
  • The Second Absolute: The system of quality is prevention
  • The Third Absolute: The performance standard is zero defects
  • The Fourth Absolute: The measurement of quality is the price of non-conformance

Fourteen Steps to Quality Improvement

  1. Management Commitment

Make it clear that management is committed to quality.

  2. Quality Improvement Teams

Form Quality Improvement Teams with senior representatives from each department.

  3. Measure Processes

Measure processes to determine where current and potential quality problems lie.

  4. Cost of Quality

Evaluate the cost of quality and explain its use as a management tool.

  5. Quality Awareness

Raise the quality awareness and personal concern of all employees.

  6. Correct Problems

Take actions to correct problems identified through previous steps.

  7. Monitor Progress

Establish progress monitoring for the improvement process.

  8. Train Supervisors

Train supervisors to actively carry out their part of the quality improvement program.

  9. Zero Defects Day

Hold a Zero Defects Day to reaffirm management commitment.

  10. Establish Improvement Goals

Encourage individuals to establish improvement goals for themselves and their group.

  11. Remove Fear

Encourage employees to tell management about obstacles to improving quality.

  12. Recognize

Recognize and appreciate those who participate.

  13. Quality Councils

Establish Quality Councils to communicate on a regular basis.

  14. Repeat the Cycle

Do it all over again to emphasize that the quality improvement process never ends.

Zero Defects

Crosby’s Zero Defects is a performance method and standard which states that people should commit themselves to closely monitoring details and avoiding errors. By doing this, they move closer to the zero defects goal. According to Crosby, zero defects was not just a manufacturing principle but an all-pervading philosophy that ought to influence every decision we make. Managerial notions that defects are unacceptable and that everyone should do ‘things right the first time’ are reinforced.

The Quality Vaccine

Crosby explained that this vaccination was the medicine for organizations to prevent poor quality.

  • Integrity: Quality must be taken seriously throughout the entire organization, from the highest levels to the lowest. The company’s future will be judged by the quality it delivers.
  • Systems: The right measures and systems are necessary for quality costs, performance, education, improvement, review, and customer satisfaction.
  • Communication: Communication is a very important factor in an organization. It is required to communicate the specifications, requirements, and improvement opportunities of the organization. Listening to customers and operatives intently and incorporating feedback will give the organization an edge over the competition.
  • Operations: A culture of improvement should be the norm in any organization, and the process should be solid.
  • Policies: Policies that are implemented should be consistent and clear throughout the organization.

ABC analysis in Material Management

In materials management, ABC analysis is an inventory categorization technique. ABC analysis divides an inventory into three categories, “A items” with very tight control and accurate records, “B items” with less tightly controlled and good records, and “C items” with the simplest controls possible and minimal records.

The ABC analysis provides a mechanism for identifying items that will have a significant impact on overall inventory cost, while also providing a mechanism for identifying different categories of stock that will require different management and controls.

The ABC analysis suggests that inventories of an organization are not of equal value. Thus, the inventory is grouped into three categories (A, B, and C) in order of their estimated importance.

‘A’ items are very important for an organization. Because of the high value of these ‘A’ items, frequent value analysis is required. In addition to that, an organization needs to choose an appropriate order pattern (e.g. ‘just-in-time’) to avoid excess capacity. ‘B’ items are important, but of course less important than ‘A’ items and more important than ‘C’ items. Therefore, ‘B’ items are intergroup items. ‘C’ items are marginally important.

Distribution of ABC class

ABC class   Number of items   Total amount required
A           20%               60%
B           20%               20%
C           60%               20%
Total       100%              100%

The ABC concept is based on Pareto’s law. If the inventory list is very large, the ABC analysis can be performed on a sample. After obtaining the random sample, the following steps are carried out for the ABC analysis.

  • Step 1: Compute the annual usage value for every item in the sample by multiplying the annual requirements by the cost per unit.
  • Step 2: Arrange the items in descending order of the usage value calculated above.
  • Step 3: Make a cumulative total of the number of items and the usage value.
  • Step 4: Convert the cumulative total of the number of items and usage values into a percentage of their grand totals.
  • Step 5: Draw a graph connecting cumulative % items and cumulative % usage value. The graph is divided approximately into three segments, where the curve sharply changes its shape. This indicates the three segments A, B and C.
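Steps 1–4 above can be sketched as follows. The item names, quantities, unit costs, and the class cut-offs (top 60% of cumulative usage value as A, next 20% as B, echoing the distribution table) are illustrative assumptions, not fixed rules.

```python
# Sketch of ABC analysis, steps 1-4. Data and cut-offs are hypothetical.

def abc_classify(items, a_cut=0.60, b_cut=0.80):
    """items: list of (name, annual_requirement, unit_cost).
    Returns {name: class} based on cumulative share of annual usage value."""
    # Step 1: annual usage value = annual requirement * cost per unit
    usage = [(name, qty * cost) for name, qty, cost in items]
    # Step 2: arrange in descending order of usage value
    usage.sort(key=lambda x: x[1], reverse=True)
    # Steps 3-4: cumulative usage value as a fraction of the grand total
    total = sum(v for _, v in usage)
    classes, cumulative = {}, 0.0
    for name, value in usage:
        cumulative += value
        share = cumulative / total
        classes[name] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes

stock = [("bearing", 1000, 50), ("motor", 100, 250), ("bolt", 20000, 0.5),
         ("gasket", 5000, 1), ("belt", 500, 8)]
print(abc_classify(stock))  # bearing -> 'A', motor -> 'B', the rest -> 'C'
```

Step 5 (the graph) is then just a plot of cumulative % of items against cumulative % of usage value; the class boundaries fall where the curve changes shape sharply.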

Advantages

  • The ABC method makes sure that the stock turnover ratio is maintained at a comparatively higher level through a systematic control of inventories
  • There is provision to have enough C category stocks to be maintained without compromising on the more important items
  • This method helps businesses to maintain control over the costly items which have large amounts of capital invested in them
  • The storage expenses are cut down considerably with this tool
  • It brings method to the madness of keeping track of inventory. Not only does it reduce unnecessary staff expenses but, more importantly, it ensures optimum levels of stock are maintained at all times

Disadvantages

  • It requires a good system of coding of materials already in operation for this analysis to work
  • For this method to work and render successful results, there must be proper standardization in place for materials in the store
  • Since this analysis considers only the monetary value of items, it ignores other factors that may be more important for your business, so this limitation must be kept in mind

Policies

Item A:

  • These are subjected to strict inventory control and are given highly secured areas in terms of storage
  • These goods have a better forecast for sales
  • These are also the items that require frequent reorders on a daily or a weekly basis
  • They are kept as priority items, and efforts are made to avoid unavailability or stock-out of these items

Item B:

  • These items are not as important as items under section A or as trivial as items categorized under C
  • The important thing to note is that since these items lie between A and C, they are monitored for potential movement toward category A or, in the contrary situation, toward category C

Item C:

  • These items are manufactured or ordered less often; the usual policy is to keep only one unit on hand, or in some cases to reorder only when a purchase is actually made
  • Since these are low-demand goods with a comparatively higher cost risk from excess inventory, it can be acceptable for these items to stock out after each purchase
  • The question managers face with category C items is not how many units to keep in stock but whether these items need to be kept in store at all

GOLF, XYZ, SOS, HML analysis of Material Management

Golf Analysis

The letters stand for Government, Ordinary, Local and Foreign. Government-controlled items are mainly imported items which are canalized through agencies such as the State Trading Corporation (STC), the Minerals and Metals Trading Corporation (MMTC), Indian Drugs and Pharmaceuticals Ltd (IDPL), the Mica Trading Corporation, etc. Such items require special procedures of inventory control which may not be applicable to ordinary items.

G = Government controlled supplies

O = Open market supplies

L = Local supplies

F = Foreign market supplies

XYZ Analysis

It is based on the closing inventory value of different items. Such classification is done every year at the time of annual stock-taking: items with the highest inventory valuation are classified as ‘X’, while those with low investment in them are termed ‘Z’ items.

The remaining items are ‘Y’ items, whose inventory value is neither too high nor too low. This type of analysis is particularly useful in identifying the items requiring maximum care and attention during storage.

SOS Analysis

Raw materials, especially agricultural inputs, are generally classified by the seasonal/off-seasonal system, since prices during the season are generally lower.

Seasonal items which are available only for a limited period should be procured and stocked to meet the needs of the full year. For seasonal items which are available throughout the year, prices are generally lower during the harvest season.

The quantity required of such items should, therefore, be determined after comparing the cost savings on account of lower prices, if purchased during the season, with the higher cost of carrying inventories if purchased throughout the year.

A buying and stocking strategy for seasonal items depends on a large number of factors; this sphere has become increasingly sophisticated, and operational techniques are used to obtain optimum results.

HML Analysis

HML Classification:

The HML classification is similar to the ABC classification, except for the fact that instead of consumption values of items, their unit values are considered. Items are classified on the basis of their unit values into:

H = High value items.

M = Medium value items.

L = Low value items.

This type of analysis is useful for keeping control over materials consumption at the departmental level. For example, gold, which is a high value item, will be classified as H and coal, which is a low value item, will be classified as L.
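The HML rule above reduces to simple thresholding on unit value. In the sketch below, the monetary cut-offs are hypothetical; each organization sets its own.

```python
# Minimal sketch of HML classification: unlike ABC, only the unit value of
# an item matters, not its consumption value. Thresholds are hypothetical.

def hml_classify(unit_value, high=1000, medium=100):
    """Classify an item as H, M, or L by its unit value."""
    if unit_value >= high:
        return "H"
    if unit_value >= medium:
        return "M"
    return "L"

print(hml_classify(5000))  # e.g. a gold fitting -> 'H'
print(hml_classify(2))     # e.g. coal -> 'L'
```

Swapping the unit value for annual consumption value (and the fixed thresholds for cumulative-share cut-offs) turns this into the ABC scheme described earlier.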
