Reforms in Primary Market

Disclosure of All Material Facts is made Compulsory: SEBI has made it compulsory for companies to disclose all the facts and risk factors regarding the projects undertaken by the company. The basis on which the premium amount is calculated should also be disclosed by the company as per SEBI norms. SEBI also prescribes a code of ethics for media advertising regarding the public issue.

Vetting of Offer Document: SEBI vets offer documents to make sure that the company listing the shares has made all disclosures in it. All the guidelines and regulatory measures of capital issues are meant to promote healthy and efficient functioning of the issue market (or the primary market).

Reforms as to Mutual Funds: The Government has now permitted the setting up of private mutual funds and a few have already been set up. UTI has now been brought under the regulatory jurisdiction of SEBI. All mutual funds are allowed to apply for firm allotments in public issues. To improve the scope of investments by mutual funds, the latter are permitted to underwrite public issues. Further, SEBI has relaxed the guidelines for investment in money market instruments. Finally, SEBI has issued fresh guidelines for advertising by mutual funds.

Imposition of Compulsory Deposit on Companies making Public Issues: In order to induce companies to exercise greater care and diligence for timely action in matters relating to public issues of capital, SEBI has advised stock exchanges to collect from companies making public issues a deposit of one per cent of the issue amount, which can be forfeited in case of non-compliance with the provisions of the listing agreement or non-dispatch of refund orders and share certificates by registered post within the prescribed time.

Regulation of Merchant Banking: SEBI has brought merchant banking under its regulatory framework. Merchant bankers are now to be authorized by SEBI, and they now carry a greater degree of accountability for the offer document and the issue process.

Conditions regarding Application Size etc.: SEBI has raised the minimum application size and also the proportion of each issue allowed for firm allotment to institutions such as mutual funds.

Issue of Due Diligence Certificate: The lead managers have to issue a due diligence certificate, which has now been made part of the offer document.

Underwriting made Optional: To reduce the cost of issues in the primary market, SEBI has made underwriting of issues optional. However, the condition remains in force that if an issue is not underwritten and fails to collect 90% of the amount offered to the public, the entire amount collected must be refunded to the investors.

Increasing Popularity of the Private Placement Market: In recent years, the private placement market has become popular with issuers because of the stringent entry and disclosure norms for public issues. Besides the low cost of issuance, ease of structuring investments and the time saved in issuance are the other causes responsible for the rapid growth of the private placement market.

Encouragement to Initial Public Offers: In order to encourage Initial Public Offers (IPOs) in the primary market, SEBI has permitted companies to determine the par value of the shares issued by them. SEBI has also allowed IPO issuers to use “book building”, i.e. to reserve and allot shares to individual investors. But the issuer has to disclose the price, the issue size and the number of securities to be offered to the public.

Deming’s Application Prize

The Deming Application Prize is an annual award presented to a company that has achieved distinctive performance improvements through the application of TQM. Regardless of the types of industries, any organization can apply for the Prize, be it public or private, large or small, or domestic or overseas. Provided that a division of a company manages its business autonomously, the division may apply for the Prize separately from the company.

There is no limit to the number of potential recipients of the Prize each year. All organizations that score the passing points or higher upon examination will be awarded the Deming Application Prize.

The Deming Prize is the longest-running and one of the highest awards for TQM (Total Quality Management) in the world. It recognizes both individuals for their contributions to the field of TQM and businesses that have successfully implemented TQM. It was established in 1951 to honor W. Edwards Deming, who contributed greatly to the proliferation of statistical quality control in Japan after World War II. His teachings helped Japan build the foundation on which the quality of Japanese products came to be recognized as among the highest in the world. The prize was originally designed to reward Japanese companies for major advances in quality improvement. Over the years it has grown, under the guidance of the Japanese Union of Scientists and Engineers (JUSE), to the point where it is now also available to non-Japanese companies, albeit usually ones operating in Japan, and to individuals recognized as having made major contributions to the advancement of quality. The awards ceremony is broadcast every year in Japan on national television.

Two categories of awards are made annually: the Deming Prize for Individuals and the Deming Application Prize.

Companies Qualified for Receiving the Prize

The Deming Application Prize is given to applicant companies or divisions of companies (applicant companies hereafter) that effectively practice TQM suitable to their management principles, industry, business and scope. More specifically, the following evaluation criteria are used for the examination to determine whether or not the applicant companies should be awarded the Prize:

  • Reflecting their management principles, industry, business, scope and business environment, the applicants have established challenging and customer-oriented business objectives and strategies under their clear management leadership.
  • TQM has been implemented properly to achieve the business objectives and strategies mentioned in Item 1 above.
  • As an outcome of Item 2, outstanding results have been obtained for the business objectives and strategies stated in Item 1.

ISO 9000, QS 9000

ISO 9000 is a globally recognized set of quality management standards developed by the International Organization for Standardization (ISO). These standards help organizations establish and maintain an effective quality management system (QMS) to improve efficiency, customer satisfaction, and overall business performance. The ISO 9000 series is applicable to companies of all sizes and industries, ensuring that products and services meet regulatory and customer requirements.

What is ISO 9000?

ISO 9000 refers to a series of international standards that define the principles and guidelines for implementing a Quality Management System (QMS). The primary focus of ISO 9000 is customer satisfaction, process improvement, and continuous quality enhancement.

Key Elements of ISO 9000:

  1. Standardized QMS Framework: Provides guidelines for an effective quality management system.
  2. Process-Oriented Approach: Focuses on optimizing business processes to improve efficiency.
  3. Continuous Improvement: Encourages ongoing enhancements in quality practices.
  4. Customer Satisfaction: Ensures that customer needs and expectations are met consistently.
  5. Compliance with Regulations: Helps organizations meet legal and regulatory requirements.

ISO 9000 Family of Standards

ISO 9000 series includes multiple standards, each serving a specific purpose in quality management:

A. ISO 9000:2015 – Fundamentals and Vocabulary

  • Defines the basic concepts, principles, and terminologies related to quality management.
  • Provides a foundational understanding of QMS requirements.

B. ISO 9001:2015 – Quality Management System Requirements

  • The most widely used standard in the ISO 9000 family.
  • Specifies the requirements for establishing, implementing, maintaining, and improving a QMS.
  • Organizations can obtain ISO 9001 certification to demonstrate compliance with quality standards.

C. ISO 9004:2018 – Quality Management for Sustainable Success

  • Provides guidelines for achieving long-term quality improvement and business success.
  • Focuses on stakeholder satisfaction beyond customer needs.

D. ISO 19011:2018 – Guidelines for Auditing Management Systems

  • Offers guidance on internal and external audits for quality management systems.
  • Helps organizations conduct effective audits to ensure compliance and improvement.

Principles of ISO 9000

ISO 9000 is built on seven key quality management principles that guide organizations in implementing a strong QMS:

1. Customer Focus

  • The primary goal of quality management is to meet customer requirements and enhance satisfaction.
  • Organizations must understand customer needs and exceed expectations.

2. Leadership

  • Strong leadership is essential for setting clear objectives and ensuring employee engagement in quality initiatives.
  • Leaders must create a culture of continuous improvement.

3. Engagement of People

  • Employee involvement is critical to quality improvement.
  • Organizations should encourage teamwork, training, and skill development.

4. Process Approach

  • Identifying and managing interrelated processes improves efficiency and consistency.
  • A structured approach leads to better quality control.

5. Continuous Improvement

  • Organizations must adopt a mindset of ongoing improvement in products, services, and processes.
  • Regular performance evaluations help identify areas for enhancement.

6. Evidence-Based Decision Making

  • Quality management should be driven by data, facts, and analysis rather than assumptions.
  • Organizations must use performance metrics to improve decision-making.

7. Relationship Management

  • Maintaining strong relationships with suppliers, stakeholders, and customers ensures long-term success.
  • Organizations should work collaboratively to enhance quality outcomes.

Benefits of ISO 9000 Certification

Achieving ISO 9001 certification offers several advantages to organizations:

A. Operational Efficiency

  • Helps streamline processes, reducing inefficiencies and waste.
  • Enhances productivity through a structured QMS framework.

B. Improved Product and Service Quality

  • Ensures that products and services consistently meet customer requirements.
  • Reduces defects, rework, and customer complaints.

C. Increased Customer Satisfaction

  • A customer-centric approach enhances trust and loyalty.
  • Meeting quality expectations leads to positive brand reputation.

D. Global Market Access

  • ISO 9001 certification is recognized internationally, enabling businesses to expand globally.
  • Many clients and governments require suppliers to be ISO certified.

E. Regulatory Compliance

  • Helps organizations comply with industry regulations and legal requirements.
  • Reduces the risk of fines, penalties, and legal disputes.

F. Competitive Advantage

  • Certified organizations gain a competitive edge over non-certified businesses.
  • Customers prefer companies that follow standardized quality management practices.

Steps to Implement ISO 9001:2015

Organizations must follow a systematic approach to implement ISO 9001:2015 effectively:

Step 1: Understanding Requirements

  • Familiarize yourself with ISO 9001:2015 clauses and principles.
  • Assess current quality management practices.

Step 2: Management Commitment

  • Leadership must support and allocate resources for implementation.
  • Appoint a Quality Manager to oversee the process.

Step 3: Documentation and QMS Development

  • Develop a Quality Manual outlining policies, objectives, and processes.
  • Document work instructions and standard operating procedures (SOPs).

Step 4: Employee Training and Awareness

  • Educate employees about ISO 9001 principles and their role in maintaining quality.
  • Conduct workshops and quality control training programs.

Step 5: Implementation and Process Control

  • Apply documented processes in daily operations.
  • Monitor and control quality metrics to ensure compliance.

Step 6: Internal Audits

  • Conduct regular audits to evaluate QMS effectiveness.
  • Identify non-conformities and take corrective actions.

Step 7: Certification Audit

  • Hire an accredited certification body to assess compliance.
  • If requirements are met, the organization receives ISO 9001 certification.

Step 8: Continuous Improvement

  • Regularly review performance and update quality objectives.
  • Implement corrective and preventive actions for ongoing improvement.

Challenges in ISO 9000 Implementation

  1. High Initial Costs: Setting up a QMS requires investment in training, audits, and documentation.
  2. Employee Resistance: Some employees may resist changes to established processes.
  3. Time-Consuming Process: Implementation and certification take several months.
  4. Ongoing Maintenance: Continuous monitoring and audits are required to sustain certification.

Kepner-Tregoe Methodology of Problem Solving

Kepner-Tregoe is used for decision making:

  • It is a structured methodology for gathering information and prioritizing and evaluating it.
  • It is a very detailed and complex method applicable in many areas, much broader than just idea selection.
  • It is also called a root cause analysis and decision-making method.
  • It is a step-by-step approach for systematically solving problems, making decisions, and analyzing potential risks.

Assess the situation (situation appraisal)

  • Identify concerns (problems) by listing them
  • Separate the level of concern (importance, magnitude, level of influence)
  • Set the priority level to measure seriousness of impacts (influence), urgency and growth potential
  • Decide what action to take next (step by step approach)
  • Plan for who is involved, what they will be doing, where and when they will be involved, and the extent of their involvement (magnitude)

Make a decision (a choice between two or more alternatives)

  • Identify what is being decided
  • Establish and classify objectives (main ones, minor ones,..)
  • Separate the objectives into must-have and want (nice-to-have) categories; assign importance factors from 1-10 to the want objectives, where 10 is the most important, and assign criterion ratings (weights)
  • Generate the alternatives (we can do it that way or we can take another way as well)
  • Evaluate the alternatives by scoring them against the want objectives
  • Review adverse (harmful) consequences of your corrective steps (risk evaluation, risk assessment)
  • Make the best possible choice what to do
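The must/want screening and weighted scoring described above can be sketched in a few lines of Python. The supplier names, weights and scores below are illustrative assumptions, not part of the Kepner-Tregoe material itself:

```python
# Minimal sketch of Kepner-Tregoe decision analysis scoring.
# Alternatives, weights and scores here are made-up examples.

def kt_decision(alternatives, musts, wants):
    """Screen alternatives against MUST objectives, then rank the
    survivors by their weighted WANT scores (weights 1-10)."""
    results = {}
    for name, attrs in alternatives.items():
        # Any failed MUST eliminates the alternative outright.
        if not all(attrs[m] for m in musts):
            continue
        # Weighted score = sum of (importance weight * satisfaction score).
        results[name] = sum(w * attrs[want] for want, w in wants.items())
    # Best possible (balanced) choice = highest weighted total.
    return max(results, key=results.get), results

alternatives = {
    "Supplier A": {"meets_spec": True,  "price": 7, "lead_time": 9},
    "Supplier B": {"meets_spec": True,  "price": 9, "lead_time": 5},
    "Supplier C": {"meets_spec": False, "price": 10, "lead_time": 10},
}
musts = ["meets_spec"]                 # go/no-go criteria
wants = {"price": 8, "lead_time": 5}   # importance weights (1-10)

best, scores = kt_decision(alternatives, musts, wants)
print(best, scores)  # Supplier C never scores: it fails a MUST objective
```

Note how the MUST screen runs before any scoring, so a high-scoring alternative that fails a go/no-go criterion can never win.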

Anticipate potential problems and opportunities (potential problem/opportunity analysis): solutions

  • State the action
  • List the potential opportunities (op1, op2, …, opN)
  • Consider the possible solutions (e.g. the second one)
  • Take the action to address the likely cause/solution
  • Prepare actions to enhance likely (possible) effects

Uncover and handle problems (problem analysis)

  • State the problem (definition and description of the problem)
  • Specify the problem by asking what is and what is not
  • Develop possible causes of the problem
  • Test and verify possible causes
  • Determine the most probable cause (root cause)
  • Verify any assumptions
  • Apply the best possible solution and monitor the situation after the corrective step has been applied

Problem Analysis

Confirm True Cause

  • What can be done to verify any assumptions made?
  • How can this cause be observed at work?
  • How can we demonstrate the cause-and-effect relationship (e.g. Current Reality Tree or Ishikawa Fishbone Diagram)?
  • When corrective action is taken, how will its effect be confirmed?

Lean Thinking

Lean thinking is a business methodology that aims to provide a new way to think about how to organize human activities to deliver more benefits to society and value to individuals while eliminating waste. The term “lean thinking” was coined by James P. Womack and Daniel T. Jones to capture the essence of their in-depth study of Toyota’s fabled Toyota Production System. Lean thinking is a way of thinking about an activity and seeing the waste inadvertently generated by the way the process is organized. It uses the concepts of:

  • Value
  • Value streams
  • Flow
  • Pull
  • Perfection

In the business world, lean thinking has evolved in two different directions:

  • Lean thinking converts, who keep seeking to understand how to achieve dynamic gains rather than static efficiencies. For this group of thinkers, lean thinking continuously evolves as they seek to better understand the possibilities of the way opened up by Toyota, having grasped the fact that the aim of continuous improvement is continuous improvement. Lean thinking as such is a movement of practitioners and writers who experiment and learn in different industries and conditions, applying lean thinking to any new activity.
  • Lean manufacturing adepts who have interpreted the term “lean” as a form of operational excellence and have turned to company programs aimed at taking costs out of processes. Lean activities are used to improve processes without ever challenging the underlying thinking, with powerful low-hanging fruit results but little hope of transforming the enterprise as a whole. This “corporate lean” approach is fundamentally opposed to the ideals of lean thinking, but has been taken up by a great number of large businesses seeking to cut their costs without challenging their fundamental management assumptions.

Lean thinking would challenge line managers to look differently at their own jobs by focusing on:

  • The workplace: Going and seeing first hand work conditions in practice, right now, and finding out the facts for oneself rather than relying on reports and boardroom meetings. The workplace is also where real people make real value, and going to see is a mark of respect and the opportunity to support employees in adding value through their ideas and initiative, more than merely making value through prescribed work. The management revolution brought by lean thinking can be summed up by describing jobs in terms of Job = Work + Kaizen.
  • Value through built-in quality: Understanding that customer satisfaction is paramount and is built-in at every step of the enterprise’s process, from building in satisfying features (such as peace of mind) to correctly building in quality at every production step. Built-in quality means to stop at every doubtful part and to train yourself and others not to pass on defective work, not to do defective work and not to accept defective work by stopping the process and reacting immediately whenever things go wrong.
  • Value streams through understanding “takt” time: By calculating the ratio of open production time to averaged customer demand one can have a clear idea of the capacity needed to offer a steady flow of products. This “takt” rhythm, be it a minute for cars, two months for software projects or two years for a new book leads to creating stable value streams where stable teams work on a stable set of products with stable equipment rather than optimize the use of specific machines or processes. Takt time thinking leads to completely different capacity reasoning than traditional costing and is the key to far more frugal processes.
  • Flow through reducing batch sizes: Every traditional business, whether in production or services, is addicted to batch. The idea is that once work is set up one way, we’d better get on and quickly make as many pieces of work as we can to keep the unit cost down. Lean thinking looks at this differently in trying to optimize the flow of work in order to satisfy real demand now, not imaginary demand next month. By working strenuously on reducing change-over time and difficulty, it is possible to approach the lean thinking ideal of single piece flow. In doing so, one reduces dramatically the general cost of the business by eliminating the need for warehouses, transports, systems, subcontractor use and so on.
  • Pull to visualize takt time through the flow: pulling work from upstream at takt time through visual devices such as Kanban cards is the essential piece that enables lean thinkers to visualize the gaps between the ideal and the actual at the workplace at any time. Pull is what creates a creative tension in the workplace by both edging closer to single-piece-work and by highlighting problems one at a time as they occur so complex situations can be resolved piecemeal. Pull is the basic technique to “lean” the company and, by and large, without pull there is no lean thinking.
  • Seeking perfection through kaizen: The old time sensei used to teach that the aim of lean thinking was not to apply lean tools to every process, but to develop the kaizen spirit in every employee. Perfection is not sought through better, more clever systems or go-it-alone heroes but through a commitment to improve things together step-by-small-step. Kaizen literally means change for the better and Kaizen spirit is about seeking a hundred 1% improvements from everyone every day everywhere rather than one 100% leap forward. The practice of kaizen is what anchors deep lean thinking in people’s minds and which, ultimately, leads to complete transformation. Practising kaizen together builds self-confidence and the collective confidence that we can face our larger challenges and solve our problems together.
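The takt calculation mentioned above, the ratio of open production time to averaged customer demand, is simple enough to sketch directly. The shift and demand figures below are assumptions for illustration:

```python
# Sketch: takt time = available (open) production time / average customer demand.
# The shift and demand figures are illustrative assumptions.

def takt_time(open_time_minutes, units_demanded):
    """Minutes available per unit when production is paced to demand."""
    return open_time_minutes / units_demanded

# e.g. one 8-hour shift minus two 15-minute breaks = 450 open minutes,
# against an averaged daily demand of 450 units:
print(takt_time(450, 450))  # 1.0 minute per unit, the "a minute for cars" rhythm
```

Capacity is then sized to this rhythm rather than to the speed of any individual machine, which is why takt thinking leads to different capacity reasoning than traditional costing.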

Advantages

Worker Satisfaction

Implementing lean principles in your company requires input and participation from your production staff. They are often in the best position to see where waste and inefficiency occur. Not only do they serve as a resource for you, but employees also usually respond positively to sincere efforts to involve them in improvement processes. When they see their suggestions and ideas incorporated, a sense of ownership and satisfaction about their contribution is more likely to follow.

Eliminates Waste

Lean principles aim to minimize all forms of waste, from sources as varied as material defects to worker ergonomics. Many sources of waste are easy to identify and correct, such as a machine that is out of adjustment, producing a high volume of defects. Other forms of waste include environmental conditions that impede worker efficiency. Better lighting may help a worker read production instructions; moving a file cabinet might eliminate wasted time for a clerk.

Just in Time

JIT is a strategy that suggests large inventories are wasteful of company resources. Business equity tied up in inventories of raw and finished goods interferes with cash flow. Money is also saved through reduced warehousing needs. The perfect JIT scenario would have the raw materials purchased and delivered at the moment production needs them, and the finished product is sold and delivered the moment it comes off the line. While this scenario may be impossible, lean philosophy suggests making improvements toward the ideal.
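The cash-flow argument above can be made concrete with a rough carrying-cost calculation: annual carrying cost is approximately average inventory value multiplied by a carrying-cost rate. The figures below are assumptions purely for illustration:

```python
# Illustrative sketch of why JIT frees up working capital.
# annual carrying cost ≈ average inventory value * carrying-cost rate
# (rate covers warehousing, capital tied up, insurance, obsolescence).
# All figures below are assumed, not from the text.

def annual_carrying_cost(avg_inventory_value, carrying_rate):
    return avg_inventory_value * carrying_rate

before = annual_carrying_cost(1_000_000, 0.25)  # large, batch-driven stock
after = annual_carrying_cost(200_000, 0.25)     # leaner JIT stock level
print(before - after)  # 200000.0 saved per year under these assumptions
```

Even with these rough numbers, shrinking average inventory toward the JIT ideal translates directly into cash no longer tied up in stock and warehousing.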

Competitive Advantage

Beyond simply reducing costs and improving efficiency, lean production techniques introduce systems and develop skills with your staff that support changes in the workplace that new sales create. Space saved on warehousing may be used to add new product lines. The same is true of time savings. Your staff can absorb new work and react quickly to changes in client demand. Producing work quickly, in short iterations, without waste and delivered on time enhances your advantage over your competition.

Disadvantages

Low Margin for Error

JIT principles work best with stable system components. Delivery times for raw and finished goods are known, and the elements of production can be scheduled accordingly. Being overly aggressive with JIT scheduling leaves you vulnerable to systemic bottlenecks. Supplier delivery issues may cut off your raw materials, interrupting your production flow. Maintenance emergencies can reduce your production throughput. Any constraint not accounted for in your JIT planning potentially jeopardizes the entire system. Margin for error and system waste may be difficult to balance.

New Inefficiencies

Lean techniques can be overused. When tracking of productivity and waste starts to impact the time used for production, the solution becomes the problem. When lean principles are first applied, you can expect larger returns than later down the road. It is tempting to push those expectations, but you must examine the value of improvements. If you refine throughput to 1,000 parts an hour in one section that you can supply with only 500 parts from a previous stage, you haven’t improved your result.

Worker Frustration

When a certain level of refinement is met, using lean methods to squeeze more economy from production can discourage workers, reversing positive motivation and undermining your leadership. Trends of backsliding in previous improvements may indicate worker resentment. Striking a balance between stasis and continuous improvement is a challenge in any lean environment. A small business may be more prone to reaching such a refinement because of its less complex nature. Be aware of how incorporated changes affect your staff to gauge how effective further pushes will be.

Malcolm Baldrige National Quality Award (MBNQA)

The Baldrige Excellence Framework has three parts:

(1) The criteria for performance excellence

(2) Core values and concepts

(3) Scoring guidelines.

The framework serves two main purposes:

(1) To help organizations assess their improvement efforts, diagnose their overall performance management system, and identify their strengths and opportunities for improvement

(2) To identify Baldrige Award recipients that will serve as role models for other organizations.

The Malcolm Baldrige National Quality Award (MBNQA) is an award established by the U.S. Congress in 1987 to raise awareness of quality management and recognize U.S. companies that have implemented successful quality management systems. The award is the nation’s highest presidential honor for performance excellence.

Purpose of the Malcolm Baldrige Award

  • Raise awareness about the importance of performance excellence.
  • Recognize companies that show performance excellence and pass on this information to other organizations to tailor it for their own needs.
  • Motivate U.S. companies and organizations to improve their quality standards and strive for excellence.
  • Help companies and organizations embody the competitive spirit and drive the U.S. economy forward.

Three MBNQA awards can be given annually in each of six categories:

  • Manufacturing
  • Service Company
  • Small Business
  • Education
  • Healthcare
  • Non-profit

The criteria for performance excellence are based on a set of core values:

  • Systems perspective
  • Visionary leadership
  • Customer-focused excellence
  • Valuing people
  • Organizational learning and agility
  • Focus on success
  • Managing for innovation
  • Management by fact
  • Societal responsibility
  • Ethics and transparency
  • Delivering value and results

The questions that make up the criteria represent seven aspects of organizational management and performance.

Organizations that apply for the MBNQA are judged by an independent board of examiners. Recipients are selected based on achievement and improvement in these seven areas, known as the Baldrige Criteria for Performance Excellence:

  • Leadership: How upper management leads the organization, and how the organization leads within the community.
  • Strategy: How the organization establishes and plans to implement strategic directions.
  • Customers: How the organization builds and maintains strong, lasting relationships with customers.
  • Measurement, analysis, and knowledge management: How the organization uses data to support key processes and manage performance.
  • Workforce: How the organization empowers and involves its workforce.
  • Operations: How the organization designs, manages, and improves key processes.
  • Results: How the organization performs in terms of customer satisfaction, finances, human resources, supplier and partner performance, operations, governance and social responsibility, and how the organization compares to its competitors.

The three sector-specific versions of the Baldrige framework are revised every two years:

  • Baldrige Excellence Framework (Business/Nonprofit)
  • Baldrige Excellence Framework (Education)
  • Baldrige Excellence Framework (Health Care)

Sigma features, Enablers, Goals, DMAIC/DMADV

Sigma features

Inventory management plays two critical roles in Lean Six Sigma: first, the management of raw materials and semi-finished goods in the lean manufacturing process; second, inventory control of finished goods held in a warehouse by manufacturers. In both cases, you should keep just a sufficient stock level so as to minimize inventory carrying costs.

As a component of lean manufacturing, stock levels of raw materials and semi-finished products need to be controlled and managed. Similar to Japanese kaizen and Just-in-Time principles, just the right amount of stock should be held to keep production schedules going. This will reduce waste and optimize the production workflow.

For finished goods, excess and obsolete inventory are major business issues, often involving deadstock and seasonality. A typical solution addresses the excess inventory itself: you might sell it below cost or donate it, for example. But this does not address the root cause. A lean approach instead aims to:

  • Identify the value that a company will get from Lean Inventory Management.
  • Optimize the flow of inventory through the business by removing obstacles in the way. This comes from the Japanese 5S Lean principle (Sort, Straighten, Sweep, Standardize, Sustain).
  • Move inventory only when requested by the customer. This is adapted from the Kanban Lean principle.
  • Be flexible and adapt to change. This is influenced by the Kaizen Lean principle.
  • Continuously refine your inventory management processes to improve quality, cycle time, efficiency and cost. This is derived from the DMAIC Six Sigma methodology.

Principles

  • Demand management. You should only move inventory upon an order by a customer.
  • Cost and waste reduction. But not to the extent of negatively affecting the customer.
  • Process standardization. Standardizing, for example, on transportation and business processes.
  • Industry standardization. Standardizing on product parts and components.
  • Cultural change. Everyone along the supply chain must work as a team. Similarly, this echoes principles from Just-in-Time manufacturing.
  • Cross-organization collaboration. Teams that cut across the organization can help to understand value better, providing a holistic view similar to that of the customer.

Enablers Goals

DMAIC/DMADV

DMAIC

Define: the problem, the goal, and the reason the issue needs to be resolved.

Measure: the current state as a baseline and use it as a starting point for improvement.

Analyze: the root cause; identify it with data-driven tools and validate why the issue is happening.

Improve: here you identify creative solutions to eliminate the major root causes, so the problem is fixed and similar issues are prevented in future.

Control: here you maintain the improvements and sustain the success of those new improvements.

DMADV

DMADV is the acronym for the framework of Design for Six Sigma (DFSS), which is used when developing a brand-new service or product within a business. The acronym stands for Define, Measure, Analyze, Design, and Verify.

Define: State the goal of the new product or service, set realistic and measurable targets, and explain why it is needed.

Measure: Identify the factors that are critically important; this should include all parameters, including risks, as well as the production process and product capability.

Analyze: Develop design alternatives, work through different combinations and outcomes, and select the components that work best.

Design: A detailed prototype is developed here. As a more detailed version is built, errors may make it necessary to modify the current design.

Verify: In this final step, the newly designed product is tested in the real world to see whether it works as intended. Several production runs may be necessary to confirm that the quality is as high as it can be.

DMAIC and DMADV do have a number of similarities that are worth noting. They both use statistical tools and facts in order to find solutions to common quality-related problems and focus on reaching the business and financial goals of an organization. DMAIC and DMADV are implemented by Green Belts, Black Belts and Master Black Belts and are used to reduce defects to fewer than 3.4 per million opportunities, or Six Sigma. Their solutions are data intensive and based only on hard facts.
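The 3.4-defects-per-million figure can be related to process data with a short calculation. The sketch below is a minimal illustration (the defect counts are invented), using the conventional 1.5-sigma shift to convert process yield into a short-term sigma level:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    yield_fraction = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# Hypothetical data: 7 defects found in 500 units, 4 opportunities per unit.
d = dpmo(7, 500, 4)
print(round(d))                     # 3500 DPMO
print(round(sigma_level(d), 2))     # about 4.2 sigma
```

Plugging in 3.4 DPMO recovers a sigma level of about 6.0, which is where the "Six Sigma" name comes from.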

The two most widely used Six Sigma methodologies are DMAIC and DMADV.  Both methods are designed so a business process will be more efficient and effective. While both of these methodologies share some important characteristics, they are not interchangeable and were developed for use in differing business processes.  Before comparing these two approaches in more detail, let’s review what the acronyms stand for.

  • DMAIC: Define, Measure, Analyze, Improve, Control
  • DMADV: Define, Measure, Analyze, Design, Verify

Despite the shared first three letters of their names, there are some notable differences between them. The main difference exists in the way the final two steps of the process are handled. With DMADV, the Design and Verify steps deal with redesigning a process to match customer needs, as opposed to the Improve and Control steps that focus on determining ways to readjust and control the process. DMAIC typically defines a business process and how applicable it is; DMADV defines the needs of the customer as they relate to a service or product.

With regards to measurement, DMAIC measures current performance of a process while DMADV measures customer specifications and needs. Control systems are established with DMAIC in order to keep check on the business’ future performance, while with DMADV, a suggested business model must undergo simulation tests to verify efficacy.

DMAIC concentrates on making improvements to a business process in order to reduce or eliminate defects; DMADV develops an appropriate business model destined to meet the customers’ requirements.

TAGUCHI’s Quality Engineering

The Taguchi method of quality control is an approach to engineering that emphasizes the roles of research and development (R&D), product design and development in reducing the occurrence of defects and failures in manufactured goods.

This method, developed by Japanese engineer and statistician Genichi Taguchi, considers design to be more important than the manufacturing process in quality control, aiming to eliminate variances in production before they can occur.

Taguchi methods are statistical methods, sometimes called robust design methods, developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering, biotechnology, marketing and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi’s development of designs for studying variation, but have criticized the inefficiency of some of Taguchi’s proposals.

Taguchi’s work includes three principal contributions to statistics:

  • A specific loss function
  • The philosophy of off-line quality control
  • Innovations in the design of experiments

The Taguchi method gauges quality as a calculation of loss to society associated with a product. In particular, loss in a product is defined by variations and deviations in its function as well as detrimental side effects that result from the product.

Loss from variation in function is a comparison of how much each unit of the product differs in the way it operates. The greater that variance, the more significant the loss in function and quality. This could be represented as a monetary figure denoting how usage has been impacted by defects in the product.

Taguchi’s use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher’s methods in the design of experiments, Taguchi interpreted Fisher’s methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher’s work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as where we deny that losses exist. As we diverge from nominal, losses grow until the point where losses are too great to deny and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

  • Larger the better (for example, agricultural yield);
  • Smaller the better (for example, carbon dioxide emissions); and
  • On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons:

  • It is the first “symmetric” term in the Taylor series expansion of real analytic loss-functions.
  • Total loss is measured by the variance. For uncorrelated random variables, as variance is additive the total loss is an additive measurement of cost.
  • The squared-error loss function is widely used in statistics, following Gauss’s use of the squared-error loss function in justifying the method of least squares.
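For the on-target case, the squared-error loss can be written as L(y) = k(y − m)², where m is the target and k converts squared deviation into cost. Because the loss is quadratic, the average loss per part works out to k times (variance plus squared off-centredness), so both spread and a shifted mean contribute to the loss to society. The sketch below uses invented numbers for the target, the loss coefficient, and the measurements:

```python
from statistics import mean, pvariance

# Hypothetical scaling: suppose a deviation of 0.5 mm from nominal costs
# $4 in rework, so k = 4 / 0.5**2 = 16 dollars per mm^2.
m = 10.0    # target dimension in mm (hypothetical)
k = 16.0    # loss coefficient in $/mm^2 (hypothetical)
parts = [9.8, 10.1, 10.0, 9.9, 10.2, 10.05]

# Average loss per part = k * (variance + (mean - target)^2).
avg_loss = k * (pvariance(parts) + (mean(parts) - m) ** 2)

# Check against the direct per-part computation of k * (y - m)^2.
per_part = [k * (y - m) ** 2 for y in parts]
assert abs(avg_loss - mean(per_part)) < 1e-6
print(round(avg_loss, 3))   # 0.273 ($ per part, on these invented numbers)
```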

Taguchi’s rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation of the final product quality is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

  • System design
  • Parameter (measure) design
  • Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi’s radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Robust parameter designs consider controllable and uncontrollable noise variables; they seek to exploit relationships and optimize settings that minimize the effects of the noise variables.
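In practice, parameter design often compares control-factor settings using a signal-to-noise (S/N) ratio computed over the noise-array replications, preferring the setting with the higher ratio. The sketch below implements two of Taguchi's standard S/N formulas with invented response data:

```python
import math

def sn_smaller_the_better(ys):
    """S/N = -10 log10(mean(y^2)); higher is better."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

def sn_nominal_the_best(ys):
    """S/N = 10 log10(mean^2 / variance) for the on-target case."""
    n = len(ys)
    m = sum(ys) / n
    var = sum((y - m) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(m * m / var)

# Hypothetical responses for two control-factor settings, each replicated
# across the outer (noise) array:
setting_a = [1.9, 2.1, 2.0, 2.05]
setting_b = [1.5, 2.6, 1.8, 2.3]

# Setting A varies less under noise, so its nominal-the-best S/N is higher.
print(sn_nominal_the_best(setting_a) > sn_nominal_the_best(setting_b))  # True
```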

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.

Management of interactions

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for “control factors” or factors in the “inner array”. By combining an inner array of control factors with an outer array of “noise factors”, Taguchi’s approach provides “full information” on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.

Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a “confirmation experiment” offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the “likelihood” of control factor-by-control factor interactions is greatly reduced, since “energy” is “additive”.

Inefficiencies of Taguchi’s designs

Interactions are part of the real world. In Taguchi’s arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the “sequential assembly” of designs: In the RSM approach, a screening design is followed by a “follow-up design” that resolves only the confounded interactions judged worth resolution. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi’s designs.

Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss to society, techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.

Ishikawa Fish Bone, Applications in Organizations

Ishikawa diagrams (also called fishbone diagrams, herringbone diagrams, cause-and-effect diagrams, or Fishikawa) are causal diagrams created by Kaoru Ishikawa that show the potential causes of a specific event.

The fishbone diagram or Ishikawa diagram is a cause-and-effect diagram that helps managers to track down the reasons for imperfections, variations, defects, or failures.

The diagram looks just like a fish’s skeleton with the problem at its head and the causes for the problem feeding into the spine. Once all the causes that underlie the problem have been identified, managers can start looking for solutions to ensure that the problem doesn’t become a recurring one.

The fishbone diagram can also be used in product development. A product that solves a real problem is likely to be popular, provided people care about the problem you’re trying to solve. The fishbone diagram strives to pinpoint everything that’s wrong with current market offerings so that you can develop an innovation that doesn’t have these problems.

Common uses of the Ishikawa diagram are product design and quality defect prevention to identify potential factors causing an overall effect. Each cause or reason for imperfection is a source of variation. Causes are usually grouped into major categories to identify and classify these sources of variation.

Advantages

  • Highly visual brainstorming tool which can spark further examples of root causes
  • Quickly identify if the root cause is found multiple times in the same or different causal tree
  • Allows one to see all causes simultaneously
  • Good visualization for presenting issues to stakeholders

Disadvantages

  • Complex defects might yield a lot of causes which might become visually cluttering
  • Interrelationships between causes are not easily identifiable

Variation = Imperfection

When it comes to quality and efficiency, variation is your enemy. Whatever your business is, you don’t want to leave anything up to chance. From the moment your client contacts you, a predictable process should be followed with its aim being complete customer satisfaction. Variation in the process will mean variation in the product.

Fishbone diagrams help you to determine the variables that may enter the equation. They allow you to make your plans so that you know how to deal with them in such a way that the quality of your final product is still up to standard and without significant variation.

Applications

  • To analyze and find the root cause of a complicated problem
  • When there are many possible causes for a problem
  • If the traditional way of approaching the problem (trial and error, trying all possible causes, and so on) is very time consuming
  • The problem is very complicated and the project team cannot identify the root cause

Construct a Fishbone diagram

Here are the various tasks involved in constructing a Fishbone diagram:

  1. Define the problem
  2. Brainstorm
  3. Identify causes

Define the problem

The first step is fairly simple and straightforward. You have to define the problem for which the root cause has to be identified. Usually the project manager or technical architect (we will refer to this role as the leader throughout the rest of this article) decides which problem to brainstorm. The leader has to choose problems that are critical, that need a permanent fix, and that are worth brainstorming with the team. The leader can moderate the whole process.

After the problem is identified, the leader can start constructing the Fishbone diagram. Using a sheet of paper, she defines the problem in a square box on the right side of the page. She draws a straight line from the left to the problem box with an arrow pointing towards the box. The problem box now becomes the fish head, and its bones are laid out in further steps. At the end of the first step, the Fishbone diagram is just this arrow and the problem box.

Brainstorm

People have difficulty understanding how to structure the thought process around a large problem domain. Sometimes it is useful to focus on logically related items of the problem domain and to represent them in the Fishbone diagram, which will convey the problem solving methodology. There are quite a few tools available that can help us in this regard, including:

  • Affinity Chart
    Organizes facts, opinions, ideas, and issues into a natural grouping. This grouping is in turn used as an aid in diagnosing complex problems.
  • Brainstorming
    Gathers ideas from people who are potential contributors. This process is discussed further in the following sections.
  • Check sheet
    Acts as a simple data recording device that helps to delineate important items and characteristics to direct attention to them and verify that they are evaluated.
  • Flow charts
    Organizes information about a process in a graphical manner and makes it clear who is impacted at every stage.

No single methodology is applicable to all problem domains. Based on experience and study, you can identify, thoroughly analyze, and maintain the methodology and the related problem domains. In the example given later in this article, we use brainstorming as the problem solving methodology.

Categorize

When you apply the Fishbone technique to business problems, the possible causes are usually classified into six categories:

  • Method
  • Man
  • Management
  • Measurement
  • Material
  • Machine

Though the above are a few important problem categories, during the brainstorming session, the team is encouraged to come up with all possible categories. The above categories give the team direction to help find the possible causes. Some of the categories listed above may or may not be applicable to software or to Domino in particular. Let’s look briefly at each category.

  • Method: Methods are ways of doing things or the procedures followed to accomplish a task. A typical cause under the Method category is that instructions were not followed or the instructions themselves were wrong.
  • Man: People are responsible for the problem. The problem may have been caused by people who are inexperienced, who cannot answer prompted questions, and so on.
  • Management: Management refers to project management; poor management decisions, such as upgrading two components simultaneously rather than deploying changes serially, may cause technical problems.
  • Measurement: Measurement refers to metrics that are derived from a project. Problems may occur if measurements are wrong or the measurement technique used is not relevant.
  • Material: Material basically refers to a physical thing. A bad diskette is one typical example. Software can’t always handle errors caused by bad material, for instance a bad backup tape, so while material may be the least likely cause, it is a possible cause.
  • Machine: A machine in software usually refers to the hardware, and there are many ways a problem can be due to the machine, such as performance issues.

Other possible categories include policies, procedure, technology, and so on.

After identifying a problem, the leader initiates a discussion with the project team to gather information about the possible causes, finally arriving at the root cause. The team can either analyze each of the above categories for possible causes or look into other categories (not listed above).
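As a minimal illustration, the six categories can be held in a simple data structure and rendered as a text outline of the diagram. The problem and all of the causes below are hypothetical:

```python
# A fishbone captured as a mapping from category (major bone) to a list
# of causes (bonelets). Every entry here is an invented example.
fishbone = {
    "problem": "Server response time degraded",
    "causes": {
        "Method":      ["Upgrade instructions not followed"],
        "Man":         ["New administrator unfamiliar with tuning"],
        "Management":  ["Two components upgraded simultaneously"],
        "Measurement": ["Latency measured at the wrong tier"],
        "Material":    ["Corrupt backup tape used for a restore"],
        "Machine":     ["Undersized disk subsystem"],
    },
}

# Render a plain-text outline of the diagram.
print(f"Problem: {fishbone['problem']}")
for category, causes in fishbone["causes"].items():
    print(f"  {category}")
    for cause in causes:
        print(f"    - {cause}")
```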

Identify causes

While brainstorming, the team should strive toward identifying major causes (categories) first, which can be further discussed, and then secondary causes for each major cause can be identified and discussed. This helps the team to concentrate on one major cause at a time and to refine further for possible secondary causes.

After the major causes (usually four to six) are identified, you can connect them as fishbones in the Fishbone diagram. They are represented as slanted lines with the arrow pointing towards the backbone of the fish. See Figure 2 later in this article.

Sometimes it is difficult to arrive at a few major causes. The team may come up with a lot of causes, which makes brainstorming more difficult. In this case, the leader can ask each team member to rate each possible cause from 1 to 10 (10 being the most likely cause). After everyone on the team has rated the causes, the leader totals the ratings for each cause and ranks the causes accordingly. From the list, the top four to six causes are identified as major causes and connected as bones in the diagram.
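The rating-and-ranking scheme just described can be sketched in a few lines; the causes and ratings below are invented for illustration:

```python
# Each team member rates each candidate cause from 1 (unlikely) to
# 10 (most likely). Causes are ranked by their rating totals and the
# top few become the major bones of the diagram.
ratings = {
    "No rollback plan":       [9, 8, 7, 9],
    "Untrained operators":    [6, 7, 5, 6],
    "Ambiguous requirements": [8, 9, 9, 8],
    "Hardware undersized":    [3, 2, 4, 3],
    "Vendor defect":          [5, 4, 6, 5],
}

totals = {cause: sum(scores) for cause, scores in ratings.items()}
ranked = sorted(totals, key=totals.get, reverse=True)
major_causes = ranked[:4]   # keep the top four as major bones
print(major_causes)
```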

The diagram looks a little like the skeleton of a fish, hence the name Fishbone. After the major causes of the problem are identified, each one of them is discussed in further detail with the team to find out the secondary causes. If needed, the secondary causes are further discussed to obtain the next level of possible causes. Each of the major causes is laid as a fishbone in the diagram and the secondary causes as “bonelets.”

The diagram now has a comprehensive list of possible causes for the problem, though the list may not be exhaustive or complete. However, the team has enough information to begin discussing the individual causes and to analyze their relevance to the problem. The team can use analytical, statistical, and graphical tools to assist in evaluating each of the causes. The Pareto principle (explained in part two of this article series) is also used to find the elements that cause major problems and to list them as major causes in the Fishbone diagram. Software metrics that are obtained during application support can also be used here for further assistance.

Evaluate, decide, and take action

It may be very difficult for a large team to reach consensus on a single root cause, in which case the majority view is taken into consideration. Also, the major causes can be ranked in order, with the most likely cause at the top of the list.

After the evaluation process is complete, the action plan has to be decided. If one possible root cause is identified, then the action plan has to be derived to rectify it. Sometimes, it may be difficult to arrive at a single root cause; there may be a few possible root causes. In this case, an action plan has to be drawn up for each of the possible root causes.

After the action plan is ready, the leader can designate an individual or team to work on the plan and to rectify the problem permanently. If there are a few possible root causes, all the action plans are to be executed, and the most likely root cause is identified and fixed.

Characteristics of Quality, Quality Assurance

Generally, it can be said that a product is of satisfactory quality if it satisfies the consumer or user. The consumer will buy a product or service only if it suits their requirements.

Therefore, consumers’ requirements are first assessed by the marketing department, and the quality decision is then taken on the basis of the information collected.

Eight dimensions of product quality management can be used at a strategic level to analyze quality characteristics. The concept was defined by David A. Garvin, formerly C. Roland Christensen Professor of Business Administration at Harvard Business School (died 30 April 2017). Garvin was posthumously honored with the prestigious award for ‘Outstanding Contribution to the Case Method’ on 4 March 2018.

Some of the dimensions are mutually reinforcing, whereas others are not: improvement in one may come at the expense of others. Understanding the trade-offs desired by customers among these dimensions can help build a competitive advantage.

  • Performance: Performance refers to a product’s primary operating characteristics. This dimension of quality involves measurable attributes; brands can usually be ranked objectively on individual aspects of performance.
  • Features: Features are additional characteristics that enhance the appeal of the product or service to the user.
  • Reliability: Reliability is the likelihood that a product will not fail within a specific time period. This is a key element for users who need the product to work without fail.
  • Conformance: Conformance is the precision with which the product or service meets the specified standards.
  • Durability: Durability measures the length of a product’s life. When the product can be repaired, estimating durability is more complicated. The item will be used until it is no longer economical to operate it. This happens when the repair rate and the associated costs increase significantly.
  • Serviceability: Serviceability is the speed with which the product can be put into service when it breaks down, as well as the competence and the behavior of the service person.
  • Aesthetics: Aesthetics is the subjective dimension indicating the kind of response a user has to a product. It represents the individual’s personal preference.
  • Perceived Quality: Perceived Quality is the quality attributed to a good or service based on indirect measures.

Quality Characteristics:

A quality characteristic is an element that makes a product or item fit for use. Quality characteristics also provide a means by which fitness for use can be translated into the technologists’ language for managing quality. Quality characteristics are classified into categories called ‘parameters’ of fitness for use.

Two such major parameters are known as:

(i) Quality of design

(ii) Quality of conformance

The quality of design is concerned with consumers’ satisfaction through variation in the quality of products, popularly called “grades”. In contrast, the quality of conformance is the extent to which products, items, and services conform to the intent of the design.

Process capability, inspection, and process control are involved in achieving this conformance, so that the products and goods produced meet the pre-decided specifications.
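Process capability is commonly summarized with the indices Cp and Cpk, which compare the specification width to the process spread; Cpk additionally penalises an off-centre mean. A minimal sketch with invented measurements and specification limits:

```python
from statistics import mean, stdev

def cp(usl, lsl, sigma):
    """Potential capability: how the spec width compares to 6 sigma."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Actual capability, penalising an off-centre process mean."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical measurements against a 9.7-10.3 mm specification.
data = [10.02, 9.98, 10.05, 9.95, 10.01, 10.04, 9.97, 10.0]
mu, sigma = mean(data), stdev(data)
print(round(cp(10.3, 9.7, sigma), 2), round(cpk(10.3, 9.7, mu, sigma), 2))
```

A rule of thumb is that Cpk of at least 1.33 indicates a capable process; Cpk always lies at or below Cp, and the gap between them shows how much is lost to poor centring.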

Quality Assurance

Quality assurance (QA) is a way of preventing mistakes and defects in manufactured products and avoiding problems when delivering products or services to customers; which ISO 9000 defines as “part of quality management focused on providing confidence that quality requirements will be fulfilled”. This defect prevention in quality assurance differs subtly from defect detection and rejection in quality control and has been referred to as a shift left since it focuses on quality earlier in the process (i.e., to the left of a linear process diagram reading left to right).

The terms “quality assurance” and “quality control” are often used interchangeably to refer to ways of ensuring the quality of a service or product. For instance, the term “assurance” is used in statements such as the following: the implementation of inspection and structured testing as a measure of quality assurance in a television set software project at Philips Semiconductors has been described in the literature. The term “control”, however, is used to describe the fifth phase of the Define, Measure, Analyze, Improve, Control (DMAIC) model. DMAIC is a data-driven quality strategy used to improve processes.

Quality assurance comprises administrative and procedural activities implemented in a quality system so that requirements and goals for a product, service or activity will be fulfilled. It is the systematic measurement, comparison with a standard, monitoring of processes and an associated feedback loop that confers error prevention. This can be contrasted with quality control, which is focused on process output.

Quality assurance includes two principles: “Fit for purpose” (the product should be suitable for the intended purpose); and “right first time” (mistakes should be eliminated). QA includes management of the quality of raw materials, assemblies, products and components, services related to production, and management, production and inspection processes. The two principles also manifest before the background of developing (engineering) a novel technical product: The task of engineering is to make it work once, while the task of quality assurance is to make it work all the time.

Quality Assurance Process steps:

  • Plan: The organization should plan and establish process-related objectives and determine the processes required to deliver a high-quality end product.
  • Do: Develop and test the processes, and also “do” changes in the processes.
  • Check: Monitor the processes, modify them where needed, and check whether they meet the predetermined objectives.
  • Act: A Quality Assurance tester should implement the actions necessary to achieve improvements in the processes.