Perspectives www.asse.org
Figure 1: Initial Comparative Analysis of TRADOC & IMCOM Programs
INFORMATION SOURCES
The first step was to analyze both programs for commonalities and dissimilarities. Once the combined program elements were identified, the next step was to apply kaizen protocols to the two safety program elements, eliminate fraternal twins and combine the remaining services into the appropriate IMCOM SSPs.
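The merge described above can be sketched with simple set operations. The element names below are hypothetical; the article does not enumerate the actual TRADOC and IMCOM program elements:

```python
# Hypothetical program-element names -- the actual TRADOC and IMCOM
# safety program element lists are not given in the article.
tradoc = {"hazard analysis", "range safety", "driver training", "accident reporting"}
imcom = {"hazard analysis", "workplace inspections", "accident reporting", "vpp outreach"}

common = tradoc & imcom   # duplicated services ("fraternal twins"), kept once
unique = tradoc ^ imcom   # variant elements retained from each parent program
hybrid = common | unique  # combined program with process duplication eliminated

print(sorted(hybrid))
```

The hybrid is simply the union of the two programs; the value of splitting out `common` and `unique` is that each merged element can be traced back to one parent program or both.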
Early on, the decision was made to create a predictive front end to the six sigma design package, going beyond identifying and fixing problems. This system was to get upstream to the fundamental decision-making process and become proactive rather than reactive. This meant the standard DMAIC kaizen process was not applicable, and it was replaced with design for six sigma (DFSS). Using DFSS, the design analysis ensued using the identify, design, optimize and validate (IDOV) process.

Figure 2: Application of Kaizen Protocols

DFSS focuses on preventing problems instead of just fixing them. Using DFSS, the auditor is able to proceed further upstream to recognize design decisions that affect the quality and cost of all subsequent activities necessary to build and deliver the product or service.
The premise was that Safety VIσ would combine essential TRADOC (or other senior mission commander), IMCOM and DOD/DOL voluntary protection program processes and associated industrial system metrics while eliminating process duplication (kaizen) and without increasing the expenditure of limited resources beyond the current or previous fiscal year level provided.

By applying the basic principles of lean six sigma and kaizen and by using design for six sigma techniques, alignment of the competing safety processes into a cost-efficient management program and office structure was conducted.

Figures 1, 2 (left) and 3 (p. 22) illustrate the logic flow of the beginning (two variant DNA strains/programs), the application of kaizen protocols and the end result of a hybrid program consisting of common and unique variant DNA strains. Generation 3 (Figure 4, p. 22) illustrates the conversion of the remaining TRADOC and IMCOM processes/SSPs into a hybrid system. The systems presented qualified and quantified processes blended with DOL/DOD VPP. To quantify the system, processes were analyzed and various definitions and assumptions were documented.
Figure 3: Merging of Program Common Elements

As the basis for identifying and quantifying process opportunities and the associated defect versus nondefect, a six sigma matrix was created illustrating the processes as a definitive data set comprising one safety system: the new hybrid system or division comparison. The matrix/table provides a six sigma ranking based on the delta between the sum of the possible process opportunities and the identified defects per subelement/process (Figure 5).
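The ranking arithmetic can be sketched as follows: defects per million opportunities (DPMO) from the opportunity/defect delta, then a sigma level from the normal quantile. The counts below are hypothetical, and the conventional 1.5σ long-term shift is assumed; the article does not state which conversion table it used:

```python
from statistics import NormalDist

def sigma_rank(opportunities: int, defects: int, shift: float = 1.5) -> tuple:
    """Return (DPMO, sigma level) for one subelement/process.

    Applies the conventional 1.5-sigma shift used in most six sigma
    conversion tables; treat this as an illustration, not the article's
    exact method.
    """
    dpmo = defects * 1_000_000 / opportunities
    yield_fraction = 1 - defects / opportunities
    sigma = NormalDist().inv_cdf(yield_fraction) + shift
    return dpmo, sigma

# Hypothetical subelement: 1,000 process opportunities, 15 defects found.
dpmo, sigma = sigma_rank(opportunities=1_000, defects=15)
print(f"DPMO = {dpmo:.0f}, sigma = {sigma:.2f}")  # DPMO = 15000, sigma = 3.67
```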
With the processes identified and metrics assigned, it is a matter of selecting those processes that support the individual SSP and illustrate opportunities and defects related to OIP/CIP and VPP audits. Additionally, the metrics must support the higher-level performance measure reviews established by HQ IMCOM and other MACOMs. A process is defined as an SSP consisting of subprocesses/tasks, while statistical process controls, defined as metrics, illustrate the variance of the activities/actions performing the process (subprocesses, tasks) in accordance with (IAW) established conditions and standards defined as opportunities.

Process (performance) indicators are standardized, assigned to and will represent the individual process DNA throughout the analysis.

Figure 4: Final Hybrid With DOD/DOL OSHA VPP
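These definitions (process = SSP of subprocesses/tasks, opportunities = established conditions and standards, defects = nonconformances) can be sketched with a minimal data model. The SSP names and go/no-go results below are invented for illustration:

```python
# Hypothetical SSP: each subprocess/task is checked against established
# conditions and standards (opportunities); a failed check is a defect.
ssp = {
    "motorcycle safety": [True, True, False, True],       # go/no-go task checks
    "hazard reporting": [True, False, True, True, True],
}

opportunities = sum(len(checks) for checks in ssp.values())
defects = sum(checks.count(False) for checks in ssp.values())
compliance = 1 - defects / opportunities
print(opportunities, defects, round(compliance, 3))
```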
Figure 5: Defect/Sigma Data Set Matrix
Figure 6: KPI/Business Drivers
Figure 7: Hazard Prevention & Control

•number of organizational safety committee meetings conducted;
•number of organizational preventive maintenance events completed.

LEADING INDICATORS
Examples of program leading indicators include percent of reported injuries investigated by installation safety office division personnel, percentage of identified corrective actions completed for reported concerns/complaints, percentage of training completed for staff who require safety training, percentage of new employees who have completed new employee orientation, percentage of motorcycle owners who have completed the basic riders course/military sports riders course training, percent of planned high- and medium-risk safety evaluations completed, percentage of identified safety hazards corrected within a set period of time, percentage of new operational changes that included hazard analysis, percentage of safety behavior observations consistent with expectations, percentage of organizations’ annual safety goals achieved and percentage of position descriptions that outline health and safety responsibilities.

LAGGING INDICATORS
Examples of lagging indicators include percentage of Garrison (AF/NAF) SIGCEN Class A-D incidents reported, percentage of Garrison (AF/NAF) SIGCEN Class A-D injuries reported, percentage of Garrison (AF/NAF) SIGCEN Class A-D equipment damage reports, actual Garrison (AF/NAF) SIGCEN and contractor OSHA recordable incident rate (TCIR), actual Garrison (AF/NAF) SIGCEN and contractor OSHA days away, restricted or transferred (DART) rate and actual Garrison (AF/NAF) SIGCEN WC claim rate (OWCP).

By using Safety VIσ, Fort Gordon safety management processes performed at an average 94% compliance with established TRADOC and IMCOM evaluation criteria for fiscal year 2008. This equates to a 3.66 sigma; industrial programs average a rank between three and four sigma. Program efficiencies and compliance increased 36% when compared against the FY06 baseline evaluation of 59% compliance.

By applying statistical process controls (SPC), areas that were previously unquantifiable, were not evaluated at all or consisted of lagging, “after-the-fact” events (accidents, injuries) are now balanced with real-time process leading indicators illustrating the health of the overall safety management process as a summation of the whole and not just a few topics of interest.

Safety VIσ has three levels of statistical process
importance. The statistical processes are directly tied into customer needs and are defined as command intent and provision of SSP.

Figure 8: Sample Performance Management Review SSP Metric Using KPI & Six Sigma

The three levels of importance are process owner, division/office level and directorate/command level. The information provided within each level answers three questions: Why is the information being provided? Who uses/reviews the information? What is the information telling?

The process owner level contains the detail of the process analysis. It shows not only the delta between opportunities and defects (the basis for six sigma), but also the variation within the process (statistical process control data). Statistical process control data illustrate how the process is operating by tracking normal performance vs. variation and allow the process owner to correct to meet the target opportunity level. Variations are illustrated through the use of statistical process control charts (generally in the form of Cp/Cpk charts).

At the division level, there is overall visibility of the composite of the target processes, which are rolled up into a subsystem series of charts, and the individual processes are combined into and compared to the MACOM OIP/CIP checklist sections. This is a composite check of the processes performed within the MACOM division. Here, variations are identified across the division subsystem and corrections are made.

Figure 9: Sample Basic Motorcycle Rider Course Statistical Process Control Chart, FY08

At the directorate/command level, there is overall visibility of the division and process owner composites compared to the MACOM OIP/CIP checklists. This is a composite check of the processes performed within the MACOM divisions compared to MACOM OIP/CIP and SC/GC guidance.

The data presented here are reduced to four vital areas comprising 15 KPIs and a frequency matrix, which is passed on to senior leadership in various forms, one of which is illustrated in Figure 8.
PERFORMANCE MEASUREMENTS
Performance measurements (PMs)
illustrate the key process and process
indicators supporting the formula associ-
ated with the IMCOM performance
management review (PMR) SSP metric.
KPI and PI are used to define the output
Figure 10: Program Evaluation, Six Sigma Data
measure of the PM SSP metric using common derivatives: delta installation and the MACOM-established defects per million opportunities (DPMO).

Process indicators illustrate the variance of the individual process/SSP at the installation, division and process level, supporting the results of the individual PMR SSP metric and answering the question, “Why?” Process indicators are both textual and visual representations of the variance between the units of opportunities and the defects identified during a specific time period, judged against the acceptable defects per million opportunities established by MACOM.

When compared to DPMO, process indicators provide a visual picture of the health of the individual performance measurement/system, which illustrates the health of the underlying subsystems (MACOM division) and individual processes.

DPMO delta is comparable across all MACOM/HQ elements regardless of OIP/CIP parameters. In creating defects per opportunity, an opportunity is a part, while a defect is any nonconformance to the part specifications, regardless of how many processes are applied to each part. A defect is a defect, which is compared to the unit of opportunities for success. There are no weighted defects. All rankings are yes or no, successful or defective, go or no-go. For the purposes of this article, a part is defined as a process indicator (requirements driver).

An example is the comparisons between the Mission and Garrison Divisions of the Fort Gordon Installation