ATTACHMENT 18


CRITICAL SUCCESS FACTOR (CSF) MATRIX

Each Critical Success Factor (CSF) must be tracked from requirement through implementation. Tracking CSFs is a little more complex than tracking ordinary requirements, because each CSF must be traced into every unit to which it applies. Further, proof must be supplied along the way, and again in the final system test, that each CSF is being met.

If you do not already have such a checklist, the outline that follows can be employed. Modify Table A18-1 to suit your own needs.

Table A18-1—Critical Success Factor (CSF) Matrix

CSF     Unit A        Unit B        Unit C        Unit D        Final         Proof
MTTR    0.5 hrs       0.5 hrs       0.5 hrs       0.5 hrs       0.5 hrs       RMA Analysis, Para 3.2.1
MTBF    30,000 hrs    30,000 hrs    30,000 hrs    30,000 hrs    30,000 hrs    RMA Analysis, Para 3.2.2
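
For programs that track many CSFs across many units, the matrix lends itself to a simple automated check. What follows is a minimal sketch in Python, not part of the original checklist: the CSF names, targets, and proof references mirror Table A18-1, while the unit names and measured values are hypothetical placeholders.

# Minimal sketch of an automated CSF matrix check (illustrative only).
# CSF names, targets, and proof references mirror Table A18-1; the
# measured values and unit names below are hypothetical placeholders.

csf_matrix = {
    "MTTR (hrs)": {
        "target": 0.5, "direction": "max",      # repair time must not exceed 0.5 hrs
        "units": {"Unit A": 0.4, "Unit B": 0.5, "Unit C": 0.45, "Unit D": 0.5},
        "final": 0.5, "proof": "RMA Analysis, Para 3.2.1",
    },
    "MTBF (hrs)": {
        "target": 30_000, "direction": "min",   # time between failures must be at least 30,000 hrs
        "units": {"Unit A": 32_000, "Unit B": 31_000, "Unit C": 30_500, "Unit D": 33_000},
        "final": 30_000, "proof": "RMA Analysis, Para 3.2.2",
    },
}

def meets(value, target, direction):
    """Return True if a measured value satisfies the CSF target."""
    return value <= target if direction == "max" else value >= target

def check_csf_matrix(matrix):
    """Report every unit, and the final proof, that misses its CSF target."""
    failures = []
    for csf, row in matrix.items():
        for unit, value in row["units"].items():
            if not meets(value, row["target"], row["direction"]):
                failures.append(f"{csf}: {unit} = {value} misses target {row['target']}")
        if not meets(row["final"], row["target"], row["direction"]):
            failures.append(f"{csf}: final proof ({row['proof']}) misses target {row['target']}")
    return failures

if __name__ == "__main__":
    for line in check_csf_matrix(csf_matrix) or ["All CSFs met in every unit and in the final proof."]:
        print(line)

Run as written, the sketch prints either the list of units that miss a CSF target or a confirmation that every unit and the final proof meet their targets.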

BIBLIOGRAPHY

Books

Blanchard, Kenneth, and Spencer Johnson. The One Minute Manager. New York: William Morrow and Co., Inc., 1982.

de Bono, Edward. Serious Creativity. New York: Harper Business, 1992.

Fishman, George S. Monte Carlo: Concepts, Algorithms, and Applications. New York: Springer-Verlag, 1996.

Guffey, Mary Ellen. Business Communication: Process and Product. 2nd ed. Cincinnati: South-Western College Publishing, 1997.

McDermott, Robin E., et al. The Basics of FMEA. Portland, Ore.: Productivity Press, Inc., 1996.


Rothenberg, Robert. The Plain-Language Law Dictionary. New York: Penguin, 1996.

Rubinstein, Reuven Y. Simulation and the Monte Carlo Method. New York: John Wiley, 1981.

Senge, Peter. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday, 1990.

Sobol, Ilya M. A Primer for the Monte Carlo Method. Boca Raton, Fla.: CRC Press, LLC, 1994.

Stamatis, Dean H. Failure Mode and Effect Analysis: FMEA from Theory to Execution. Milwaukee, Wis.: ASQ Quality Press, 1995.

U.S. Army Field Manual (FM) 770-78. Available from the Superintendent of Documents, Government Printing Office (GPO), Washington, D.C., or the Consumer Information Center, Pueblo, Colo., and in digital form online at: www.incose.org/stc/fm77078.htm.

Articles

Barnes, Brenda J., and James W. Van Wormer, Ph.D. "Process Thinking and the 85:15 Rule Applied to Education." Source: www.grandblancommunityschools.com/qip/processthinking.htm (last accessed Aug. 5, 2002).

Chen, P. "The Entity-Relationship Model: Toward a Unified View of Data." ACM Transactions on Database Systems 1, no. 1 (1976): 9–36.

Luttman, Robert & Associates Online Articles, "Cause and Effect." Source: www.robertluttman.com/cause-effect.html (last accessed Aug. 5, 2002).

Patrick, Francis S. "Program Management—Turning Many Projects into Few Priorities with TOC." Newtown Square, Pa.: Project Management Institute, 1999. (Project Management Institute Seminar/Symposium [30th: 1999: Philadelphia, Pa.], PMI 1999 Annual Seminars & Symposium Proceedings.)

Plsek, P.E. "Management and Planning Tools of TQM." Quality Management in Health Care 1, no. 3 (Spring 1993): 59–72.


Private Documents

Early, John F., ed. "Cause-Effect Diagrams." Quality Improvement Tools. Wilton, Conn.: Juran Institute, 1989. The training kit entitled Quality Improvement Tools is produced by the Juran Institute, and is a part of their inventoried items.

Systems Application Architecture—Common User Access Guide to User Interface Design. IBM Corporation, 1991. IBM Document Number SC34-4289. Available through IBM field offices.

The Windows Interface Guidelines for Software Design. Redmond, Wash.: Microsoft Press, 1995. ISBN 1556156790. Available from Best Buy Books.

TRADEMARKS

Brainstorming™ is a trademark of Infinite Innovations, Ltd.

DOORSrequireIT™ is a trademark of Telelogic DOORS, North America

EDGE Diagrammer™ and EDGE Programmer™ are trademarks of Pacestar Software

Flowcharting Cause & Effect Module for Six Sigma Software Suite™ is a trademark of Quality America, Inc.

MacWrite™ is a trademark of Apple Corporation

MBTI™ is a trademark of Consulting Psychologists, Inc.

Microsoft™, Microsoft Word™, MS Word™, Microsoft Excel™, MS Excel™, Microsoft Access™, MS Access™, Microsoft Office™, MS Office™, Microsoft Works™, and MS Works™ are trademarks of Microsoft Corporation

PathMaker™ is a trademark of SkyMark

PMBN™ is a trademark of Best Practices, LLC.

Post-it™ is a trademark of 3M Company

REASON 4™ is a trademark of DECISION Systems, Inc.

Root Cause Analysis (RCA)™ is a trademark of Root Cause Analyst

Six Sigma for Excel™ is a trademark of BaRaN Systems LLC.

SmartDraw™ is a trademark of SmartDraw.com


INDEX

Affinity Diagrams, 166, 171–176, 201

purpose of, 171–172

Relationship Diagrams, 173–176, 201

software to support, 173, 175–176

Alliances, 11, 26–30, see also Subcontracts; Teaming, Alliances, and Subcontracts

analysis of Causes for Action, 177–186, 201

eliminating holes and overlaps, 186–187

Failure Mode Effect Analysis (FMEA), 121, 166, 170, 182–184, 201

Force Field Analysis, 180–182, 201

Monte Carlo Simulation, 184–186, 192, 201

Pareto Analysis, 170, 178–180, 192, 193, 201


Architecture, 39, 41–42

conceptual unity, 42, 117

Critical Success Factors (CSF) understood, 41, 112–113

design issues in, 43–44, 122

key functions covered, 42, 114–115

major elements described and justified, 42, 115–116

modules/subsystems well defined, 41, 113–114

recovery issues, 74, 110, 112–117

user interfaces well defined, 42, 116–117

Arthur, Jay, 179–180, 187n

back tracing, 140–141

BaRaN Systems, LLC, 173


Barnes, Brenda J., 166–167, 176n, 180

benchmark research

on Cause Descriptions, 157, 158, 160–161, 201

sharing benchmarks, 160–161

Best Practices, LLC, 161

Blanchard, Kenneth, 158, 164n

Blue Slipping, 178

brainstorming

of Cause Descriptions, 157, 158–160, 201

software for, 159–160

Brassboard, 133

Breadboard, 133

Budget Reviews, 16, 30, 66, 89, 90, 94, 96

budgets

allocations for, 115

mix of personnel and, 79–81

salaries and wages of personnel, 34, 98

Business Process Improvement (BPI), 192

Business Process Redesign (BPR), 192

buying-in, 15

Capability Matrix, 249–250

Capability Maturity Model (CMM), 152

Cause and Effect Diagrams, 166, 167–171, 201

development of, 167–170

software to support, 170–171

Cause Descriptions

analysis of, 177–186

eliminating holes and overlaps in, 186–187

expanding, 156–164, 201

implementing, 188–190

interrelationships of, 176

ordering techniques for, 165–176, 201

Programmatic Performance Checklist (PPC), 10–37

Programmatic Recovery Checklist (PRC), 58–108

selecting, 187

Technical Performance Checklist (TPC), 41–55

Technical Recovery Checklist (TRC), 112–155

Change Control Process, 47–48, 133

changes

configuration, 54

Prototype, 47–48

charter, updating, 7

Chen, P., 173–175, 176n

Christensen, David S., 90

company

company data and, 196

Policies, Plans, and Processes and, 24–25

competence

of personnel, 33–34, 95–97, 102

of vendors, 31, 91–92

Configuration Management, 40, 53–54

Configuration Management Plan (CMP) in, 40, 53–54, 150–152, 234–236

recovery issues, 111, 148, 150–154

Review Board approval of change requests, 54, 152–153

version controls and, 54, 153–154

Configuration Management (Control) Board (CCB), 151

Consensus Gram, 178

Contract Line Item Numbers (CLINs), 10, 14, 63

Contract Data Requirements List (CDRL), 10, 14, 35–36

Contract/Subcontract, outline of, 231–233

corporation

corporate data and, 196

Policies, Plans, and Processes and, 24–25

Cost of Quality Position, 31, 92

Critical Design Review (CDR), 118

Critical Success Factors (CSFs), 41, 43, 112–113, 144, 269–270

culture, of "generation X," 7


customer

acceptance of final delivery by, 36–37, 106–107

approval of Design Review, 45, 127

approval of System Test Plan/Procedure, 52

customer data and, 196

determining needs of, 59–67, 70–71

Customer Meetings, 15–16, 29–30, 65, 90, 96

Customer Processes

Causes for Action and, 167, 201

Design and, 42–43, 117–119

Production/Manufacturing and, 49–50, 138–139

Project/Program Plan and Technical Plan linkage, 23–24, 75–76, 166

researching, 162

Data Delivery Matrix, 247–248

Data Item Descriptions (DIDs), 35–36, 132, 147, 152–154, 162

Data Management, 12

amount of data on time, 35–36, 103–104

recovery issues, 58, 103–104

Data Plan, 35–36, 103–104, 146–147

Data Sheets, 170

data trail, importance of, 23–24, 195–196

de Bono, Edward, 160

DECISION Systems, Inc., 171

Deming, W. Edwards, 170

Department of Defense (DOD) standards, 162

Design, 39, 42–45

architecture-level issues in, 43–44, 122

correctness of, 43, 119–120

efficiency of, 43, 120–122

Life Cycle Cost (LCC) and, 44–45, 123–124

recovery issues, 74, 110, 117–125

segmentation of, 44, 122–123

Technical Performance Measures (TPMs) and, 45, 124–125

traceable to processes, 42–43, 117–119

Design Reviews, 16, 22, 29, 39, 42–43, 45, 66, 88, 90, 96, 122, 150

completion according to required processes, 45, 125–127

customer approval of, 45, 127

Design Review Approval Form, 127, 128, 263–274

recovery issues, 74, 110, 125–127

Design To Cost (DTC) approach, 44, 124

documentation

data trail in, 23–24, 195–196

in Design Review, 127

importance of, 193–195

interrelationships in, 5

library in, 194–195

of prototype changes, 47–48, 133

of Specifications, 68

standard, 196

Statement of Work (SOW), 10–12, 59–60, 63–64

DOORS, 27–28

drop shipping, 37, 107–108

"Dumping" the Fix, 190

Early, John F., 170

Earned Value Measurement System (EVMS), 90

EIA-649, 151–152, 153

80/20 Rule, 192

85:15 Rule, 166–167, 192, 193, 201

Employee Assistance Programs (EAPs), 100

enterprise data, 118, 196

Enterprise Processes

Causes for Action and, 167, 201

Design and, 42–43, 117–119

Production/Manufacturing and, 49–50, 138–139

Project/Program Plan and Technical Plan linkage, 24–25, 76–77, 162

researching, 162


Entity-Relationship Diagrams, 175

ethics, 43, 63, 86

Expansion Methodologies, 156–164, 201

Experience Window, 253–254

FAA Standards, 162

Failure Mode and Criticality Analysis (FMECA), 121, 166, 170, 182–184, 201

Failure Mode Effect Analysis (FMEA), 121, 166, 170, 182–184, 201

development of, 182–183

software for, 183

Family of Causes, 157

fast-tracking, 80

Final Delivery, 12

acceptance by customer without delay, 36–37, 106–107

avoidance of third-party or drop shipping, 37, 107–108

recovery issues, 58, 106–108

First Articles, 47, 133

Fishbone Diagram, 167–171

Fishman, George S., 186

Flow Charting, 178

FMECA—Failure Mode and Criticality Analysis, 121, 182–184

Force Field Analysis, 180–182, 201

development of, 180–182

software for, 182

Functional Manager, 81

G&A (General and Administrative) expenses, 80

Gallery Walking, 178

Guffey, Mary Ellen, 156–157, 164n

hardware, Test Plans, 142–143

hazardous conditions, 100–101

Histograms, 170, 178

holes

creation of, 133–136

eliminating, 186–187

IBM, User Access Guide, 42, 54n

IEEE, 132

implementing Cause Descriptions, 188–190

"Dumping" the Fix, 190

"On-Ramps," 189–190

"Slipping in the Fix," 189, 190

incremental construction, 47, 132

Independent Research and Development (IR&D) programs, 19, 61–62, 69–70

Infinite Innovations Ltd., 159

In-Process Reviews (IPRs), 16–17, 29, 33, 39, 45–46, 66, 89, 90, 95, 96, 118

approval by appropriate authority, 46, 129–130

completion according to required processes, 45, 127–129

In-Process Review Approval Form, 130, 265–266

recovery issues, 74, 110, 127–130

Interfaces, 68, 72

Interface Control Document (ICD), 115, 123

user, 42, 116–117

interpersonal conflict, 34–35, 82, 99–101

Ishikawa, Kaoru, 167–168

Ishikawa Diagram, 167–171

ISO-9000, 132

ISO-9001, 23–25, 75–77

ISO-10007, 151–152, 153

ISO/IEC 12207, 151–152, 153

Janus process, 53–54, 151

Kawakita, Jiro, 172

key functions, defining, 42, 114–115

KJ Method, 171–176

leave time, 97–98

legal responsibility, 1

Lewin, Kurt, 180–181

Life Cycle Cost (LCC), 44–45, 123–124

Light Voting, 178


line design, 50, 139

liquidated damages, 141

load, for System Tests, 53, 55n, 149

Lotus Flower Diagrams, 178

Lowest Replaceable Unit (LRU), 183

Luttman, Robert, 176n

major elements, description of, 42, 115–116

Manhattan Project, 2, 184

Materials, 11–12, 30–33

Production/Manufacturing and, 50, 140–141

Purchase Order monitoring, 31–33, 50, 92–95

Purchase Order preparation, 30–31, 50, 91

recovery issues, 57–58, 91–95

vendor performance and, 33, 91–92, 95

vendor competence and, 31, 91–92

Materials Manager, 31

matrix management, 81, 97, 98

MBWA (Management by Walking Around), 33

McDermott, Robin E., 184

Mean Time Between Failure (MTBF), 41, 112–113

Mean Time To Repair (MTTR), 41, 112–113

metrics, 6, 17

Microsoft, Interface Guidelines, 42, 54n

Milestone Reviews, 22

MIL-STD-61, 151–152

MIL-STD-100, 23–25, 75–77

MIL-STD-245, 12, 60, 64, 71

MIL-STD-483, 151–152, 153

MIL-STD-490, 18, 69

MIL-STD-498, 151–152, 153

MIL-STD-973, 132, 151–154

MIL-STD-1423, 132

MIL-STD-1521, 126–127, 131, 132

MIL-STD-1629, 121

minutes, of meetings, 68, 127

Mission Statement

Organization and, 26

Policies, Plans, and Processes and, 24

modifying methods, 196

modules/subsystems

defining, 41, 113–114

design and, 51, 143

monitoring

of Purchase Orders, 31–33, 50, 92–95

of Specification (Spec), 21–22, 72–73

of Statement of Work (SOW), 15–16, 65–66

in Teaming, Alliances, and Subcontracts, 29, 88–89

Monte Carlo Simulation, 184–186, 192, 201

development of, 184–185

software for, 186

MTBF (Mean Time Between Failure), 41, 112–113

MTTR (Mean Time To Repair), 41, 112–113

Municipal Government Standards, 162

NASA

SpecsIntact, 68

standards of, 162

negotiation

Negotiation Checklist, 267–268

of Specification (Spec), 20–21, 71–72

of Statement of Work (SOW), 14–15, 64–65

in Teaming, Alliances, and Subcontracts, 28–29, 87–88

"On-Ramps," 189–190

On the Job Training (OJT), 97, 102

ordering Causes for Action

Affinity Diagrams, 166, 171–176, 201

Cause and Effect Diagram, 166, 167–171, 201

85:15 Rule, 166–167, 192, 193, 201

Relationship Diagrams, 173–176, 201


Organization, 11, 25–26

mix of personnel, 26, 78–82, 98

number of personnel, 25, 77–78

recovery issues, 57, 77–83

teamwork of personnel, 26, 82–83

out-of-tolerance, 66–67, 94

overlaps

creation of, 133–136

eliminating, 186–187

overtime costs, 80

Pacestar Software, 173, 175

Pareto, Vilfredo, 178

Pareto Analysis, 170, 178–180, 192, 193, 201

development of, 178–180

software for, 180

Pareto Principle, 178–180, 192, 193

Performance Characteristics, 68, 71–72

Personnel, 12, 33–35

availability when needed, 34, 97–98

competence for tasks, 33–34, 95–97, 102

interpersonal conflict and, 34–35, 82, 99–101

mix of, 26, 78–82, 98

number of, 25, 77–78

recovery issues, 58, 95–101

salaries/wages equal to or less than bid, 34, 98

for System Tests, 53, 149–150

teamwork of, 26, 82–83

see also Training

PERT Charts, 174

Phoenix Award, xiv, 199

Physical Characteristics, 68, 72

Plans, Progress, and Problems Meetings, 30, 89–90

Plsek, P. E., 170

Policies, Plans, and Processes, 11, 23–25

Customer Processes, 23–24, 75–76, 162

Enterprise Processes, 24–25, 76–77, 162

Policy-to-Plan Trail, 251–252

Project/Program Processes, 162–164, 201

recovery issues, 57, 74–77

Standard Processes, 23, 74–75, 161–162

Policy-to-Plan Trail, 251–252

Posttest Reviews, 29, 89

Preliminary Design Review (PDR), 22–23, 118

Pre Planned Product Improvement (P3I), 44, 124

Pretest Meetings, 29, 89

Probability Density Functions (PDFs), 185

problem-solving process, 72, 156–157

Problem Test Reports (PTRs), 52, 144–146

Production/Manufacturing, 40, 49–50

line design for, 50, 139

Materials in, 50, 140–141

recovery issues, 74, 111, 138–141

shop orders in, 50, 139–140

traceability of, 49, 138–139

Profit and Loss (P&L) responsibility, 1

program

defined, 1

project versus, 1–2

requirements control matrix, 4

Programmatic Performance Checklist (PPC), 9–37

Data Management assertions, 12, 35–36

Final Delivery assertions, 12, 36–37

Materials assertions, 11–12, 30–33

Organization assertions, 11, 25–26

Personnel assertions, 12, 33–35

Policies, Plans, and Processes assertions, 11, 23–25

Quality assertions, 12, 36

Specification assertions, 11, 17–23

Statement Of Work (SOW) assertions, 10–17

Teaming, Alliances, and Subcontracts assertions, 11, 26–30

Training assertions, 12, 35

Programmatic Recovery Checklist (PRC), 9, 56–108


Data Management assertions, 58, 103–104

Final Delivery assertions, 58, 106–108

Materials assertions, 57–58, 91–95

Organization assertions, 57, 77–83

Personnel assertions, 58, 95–101

Policies, Plans, and Processes assertions, 57, 74–77

Quality assertions, 58, 104–106

Specification assertions, 57, 67–74

Statement Of Work (SOW) assertions, 57, 59–67

Teams, Alliances, and Subcontracts assertions, 57, 83–90

Training assertions, 58, 101–103

Program Office (PO), 6

Program Test Plan (PTP), 51

Progress Reviews, 29, 30, 88, 89

project

defined, 1

program versus, 1–2

project data and, 196

requirements control matrix, 4

Project Advisory Council, 24–25

Project Management Benchmarking Network (PMBN), 160–161

Project/Program Plan, 43

outline of, 219–222

overview, 4

Technical Plan and, 23–25, 74–77

Project/Program Processes, researching, 162–164, 201

Project Reviews, frequency of, 15–16

Prototypes, 39, 46–48

Change Control Process and, 47–48, 133

changes accepted by originator of requirements, 47–48, 133

incremental construction of, 47, 132

recovery issues, 74, 110, 130–133

reflection of requirements, 46, 130–132

Purchase Orders (POs), 5–6, 40, 48–49

completeness of, 49, 136–138

monitoring of, 31–33, 50, 92–95

preparation of, 30–31, 50, 91

recovery issues, 74, 111, 135–138

sum of all purchases in, 48–49, 135–136

Qualification Requirements, 68, 72

Quality, 12, 36

characteristics of, 36, 105

measurement of, 36, 105–106

Quality Plan, 36, 104–105

recovery issues, 58, 104–106

Quality America, Inc., 171

Quality Assurance Plan, 31, 68, 92, 104, 237–240

Quality Control Plan, 104

Quality Standards, 36

Quantum Improvement (QI), 192–193

Quantum Process Improvement (QPI), 192

Radar Diagrams, 178

rapid prototyping, 7

Reengineering, 192

Relationship Diagrams, 173–176, 201

development of, 173–175

software to support, 175–176

Relex Corporation, 183

requirements, 3

Requirements definition team, 15

Requirements Flow-Down Matrix (RFM)

Architecture and, 42, 114–115

described, 3–5, 245–246

Purchase Orders and, 91, 136, 137

Subcontracts and, 26–28, 83, 84, 134, 135

Requirements Traceability Checklist (RTC)

Architecture and, 42, 114–115

Design and, 43, 119

Purchase Orders and, 49, 91, 136, 137

Subcontracts and, 48, 83, 84

Unit Tests and, 51


Requirements Traceability Matrix (RTM)

Causes for Action and, 74, 187

described, 3–6, 241–244

Design and, 119–122

Specification and, 22, 73, 74

Subcontracts and, 26–28, 48, 134

System Tests and, 52, 147–148

Unit Tests and, 143, 144

Research and Development (R&D), 114, 122

Review Board, change requests and, 54, 152–153

Risk Mitigation Plan, 6, 28, 62, 69–70, 85, 227–230

Root Cause Analyst, 171

Rubinstein, Reuven Y., 185

Run Charts, 178

salaries/wages, relative to bid for project or program, 34, 98

Scattergrams, 178

schedules, 6

mix of personnel and, 79–81

Schedule Reviews, 16, 30, 33, 65, 66, 89, 90, 96

Search Tables, 7–8, 156, 157, 188–189

Senge, Peter, 170

Senior Advisory Council, 195, 196

shop orders, 50, 139–140

Show Cause letter, 87

Six Sigma, 192

SkyMark, 160, 171, 180, 182

"Slipping in the Fix," 189, 190

SmartDraw.com, 173, 176

Sobol, Ilya M., 186

software

for Affinity Diagrams, 173, 175–176

for brainstorming, 159–160

for Cause and Effect Process, 170–171

for Failure Mode Effect Analysis (FMEA), 183

for Force Field Analysis, 182

for Monte Carlo Simulation, 186

for Pareto Analysis, 180

for Relationship Diagrams, 175–176

requirements for, 7

Test Plans, 142–143

Software Configuration Management (SCM), 152

Software Engineering Institute (SEI), 152

Specialty Discipline Studies, 126–127

Specification (Spec), 11, 17–23

capabilities for completing, 18–20, 69–70

definition of, 17–18, 67–69, 83

described, 3–5

interpretation of, 20, 70–71

monitoring of, 21–22, 72–73

negotiation of, 20–21, 71–72

performance of, 22–23, 73–74

Policies, Plans, and Processes and, 23

recovery issues, 57, 67–74

Requirements Traceability Matrix (RTM) for, 22

subcontracts and, 26, 85–86

topics covered in, 21

types of, 18

see also Quality

SpecsIntact, 68

Stamatis, Dean H., 184

Standard Processes

Causes for Action and, 167, 201

Design and, 42–43, 117–119

Production/Manufacturing and, 49–50, 138–139

Project/Program Plan and Technical Plan linkage, 23, 74–75, 161–162

researching, 161–162

standards

documentation of, 196

Quality, 36

Standards Traceability Index (STI), 43

Standards Traceability Matrix (STM), 23–25, 138, 255–258

Statement Of Work (SOW), 10–17

capabilities for completing, 12–13, 60–62


definition of, 10–12, 14, 59–60, 83, 118

described, 3–5

interpretation of, 13–14, 62–64

monitoring of, 15–16, 65–66

negotiation of, 14–15, 64–65

performance of, 16–17, 66–67

Policies, Plans, and Processes and, 23

recovery issues, 57, 59–67

Specifications and, 22

subcontracts and, 26, 83, 85

Sterling, John C., 45

Strategic Plan, Policies, Plans, and Processes and, 24

Subcontractor Meetings, 29, 30, 66, 88, 90, 96

Subcontract Requirements Flow-Down Matrix (SRFM), 5, 83

Subcontract Requirements Traceability Matrix (SRTM)

teaming and, 26–28, 83

Unit Tests and, 51

Subcontracts, 5–6, 11, 26–30, 40, 48

capabilities for completing, 28, 84–87

Contract/Subcontract outline, 231–233

definition of, 26–28, 83–84

monitoring of, 29, 88–89

negotiation of, 28–29, 87–88

performance of, 29–30, 89–90

recovery issues, 57, 74, 83–90, 111, 133–135

Specifications in, 26, 85–86

tasks allocated in, 28, 48, 84–87, 133–135

Subcontracts/Purchase Order Status List, 138

Sub Program Offices (SPOs), 2

Subsystem Tests, 144, 150

Unit Tests and, 51, 53, 144

synergy, 158

System Effectiveness Factors, 40, 54, 55n

consideration of all appropriate, 54, 154–155

recovery issues, 111, 154–155

System Tests, 40, 52–53

concurrent tests of all elements, 52, 148–149

loads in, 53, 55n, 149

personnel in, 53, 149–150

procedures approved by customer, 52, 146–147

recovery issues, 74, 111, 146–150

results of prior-level tests and, 53, 144, 150

traceable to requirements, 52, 147–148

Unit Tests and, 51, 53, 144

tasking, 6

Task Qualification Matrix, 69–70

Teaming, Alliances, and Subcontracts, 11, 26–30

monitoring in, 29, 88–89

negotiation in, 28–29, 87–88

Organization and, 26, 82–83

performance of, 29–30, 89–90

recovery issues, 57, 83–90, 111, 133–135

subcontract definition in, 26–28, 83–84

tasks within capabilities, 28, 84–87

see also Subcontracts

Teaming Agreements, 28

Team Meetings, frequency of, 15–16

Technical Interchange Meetings (TIMs), 16, 29, 30, 33, 66, 88, 89, 90, 95, 96

Technical Performance Checklist (TPC), 38–55

Architecture assertions, 39, 41–42

Configuration Management assertions, 40, 53–54

Design assertions, 39, 42–45

Design Review assertions, 39, 45

In-Process Review assertions, 39, 45–46

Production/Manufacturing assertions, 40, 49–50

Prototype assertions, 39, 46–48

Purchase Order assertions, 40, 48–49

Subcontract assertions, 40, 48


System Effectiveness Factors assertions, 40, 54, 55n

System Test assertions, 40, 52–53

Unit Test assertions, 40, 51–52

Technical Performance Measures (TPMs), 45, 124–125

Technical Plan, 43

outline of, 223–226

Project/Program Plan and, 23–25, 74–77

Technical Recovery Checklist (TRC), 109–155

Architecture assertions, 110, 112–117

Configuration Management assertions, 111, 150–154

Design assertions, 110, 117–125

Design Review assertions, 110, 125–127

In-Process Review assertions, 110, 127–130

Production/Manufacturing assertions, 111, 138–141

Prototype assertions, 110, 130–133

Purchase Order assertions, 111, 135–138

Subcontract assertions, 111, 133–135

System Effectiveness Factor assertions, 111, 154–155

System Test assertions, 111, 146–150

Unit Test assertions, 111, 141–146

Technology Associates, 186

third-party shipping, 37, 107–108

Tiger Team, 67

Total Quality Leadership (TQL), 192

Total Quality Management (TQM), 170, 192

Training, 12, 35

adequacy of, 35, 101–102

economical, 35, 102–103

On the Job (OJT), 97, 102

mix of personnel and, 81

recovery issues, 58, 101–103

teamwork and, 82–83

Tree Diagram, 168–171

Ulam, Stan, 184

U.S. Army, 84

Unit Test Plan (UTP), 142–143, 144

Unit Tests, 40, 51–52

forwarded to Subsystem and System Tests, 51, 144

of individual design elements, 51, 143

Problem Test Reports (PTRs) and, 52, 144–146

recovery issues, 74, 111, 141–146

requirements and, 51, 141–143

user interfaces, definition of, 42, 116–117

vacation time, 97–98

Vendor Evaluation Forms, 32, 85, 93, 260, 261, 262

vendors

competence of, 31, 91–92

evaluation of, 32, 85, 93, 260, 261, 262

performance of, 33, 91–92, 95

Vendor/Subcontractor Database, 28

Version Description Document (VDD), 54, 153–154

Vision

Organization and, 26

Policies, Plans, and Processes and, 24

wages/salaries, relative to bid for project or program, 34, 98

Work Breakdown Structure (WBS)

Architecture and, 42, 113, 114–115

described, 3–5

Design and, 44, 122–123

Prototypes and, 132

Purchase Orders and, 49

Subcontracts and, 48, 83

Work Orders, 141

Work Package Leaders, 6

Work Package (WP), described, 3–6

Wormer, James Van, 166–167, 176n, 180