
It will probably be necessary to start with definitions of the TPMs. Likely, you will need to decompose each TPM into its constituents and define each of them. Most TPMs are metrics; that is, they have a measurement (or measurements) related to a goal, or they are expressed as a goal. For instance, a bit error rate of less than 10⁻⁶ (fewer than one error per million bits). In this case, you will need to define the number of bits in a total transmission and then define the number of transmission errors. You will then need to define a method of measurement that will verify the TPM, such as an error rate counter.

This process goes on and on, but I think you see the point: decompose the TPM into its constituents, define them, and measure them.
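The bit-error-rate example above can be sketched directly. This is an illustrative sketch only, not an implementation from the text; the names and the use of the 10⁻⁶ goal as a simple threshold are assumptions:

```python
BER_GOAL = 1e-6  # illustrative TPM goal: fewer than 1 error per 10**6 bits


def count_bit_errors(sent: bytes, received: bytes) -> int:
    """Constituent: the number of transmission errors (differing bits)."""
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))


def bit_error_rate(sent: bytes, received: bytes) -> float:
    """The metric: errors divided by the total bits in the transmission."""
    total_bits = len(sent) * 8  # constituent: number of bits transmitted
    return count_bit_errors(sent, received) / total_bits


def tpm_verified(sent: bytes, received: bytes) -> bool:
    """Method of measurement: an error rate counter checked against the goal."""
    return bit_error_rate(sent, received) < BER_GOAL
```

Once the constituents are defined and measurable, verifying the TPM reduces to comparing the measured rate against the stated goal.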


53a (NO) All Design Reviews were not completed according to required processes.

All Design Reviews have not been completed according to required processes when the events of a Design Review are not directly traceable to the requirements stipulated in standard processes, customer (contract and contract-referenced) processes, or enterprise processes.


Lay out the enterprise requirements, the customer requirements, and the standard requirements for Design Reviews. Interrelate all of the requirements, then summarize and organize them into a checklist that will drive the Design Reviews part of the Program Plan. Retain that information to solidify all data trails. You should end up with a general matrix that boils down to an outline similar to the one below. If it is not possible to accommodate the above steps, go directly to the outline below and use and update it as necessary.
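The interrelation step can be sketched as a small cross-reference builder. The requirement IDs and texts below are hypothetical; the point is only that each checklist entry keeps a data trail back to every source that levied it:

```python
def build_review_checklist(enterprise, customer, standard):
    """Merge requirements from the three sources into one checklist matrix.

    Each input maps a requirement ID to its text; the output maps each
    distinct requirement text to the (source, ID) pairs that levied it,
    preserving the data trail back to every source.
    """
    matrix = {}
    for source, reqs in (("enterprise", enterprise),
                         ("customer", customer),
                         ("standard", standard)):
        for req_id, text in reqs.items():
            matrix.setdefault(text, []).append((source, req_id))
    return matrix


# Hypothetical example entries, purely for illustration:
checklist = build_review_checklist(
    enterprise={"ENG-4.2": "Review package issued before the review"},
    customer={"CDRL-A005": "Review package issued before the review",
              "SOW-3.1": "Customer approves review minutes"},
    standard={"STD-10.3": "Action items tracked to closure"},
)
```

A requirement levied by more than one source appears once in the checklist, with all of its origins attached, which is exactly the "general matrix" the text describes.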

Design Review Package Content and Review Outline:

Mission and Requirements Analysis
ConOps (Concept of Operations)
Functional Flow Analysis
Use Cases
Preliminary Requirements Allocation
System/Cost Effectiveness Analysis
Trade Studies (e.g., addressing system functions in mission and support scenarios)
Logistics Support Analysis
Specialty Discipline Studies (i.e., hardware and software reliability analysis, maintainability analysis, armament integration, electromagnetic compatibility, survivability/vulnerability (including nuclear), inspection methods/techniques analysis, energy management, environmental considerations)
System Interface Studies
Generation of Specifications
Program Risk Analysis
Integrated Test Planning
Producibility Analysis Plans
Technical Performance Measurement Planning
Engineering Integration
Data Management Plans
Configuration Management Plans
System Safety
Human Factors Analysis
Value Engineering Studies
Life Cycle Cost Analysis
Preliminary Manufacturing Plans
Manpower Requirements/Personnel Analysis
Milestone Schedules
Communications Plan
Training Plan
Security (Threat) Analysis

The source for the above list is MIL-STD-1521, Paragraph 10.3, plus embellishment. Because it is a governmental (DoD) standard, it may well be overly complex. But it's a lot easier to eliminate a line item than to create one. Modify the list for your specific needs.


53b (NO) The customer has not approved each Design Review.

The customer will not have approved each Design Review unless the customer has signed a sheet confirming that the customer (through a representative, if necessary) agrees to the Design Review package, the Design Review itself, and the Design Review minutes, including the Design Review action items. Note: Any exceptions taken should be captured in the Action Items so that they are achievable.
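A minimal sketch of what such a sign-off record might capture, with hypothetical field names. The rule enforced is the one stated above: approval requires sign-off on the package, the review, and the minutes, with every exception carried as an action item:

```python
from dataclasses import dataclass, field


@dataclass
class DesignReviewSignoff:
    """Hypothetical customer sign-off record for one Design Review."""
    package_signed: bool = False
    review_signed: bool = False
    minutes_signed: bool = False
    exceptions: list = field(default_factory=list)
    action_items: list = field(default_factory=list)

    def customer_approved(self) -> bool:
        # All three elements must be signed, and every exception taken
        # must appear among the action items (and thus be achievable).
        signed = (self.package_signed and self.review_signed
                  and self.minutes_signed)
        exceptions_tracked = all(e in self.action_items
                                 for e in self.exceptions)
        return signed and exceptions_tracked
```

One record per Design Review, retained with the review minutes, would also serve the data-trail purpose described under 53a.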