Technical Reviews and Audits

Purpose of Technical Reviews and Audits

For DoD systems development, a properly tailored series of technical reviews and audits provides key points throughout the system development to evaluate significant achievements and assess technical maturity and risk. DoDI 5000.85 and the Adaptive Acquisition Framework Document Identification (AAFDID) Tool identify the statutory and regulatory requirements for acquisition programs. Regardless of acquisition pathway, the Program Manager (PM), Systems Engineer, and Lead Software Engineer work to properly align the applicable technical reviews to support knowledge-based milestone decisions that streamline the acquisition life cycle and save taxpayer dollars. Technical reviews and audits allow the PM, Systems Engineer, and Lead Software Engineer to jointly define and control the program’s technical effort by establishing the success criteria for each review and audit. A well-defined program facilitates effective monitoring and control through increasingly mature points.

The Engineering of Defense Systems Guidebook provides guidance on selecting and tailoring technical reviews and audits for each of the AAF pathways. Underpinning most if not all of these technical reviews and audits is the need to conduct a wide range of program-related analyses. Regardless of acquisition pathway, the ability to conduct such analyses can be profoundly impacted by the extent to which the program adopts a Digital Engineering (DE) approach (as described more fully in the Systems Engineering (SE) Guidebook, Section 2.2.2, Digital Engineering). As mentioned there, DoD’s approach to implementing DE is to “securely and safely connect people, processes, data, and capabilities across an end-to-end digital enterprise. This will enable the use of models throughout the lifecycle to digitally represent the system of interest (i.e., SoS, processes, equipment, products, parts) in the virtual world.”

The extent to which a program adopts a DE approach will not change “what” technical reviews and audits need to be conducted, but it can have a profound impact on “how” they are conducted. A well-defined digital ecosystem, instantiated or leveraged, with an associated authoritative source of truth and static and dynamic models of systems and the battlespace will enable timely and iterative analyses. In addition, by leveraging constructive, virtual, and live simulation tools, the ecosystem can open up the trade space to enable exploration of options not easily analyzed otherwise.

Event-Driven Technical Reviews and Audits

Technical reviews of program progress should be event-driven and conducted when the system under development meets the review entrance criteria as documented in the program’s Systems Engineering Plan (SEP). An associated activity is to identify the technical risks associated with achieving entrance criteria at each of these points (see the DoD Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs). Systems Engineering (SE) is an event-driven process based on successful completion of key events, as opposed to arbitrary calendar dates. As such, the SEP should clarify the timing of events in relation to other SE and program events. While the initial SEP and Integrated Master Schedule (IMS) reflect the expected timing of various milestones (such as the overall system Critical Design Review (CDR)), the plan should be updated to reflect changes to the actual timing of SE activities, reviews, and decisions. Figure 3-1 of the SE Guidebook provides the end-to-end perspective and the integration of SE technical reviews and audits across all AAF pathways. Technical reviews should be tailored appropriately for other acquisition pathways.
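
The event-driven principle lends itself to a simple gate check. The sketch below is illustrative only (hypothetical names; not from any DoD tool or the SEP outline): a review convenes when every documented entrance criterion is satisfied, not when a calendar date arrives.

```python
from dataclasses import dataclass, field

@dataclass
class EntranceCriterion:
    """One entrance criterion as documented in the program's SEP (hypothetical structure)."""
    description: str
    satisfied: bool = False

@dataclass
class TechnicalReview:
    """Event-driven review gate: readiness depends on criteria, not dates."""
    name: str
    criteria: list = field(default_factory=list)

    def ready_to_convene(self) -> bool:
        # The review is held only when every entrance criterion is met.
        return all(c.satisfied for c in self.criteria)

# Example: a CDR with one entrance criterion still open.
cdr = TechnicalReview("CDR", [
    EntranceCriterion("Detailed design documentation complete", True),
    EntranceCriterion("Initial product baseline defined", False),
])
print(cdr.ready_to_convene())  # False: one criterion remains open
```

The check is deliberately conjunctive: a single open criterion holds the review, which is what distinguishes an event-driven gate from a calendar-driven one.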

Figure 3-1. Technical Reviews and Audits for the Major Capability Acquisition Life Cycle

Support of the Defense Acquisition System

Properly structured, technical reviews and audits support the Defense Acquisition System by:

Knowledge-Based Approach

Successful development of a complex system requires a knowledge-based approach. Increasing levels of knowledge are a natural consequence of design maturation; however, successful programs establish a deliberate acquisition approach whereby major investment decision points are supported by requisite levels of knowledge. The GAO study on Assessments of Selected Weapons Programs (GAO-12-400SP) provides quantitative evidence to affirm this best practice.

Timing and Tailoring of Technical Reviews and Audits

Technical reviews should occur when the requisite knowledge is expected and required. This section provides guidance on entrance and exit criteria for the level of maturity expected at each technical review and audit. OSD established the expected reviews and audits for each phase of systems development in the outline for the SEP. These policy and guidance documents provide a starting point for the PM, Systems Engineer, and Lead Software Engineer to develop the program’s unique set of technical reviews and audits. Tailoring is expected to best suit the program objectives (see SE Guidebook, Section 1.4, Systems Engineering Policy and Guidance). The SEP captures the output of this tailoring and is reviewed and approved to solidify the program plan.

Programs that tailor the timing and scope of these technical reviews and audits to satisfy program objectives increase the probability of successfully delivering required capability to the warfighter. Technical reviews provide the forum to frame issues and assumptions. They define options necessary to balance risk in support of continued development.

Technical Baselines and Technical Reviews

The technical baseline (including the functional, allocated, and product baselines) established at the conclusion of certain technical reviews informs all other program activity. Accurate baselines and disciplined reviews integrate and synchronize the system as it matures, facilitating more effective milestone decisions and ultimately providing better warfighting capability for less money. The technical baseline provides an accurate and controlled basis for:

System Developers and Technical Reviews

The PM and the Systems Engineer should keep in mind that technical reviews and audits provide visibility into the quality and completeness of the developer’s work products. These requirements should be captured in the contract specifications or Statement of Work (SOW). The program office should consider providing the SEP with the Request for Proposal (RFP) and requiring the contractor to deliver a Systems Engineering Management Plan (SEMP) that is consistent with the SEP. As a best practice, the SEMP should include entrance criteria and associated design data requirements for each technical review and audit. The configuration and technical data management plans should clearly define the audit requirements.

Contract incentives are frequently tied to completion of technical reviews. Some stakeholders may have a strong incentive to call the review complete as soon as possible. The review chairperson and Systems Engineer should exercise best judgment in an objective, informed manner to ensure the reviews are not prematurely declared complete.
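One safeguard against premature completion is to treat “complete” as an objective, conjunctive condition: every success criterion achieved and every action item closed. A minimal sketch, with hypothetical criterion and action-item structures:

```python
def review_complete(criteria_achieved, action_items):
    """A review is complete only when every success criterion is achieved
    AND every action item is closed (hypothetical data shapes)."""
    return all(criteria_achieved) and all(
        item["status"] == "closed" for item in action_items
    )

# Example: all criteria met, but one action item still open.
items = [
    {"id": "AI-01", "status": "closed"},
    {"id": "AI-02", "status": "open"},
]
print(review_complete([True, True], items))  # False: AI-02 still open
```

Because the condition is a plain conjunction, no single stakeholder’s incentive can argue a review complete while an item remains open; the data either supports completion or it does not.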

Complex Systems and Incremental Reviews and Audits

For complex systems, reviews and audits may be conducted for one or more system elements, depending on the interdependencies involved. These incremental system element-level reviews lead to an overall system-level review or audit. After all incremental reviews are complete, an overall summary review is conducted to provide an integrated system analysis and capability assessment that could not be conducted by a single incremental review. Each incremental review should complete a functional or physical area of design. This completed area of design may need to be reopened if other system elements drive additional changes in this area. If the schedule is being preserved through parallel design and build decisions, any system deficiency that leads to reopening design may result in rework and possible material scrap.
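The roll-up described above, where a system-level summary review proceeds only after all incremental element-level reviews are complete, is a simple aggregation. A sketch with hypothetical system elements:

```python
# Hypothetical element-level review status for a complex system.
element_reviews = {
    "propulsion": True,
    "avionics": True,
    "airframe": False,   # incremental review still open
}

def system_review_can_proceed(elements):
    # The overall summary review requires every incremental
    # element-level review to be complete.
    return all(elements.values())

print(system_review_can_proceed(element_reviews))  # False: airframe open
```

The summary review then adds what no single element review can: an integrated system analysis and capability assessment across the completed elements.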

Program Protection Planning

To design for system security, the program protection planning and execution activities should be integrated into the systems engineering technical reviews and audits. See Technology and Program Protection (T&PP) Guidebook (forthcoming) Section 5 for system security engineering (SSE) criteria for each technical review and audit.

Roles and Responsibilities

For each technical review, a technical review chair is identified and is responsible for evaluating products and determining that the criteria are met and action items are closed. The Service chooses the technical review chair, who could be the PM, Systems Engineer, or other subject matter expert selected according to the Service’s guidance. This guidance may identify roles and responsibilities associated with technical reviews and audits. It also may specify the types of design artifacts required for various technical reviews. In the absence of additional guidance, each program should develop and document its tailored design review plan in the SEP.

The following notional duties and responsibilities associated with the PM, Systems Engineer, and Lead Software Engineer should be considered in the absence of specific Service or lower level (e.g., System Command or PEO) guidance:

Typical PM Responsibilities for Technical Reviews and Audits

The PM is typically responsible for:

Typical Systems Engineer Responsibilities for Technical Reviews and Audits

The Systems Engineer is typically responsible for:

Figure 3-2: Technical Review Process

Key Stakeholders for Technical Reviews and Audits

The PM, Systems Engineer, and Lead Software Engineer should identify key stakeholders who have an interest or role in the review, which may include:

Technical Review and Audit Criteria and Definitions

Specific review criteria are provided in each technical review and audit section below. These criteria should be achieved, and all action items closed, before a technical review is considered complete. The Systems Engineer may refer to IEEE 15288.2, "Standard for Technical Reviews and Audits on Defense Programs," as a resource. Instructions for how DoD military and civilian employees can access IEEE 15288.2 via the Acquisition Streamlining and Standardization Information System (ASSIST) are located on the DDR&E(AC)/Engineering website. If a PMO chooses to use IEEE 15288.2, additional guidance for implementing this DoD-adopted systems engineering standard on acquisition program contracts can be found in the guidance document Best Practices for Using Systems Engineering Standards (ISO/IEC/IEEE 15288, IEEE 15288.1, and IEEE 15288.2) on Contracts for Department of Defense Acquisition Programs. When comparing this section on technical reviews and audits to IEEE 15288.2, keep in mind:

Alternative Systems Review (ASR)

The Alternative Systems Review (ASR), typically held for the MCA pathway, is conducted to support a dialogue between the end user and acquisition community. It leads to a draft performance specification for the preferred materiel solution. The ASR typically occurs during the MCA Materiel Solution Analysis (MSA) phase, after completion of the Analysis of Alternatives (AoA) and before Milestone A. It focuses technical efforts on requirements analysis. See SE Guidebook, Section 3.1.

System Requirements Review (SRR)

The System Requirements Review (SRR) is a multi-disciplined technical review to ensure that the developer understands the system requirements and is ready to proceed with the initial system design. This review assesses whether the system requirements are captured in the system performance specification (sometimes referred to as the System Requirements Document (SRD)). See SE Guidebook, Section 3.2.

System Functional Review (SFR)

The System Functional Review (SFR) is held to evaluate whether the functional baseline satisfies the end-user requirements and capability needs and whether functional requirements and verification methods support achievement of performance requirements. At completion of the SFR, the functional baseline is normally taken under configuration control by the government. See SE Guidebook, Section 3.3.

Preliminary Design Review (PDR)

The Preliminary Design Review (PDR) should provide sufficient confidence to proceed with detailed design. The PDR ensures that the preliminary design and basic system architecture are complete, that there is technical confidence the capability need can be satisfied within cost and schedule goals, and that risks have been identified and mitigation plans established. It also provides the acquisition community, end user, and other stakeholders with an opportunity to understand the trade studies conducted during the preliminary design, and thus confirm that design decisions are consistent with the user’s performance and schedule needs and the validated JCIDS Capability Development Document (CDD), or CDD-like document for other pathways. The PDR establishes the allocated baseline. See SE Guidebook, Section 3.4.

Critical Design Review (CDR)

The Critical Design Review (CDR) confirms that the system design is stable, that it is expected to meet system performance requirements, and that the system is on track to achieve affordability and should-cost goals, as evidenced by the detailed design documentation. The CDR establishes the initial product baseline. See SE Guidebook, Section 3.5.

System Verification Review (SVR) / Functional Configuration Audit (FCA)

The System Verification Review (SVR) is the technical assessment point at which the actual system performance is verified to meet the requirements in the system performance specification and is documented in the functional baseline. The Functional Configuration Audit (FCA) is the technical audit during which the actual performance of a system element is verified and documented to meet the requirements in the system element performance specification in the allocated baseline. Further information on FCA can be found in MIL-HDBK-61, Configuration Management Guidance. SVR and FCA are sometimes used synonymously when the FCA is at the system level. See SE Guidebook, Section 3.6.

Production Readiness Review (PRR)

The Production Readiness Review (PRR) for the system determines whether the system design is ready for production, and whether the developer has accomplished adequate production planning for entering Low-Rate Initial Production (LRIP) and Full-Rate Production (FRP) for the MCA pathway. Production readiness increases over time, with incremental assessments accomplished at various points in the life cycle of a program. See SE Guidebook, Section 3.7.

Physical Configuration Audit (PCA)

The Physical Configuration Audit (PCA) is a formal examination of the "as-built" configuration of the system or a configuration item against its technical documentation to establish or verify its product baseline. The objective of the PCA is to resolve any discrepancies between the production-representative item that has successfully passed Operational Test and Evaluation (OT&E) and the associated documentation currently under configuration control. A successful PCA provides the Milestone Decision Authority (MDA) with evidence that the product design is stable, the capability meets end-user needs, and production risks are acceptably low. At the conclusion of the PCA, the final product baseline is established and all subsequent changes are processed by formal engineering change action. Further information can be found in MIL-HDBK-61, Configuration Management Guidance. See SE Guidebook, Section 3.8.

Test Readiness Review (TRR)

The TRR is used to assess a contractor’s readiness for testing configuration items, including hardware and software. It typically involves a review of earlier or lower-level test products and test results from completed tests, and a look forward to verify that the test resources, test cases, test scenarios, test scripts, environment, and test data have been prepared for the next test activity. For programs using the MCA pathway, TRRs typically occur in the Engineering and Manufacturing Development (EMD) and Production and Deployment (P&D) phases. A TRR provides the formal approval authority with evidence that the system is ready to enter test and that the funding and resources are in place to execute the test and gather the required information. TRRs assess test objectives, test methods and procedures, test scope, safety, and whether test resources have been properly identified and coordinated. TRRs also determine whether any changes are required in planning, resources, training, equipment, or timing to successfully proceed with the test. If any of these items are not ready, senior leadership may decide to proceed with the test and accept the risk, or mitigate the risk in some manner.
