Why verification planning is painful: a structural explanation
The pain in verification planning is not the planning. It is everything that has to happen before the planning can start.
The situation every program ends up in
Picture an aerospace program six months before its system critical design review (CDR). The test team starts verification planning in earnest. They open the requirements database and begin working through the shall statements.
Requirement by requirement, the same questions surface. Has this requirement changed since the last baseline? Which version of the design does this requirement describe? Is the verification method still appropriate given how the design has evolved? Does the test facility actually have the capability to run this test, and has anyone confirmed that with the people running the facility?
For most requirements, nobody can answer these questions without asking someone. The test engineer asks the systems engineer. The systems engineer asks the subsystem lead. The subsystem lead checks a document that may or may not be current. An answer comes back two days later.
This is not an unusual program. This is the standard experience.
Why it is always this late
Verification planning is treated as a downstream activity in most programs. Requirements are written. Design proceeds. Somewhere around preliminary design review, the test team is brought in to figure out how to verify what has been designed.
The structural problem with this sequencing is that by the time verification planning starts in earnest, the requirements and the design have both accumulated months of evolution that the verification approach has not tracked.
Requirements written eighteen months ago may have been revised three times. Some revisions changed the verification method. Some made the original shall statement untestable. Some requirements were deferred to a later phase and are no longer in scope. None of this is obvious from looking at the requirements database, because the verification status was not updated to reflect each change.
The design has evolved independently. Analysis results shifted performance margins. Interface definitions were revised between subsystems. The configuration the test team is asked to verify is not the configuration most of the requirements were written against.
Before a single test procedure can be written, the test team has to reconstruct the current state of the program from scattered sources. This is the archaeology phase, and it is where most of the time in verification planning actually goes.
What it actually costs
The obvious cost is schedule. Verification planning that should take six weeks takes twelve because the preparation work was not accounted for.
The less obvious cost is compounding rework. Test procedures written before the archaeology is complete get invalidated when the archaeology surfaces a requirement revision or a configuration change. The procedure has to be revised. If the test has already been run, it may have to be repeated.
Untestable requirements are another cost. Requirements written without verification in mind sometimes cannot be verified with available methods or facilities. In a well-structured program, this would be caught when the requirement is written and the requirement would be revised. In most programs, it is caught during verification planning, close to CDR, when revising a requirement is expensive and disruptive.
The worst cost is false confidence. A verification matrix showing green may include tests run against superseded requirements, procedures that interpreted an ambiguous requirement one way when the design team interpreted it another, or verification methods that were appropriate for an earlier configuration and were never updated when the design changed.
What good looks like
Verification planning is not painful when the connection between requirements, design, and test approach is maintained throughout the program rather than assembled at the end.
This means the verification method is decided at the time the requirement is written. A systems engineer writing a performance requirement should be answering the question "how will this be verified?" at that same moment, not leaving it for the test team to figure out months later.
It also means requirements changes trigger a review of the associated verification records. When a requirement is revised, the test procedures written to that requirement are flagged. Not automatically rewritten, but flagged for review by someone who can assess whether the change affects the test approach. This is a small amount of work done consistently rather than a large amount of work done in a panic before CDR.
When the design model is connected to the requirements baseline, the question "does the current design still satisfy this requirement?" can be answered from the system rather than by asking four people and checking three documents.
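The mechanism described above can be sketched in a few lines. This is an illustrative sketch, not the schema of any particular requirements tool: the record names (`Requirement`, `TestProcedure`) and fields are assumptions made for the example. The points it encodes are the ones from this section: a verification method is mandatory when the requirement is created, and revising a requirement flags its linked procedures rather than silently leaving them current.

```python
# Illustrative sketch of revision-triggered verification flagging.
# Record names and fields are assumptions, not any tool's real schema.
from dataclasses import dataclass, field

@dataclass
class TestProcedure:
    proc_id: str
    written_against_rev: int  # requirement revision the procedure was written to
    needs_review: bool = False

@dataclass
class Requirement:
    req_id: str
    text: str
    verification_method: str  # mandatory at creation: Test, Analysis, Inspection, Demonstration
    revision: int = 1
    procedures: list = field(default_factory=list)

    def revise(self, new_text: str) -> None:
        """Revising the requirement flags every linked procedure for review."""
        self.text = new_text
        self.revision += 1
        for proc in self.procedures:
            # Flagged, not rewritten: a person assesses whether the
            # change actually affects the test approach.
            proc.needs_review = True

def stale_procedures(req: Requirement) -> list:
    """Procedures written against a superseded revision of the requirement."""
    return [p for p in req.procedures if p.written_against_rev < req.revision]
```

The design choice worth noticing is that `revise` only sets a flag. The small, consistent work is the human review the flag triggers; automating the rewrite would just move the ambiguity somewhere less visible.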
A useful diagnostic
Pick twenty requirements at random from your current program.
For each one, try to answer: what is the current verification method, which version of the design does this requirement describe, and when was the verification approach last reviewed relative to the last requirement revision?
If fewer than half can be answered in under five minutes without asking anyone, your verification planning will be expensive. The question is whether you pay that cost now, during planning, or later, in rework.
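The spot check can even be run as a script if your requirements export carries the relevant fields. A minimal sketch, assuming three illustrative field names (`verification_method`, `design_baseline`, and the two dates); none of these come from a specific tool:

```python
# Sketch of the random spot check, assuming each requirement is a dict
# with illustrative field names; adapt to your own database export.
import random
from datetime import date

def answerable(req: dict) -> bool:
    """True if all three diagnostic questions can be answered from the record alone."""
    has_method = bool(req.get("verification_method"))
    has_baseline = bool(req.get("design_baseline"))
    reviewed = req.get("verification_reviewed_on")
    revised = req.get("last_revised_on")
    # The verification approach must have been reviewed at or after
    # the last requirement revision to count as current.
    review_current = reviewed is not None and revised is not None and reviewed >= revised
    return has_method and has_baseline and review_current

def diagnostic(requirements: list, sample_size: int = 20, seed: int = 0) -> float:
    """Fraction of a random sample of requirements answerable from the records."""
    rng = random.Random(seed)
    sample = rng.sample(requirements, min(sample_size, len(requirements)))
    return sum(answerable(r) for r in sample) / len(sample)
```

A score below 0.5 is the script's version of "your verification planning will be expensive". The five-minutes-without-asking-anyone test still applies to whatever the script cannot see.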