An Effective Framework for Verification, Validation, and Accreditation

I recently encountered a formal methodology for conducting VV&A efforts that I think is worthy of your consideration. Briefly, Verification shows whether a system works as specified, Validation shows whether the specification addresses the correct problem, and Accreditation shows whether the system is accepted for its intended use.

The method is based on a specification (MIL-STD-3022) created for the Navy by a senior analyst charged with overseeing such efforts. I was part of a large team that completed an 18-month project to achieve full accreditation for a planning tool the Navy was adopting to manage its entire fleet of F-18-series aircraft. I expect the tool will be adapted to manage other types of aircraft as well, and perhaps other types of equipment.

The details of the actual specification are less important here; it took a team of people a long time to tease apart the details and hammer the process and documents into shape to the satisfaction of the project’s advisers. Instead, I’m providing a brief outline of how the process works, particularly the V&V part. The accreditation part is merely a wrapper around the V&V process: at the beginning it states what the review process is going to be, and at the end it judges whether the V&V was carried out correctly and whether accreditation is supported. I’m convinced that the framework is effective, but I’m not convinced that there is any one way to go about it or that every jot and tittle of this formal specification needs to be followed. Plenty of software systems are successfully verified, validated, deployed, and used every day, and they clearly don’t all use this method. Take the best features of every framework you find, including this one.

The basic framework is this:

  1. Write an Accreditation Plan to describe the steps that will be taken to support an accreditation of the model.
  2. Write a V&V Plan to describe the steps that will be taken to conduct the Verification and Validation of the model.
  3. Perform the Verification and Validation steps.
  4. Write a V&V Report describing the results of the V&V process. This includes a V&V Recommendation and a description of Lessons Learned.
  5. Write an Accreditation Report describing the results of the Accreditation analysis (which itself is a review of the V&V Report). This includes an Accreditation Recommendation and a description of Lessons Learned.
  6. Make an Accreditation Decision that accepts or fails to accept the system for the intended use, or accepts the system with limitations.

This process assumes that a host of artifacts have been produced and can be reviewed including:

  • Intended Use Statement
  • Conceptual Model (in the defined framework, a description of the system to be simulated; this generalizes to a description of any as-is system or process that a new implementation will address)
  • Statement of Requirements and Acceptability Criteria
  • Statement of Assumptions, Capabilities, Limitations, and Risks and Impacts
  • Input and Output Data Artifacts
  • Design Document
  • System Implementation
  • Configuration Management History
  • Test History and Results
  • Customer/SME Assessment
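Before the V&V work starts it is worth checking, mechanically, that every expected artifact actually exists. A minimal completeness check might look like the following; the artifact names mirror the bullets above, but nothing here comes from the MIL-STD-3022 templates themselves.

```python
# Checklist of artifacts the V&V team expects to review, taken from
# the bullet list above (names are this post's shorthand, not the
# standard's formal document titles).
REQUIRED_ARTIFACTS = {
    "Intended Use Statement",
    "Conceptual Model",
    "Requirements and Acceptability Criteria",
    "Assumptions, Capabilities, Limitations, Risks and Impacts",
    "Input and Output Data Artifacts",
    "Design Document",
    "System Implementation",
    "Configuration Management History",
    "Test History and Results",
    "Customer/SME Assessment",
}

def missing_artifacts(available: set[str]) -> set[str]:
    """Return the artifacts the review cannot yet proceed without."""
    return REQUIRED_ARTIFACTS - available

gap = missing_artifacts({"Intended Use Statement", "Design Document"})
print(len(gap))  # 8 artifacts still outstanding
```

A set difference is trivial, but tracking artifact status this way makes the gap visible early instead of surfacing it during the V&V Report.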

The following evaluations are carried out as part of the V&V process. These steps are meant to assess an implementation’s credibility in terms of capability, accuracy, and usability. Verify that:

  • Requirements map to Specific Intended Use Statement (Requirements Traceability Matrix – RTM)
  • Requirements map to Conceptual Model and vice versa (Requirements to Conceptual Model Map – RCMM)
  • Implementation items map to Requirements
  • Implementation items make logical sense
  • Sources of data are authoritative
  • Data are correct (input and output)
  • Data are correctly formatted (input and output)
  • Data can be traced through processing
  • User Interface items support all required behaviors
  • Outputs conform to specification
  • Outputs are accepted as authoritative
  • User operations and transformations are accepted as logical and appropriate
  • Full configuration management history is available
  • Functional test results are available and show all problems corrected
  • Quantitative test results are available and show all problems corrected
  • System operates correctly in target environment
  • All documentation items are complete and accepted
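Several of the checks above are mapping checks, and the first two name their own artifacts: the Requirements Traceability Matrix (RTM) and the Requirements to Conceptual Model Map (RCMM). Here is a toy sketch of how an RTM-style audit can flag orphans in both directions; all requirement and implementation-item IDs are invented for illustration.

```python
# Toy traceability data. Each requirement maps to the intended-use
# elements it supports; each implementation item maps to the
# requirements it satisfies. IDs are hypothetical.
rtm = {
    "REQ-01": ["use-1"],
    "REQ-02": ["use-1", "use-2"],
    "REQ-03": [],            # orphan: traces to no intended use
}
impl_to_req = {
    "scheduler module": ["REQ-01"],
    "report module": [],     # implementation item with no requirement
}

# Requirements that do not map to the Intended Use Statement.
orphan_reqs = [r for r, uses in rtm.items() if not uses]
# Implementation items that do not map back to any requirement.
unmapped_items = [i for i, reqs in impl_to_req.items() if not reqs]

print(orphan_reqs)      # ['REQ-03']
print(unmapped_items)   # ['report module']
```

The other checks in the list (data correctness, format conformance, test-result review, SME acceptance) are judgment calls that can't be reduced to a set difference, but keeping the mapping checks automated frees the reviewers to spend their time on those.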

The VV&A process can be applied to an existing system but it’s usually better if the process is in place from the beginning of an implementation so the development team(s) can work with the VV&A team(s) with the review framework in mind. It’s an effective form of project governance and quality assurance.

This table is from the Naval specification we worked from.

This paper provides excellent background and insight into how and why this methodology was developed. As of this writing, the full PDF could be accessed through the Google cache, though doing so may be problematic.
