Evaluation Framework for Assessing Validation Methods on Modeling and Simulation Models
Abstract
Modeling and simulation (M&S) is a critical step throughout the systems engineering process for developing and fielding a combat system. Verification and, more specifically, validation are essential to determining whether a simulation is credible and reliable. Although policy and guidance increasingly emphasize rigorous validation founded on strong statistical analysis, implementation remains challenging. As a result, test organizations and statisticians have sought a robust approach for measuring the performance of the validation methods used to assess model accuracy. The Johns Hopkins University Applied Physics Laboratory (APL) developed a flexible and extensible framework to evaluate the performance of validation methods. The framework provides the modularity to evaluate multiple validation methods and is sufficiently generic to support assessment of multiple simulation models. This article details the framework design and the analysis of multiple statistical validation methods, including an exemplar assessment of the methods applied to a recently accredited missile system simulation.