GAO says two services lack an overall strategy for simulator training.

Patience Wait, Contributor

August 26, 2013

4 Min Read

Computerized simulators and simulation exercises have long been an essential part of military training. In a new report, however, the Government Accountability Office suggests the Army and Marine Corps would benefit from developing better performance metrics to determine the proper mix of live training and simulation exercises. And one place to start, the GAO suggested, is to look at the work done by the Navy.

The report, "Army and Marine Corps Training: Better Performance and Cost Data Needed to More Fully Assess Simulation-Based Efforts," found that "[n]either service has taken steps to identify performance metrics and the type of performance data that would be needed to evaluate how the use of simulation-based training devices contributes to training effectiveness."

Simulators have been used for several decades in aviation training, both commercial and military. The technology wasn't applied to ground training until the 1980s, the GAO said, when the Army began simulating weapons fire. Since then, simulators and simulations have expanded into many other types of training exercises, and the technology has evolved into what GAO described as three categories: virtual training, in which real personnel operate simulated systems or environments; constructive training, which relies on computer-based simulations; and gaming.

Because the two services share some overlapping mission areas, the Marine Corps and Army have collaborated on developing some training simulators, a collaboration that GAO said has produced efficiencies.


For instance, the Army uses the Homestation Instrumentation Training System (HITS) to support collective maneuver training. When the Marine Corps began to develop something similar, it evaluated HITS and found it could reuse 87% of the Army system's components. Marine Corps officials told GAO that this reuse cut the Marine Corps system's development time to two years from the nine years originally projected and saved approximately $11 million. The two services' training material developers, co-located in Orlando, Fla., have memoranda of understanding "intended to promote coordination and encourage maximum reusability of existing devices," GAO stated.

Another thing the two services have in common is the lack of an overall strategy for simulator training. GAO cited the Navy's establishment of its "Overarching Fleet Training Simulator Strategy," which set specific guidance for simulator use, as a more forward-looking approach. The Navy strategy also lists a dozen investment priorities, including investing in simulators and simulations that have the greatest potential to generate cost savings, and it assigns responsibility for developing a methodology for tracking return on simulator investments, according to the GAO.

As simulation has become a more pervasive training tool, performance metrics would help the two services evaluate which kinds of training can be delivered effectively by simulation rather than live exercises, and thus make the most of tight budgets in the current fiscal environment. Developing and using such metrics is the first of GAO's two recommendations.

Relatedly, both services assume that simulation is generally less expensive than live training, but that assumption cannot be verified: according to GAO, neither service has established a methodology to identify and compare the costs of live and simulation-based training. The watchdog agency's second recommendation is that the Army and Marine Corps develop the tools to evaluate the two training methods side by side.

The Defense Department partially concurred with both recommendations. It agreed that performance metrics would help in determining the right mix of live and simulation training, but responded that the "magnitude and scope" of the training environment, the number of personnel involved, and "ever-changing technology" introduce a large number of independent variables.

As for better identifying the costs of live vs. simulation training, the Defense Department responded that it already captures all relevant costs needed for decision making, but it agreed that having more comprehensive information would help in determining the optimal mix between the two types of training.

About the Author(s)

Patience Wait

Contributor

Washington-based Patience Wait contributes articles about government IT to InformationWeek.
