Effective Measures of Training Display System Performance
Navy SBIR 2014.2 - Topic N142-104
NAVAIR - Ms. Donna Moore - navair.sbir@navy.mil
Opens: May 23, 2014 - Closes: June 25, 2014

N142-104 TITLE: Effective Measures of Training Display System Performance

TECHNOLOGY AREAS: Air Platform, Human Systems


OBJECTIVE: Develop an objective and efficient measurement toolkit for conducting validated acceptance tests for simulation training display systems.

DESCRIPTION: Over the past decade, the ability to measure essential display system attributes has advanced to the point where the simulation training industry has multiple suppliers offering automated display calibration systems capable of accurate geometry correction and channel-to-channel co-alignment. Several of these systems can also measure and correct edge blending, electro-optical response (gamma), color, and uniformity. Recent publications describe reasonable new metrics for mirror distortion, motion-induced blurring, uniformity, and Night Vision Goggle (NVG) stimulation capability. Despite these significant advances, the typical acceptance tests used to certify training display systems remain manual, incomplete, time-consuming, and often inconsistently applied across programs.

Innovative metrics and measurement procedures are needed to support the certification of simulation training devices in the field. Respondents should focus their initial efforts on the metrics likely to correlate most strongly with training task performance and to provide the greatest return on investment. Candidate metrics include: system resolution, sampling artifacts, update rate, motion-induced blur, contrast, luminance, geometry, mirror distortion, electro-optical response (gamma), color, NVG stimulation (infrared (IR) balance), and uniformity. Where practical, the metrics and procedures should correspond to those currently in use in the simulation training industry. Correlation with training effectiveness or task performance must be demonstrated for any metric that deviates from standard practice.
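As an illustration of the kind of automated metric computation such a toolkit might perform, the sketch below derives two common figures of merit from photometer luminance samples: luminance non-uniformity over a full-white grid and ANSI-style checkerboard contrast. The function names, the 3x3 sampling grid, and the sample values are illustrative assumptions only, not requirements of this topic.

```python
# Illustrative sketch (not part of the solicitation): two common display
# figures of merit computed from photometer samples in cd/m^2. The sampling
# layout and values below are assumed for illustration.

def luminance_uniformity(samples):
    """Non-uniformity as (Lmax - Lmin) / Lmax over a grid of luminance
    samples, e.g. a 3x3 grid of full-white measurements."""
    flat = [lum for row in samples for lum in row]
    lmax, lmin = max(flat), min(flat)
    return (lmax - lmin) / lmax

def checkerboard_contrast(white_samples, black_samples):
    """ANSI-style contrast: mean white luminance divided by mean black
    luminance, measured at the patch centers of a checkerboard pattern."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(white_samples) / mean(black_samples)

# Example: 3x3 full-white grid and four white / four black checkerboard patches.
white_grid = [[310.0, 322.0, 305.0],
              [318.0, 330.0, 315.0],
              [300.0, 312.0, 298.0]]
print(round(luminance_uniformity(white_grid), 3))   # -> 0.097
print(round(checkerboard_contrast([320.0, 315.0, 325.0, 318.0],
                                  [1.6, 1.5, 1.7, 1.6]), 1))  # -> 199.7
```

Either figure could be reported automatically from a scripted measurement sweep, replacing the manual spot checks the description above identifies as inconsistent across programs.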

The proposed system must be capable of rapid setup (e.g., less than two hours) at the real eyepoints within a display system, without significant reconfiguration of the device (e.g., removal of cockpit seats). The measurement system must be capable of handling the challenges associated with window bars and other obstructions that may interfere with measuring the entire field of view of the display system. To ensure applicability across a wide range of display system suppliers, the measurement system may not use sensors or components embedded within the display system. Where practical, the measurement system should support factory acceptance testing of major system components, such as display screens, mirrors, projectors, and image generators (anti-aliasing), prior to integration into the training device. Ideally, candidate test patterns will be defined in a way that allows their installation or implementation on any of the image generators commonly used for simulation training. The measurement system should require no more than a total of 8 hours of training to become proficient and should contain embedded help functions and documentation. An effort should be made to minimize the number, cost, and complexity of the components in the measurement toolkit, and a practical shipping container(s) for everything required for a site visit should be included.
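As a further illustration of lightweight, embeddable metric computation, the sketch below estimates a display channel's electro-optical response exponent (gamma, one of the candidate metrics listed above) from a handful of gray-step photometer readings. The fitting approach, function name, and sample values are assumptions chosen for illustration.

```python
# Illustrative sketch (not part of the solicitation): estimating the
# electro-optical response exponent (gamma) of a display channel from
# gray-step luminance measurements. Names and values are assumed.
import math

def estimate_gamma(drive_levels, luminances):
    """Least-squares fit of L = Lmax * (d / dmax)^gamma on log-log axes,
    constrained through the full-drive point (dmax, Lmax)."""
    dmax, lmax = drive_levels[-1], luminances[-1]
    xs = [math.log(d / dmax) for d in drive_levels[:-1]]
    ys = [math.log(lum / lmax) for lum in luminances[:-1]]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic readings from an ideal 2.2-gamma channel peaking at 300 cd/m^2.
levels = [64, 128, 192, 255]
readings = [300.0 * (d / 255) ** 2.2 for d in levels]
print(round(estimate_gamma(levels, readings), 2))  # -> 2.2
```

A fit like this, run per channel from a scripted gray-step sweep, is the kind of repeatable, operator-independent computation that could replace manual gamma checks during factory or on-site acceptance testing.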

PHASE I: Develop and prove the feasibility of a concept that includes detailed descriptions of the system components and draft procedures for the on-site measurement and computation of metrics. Identify at least five of the candidate metrics listed above and conceptually demonstrate the feasibility of the implemented system.

PHASE II: Design, construct, test, and refine a complete working prototype of the measurement system that is capable of measuring and computing at least five of the candidate metrics. Demonstrate the system's capabilities at two separate simulation training facilities, in both rotary- and fixed-wing training devices; one training simulator should have a real display and one should have a collimated display. Demonstrate the developed system's ability to measure and compute at least four of the candidate metrics. Conduct validation studies for any new metrics that are proposed as predictors of system performance.

PHASE III: Fully productize and transition the display measurement toolkit to the modeling and simulation community; develop and provide a one-day training course designed for the simulation certification personnel who will use the kit.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: This effort has the potential to yield several commercial products. The toolkit could eventually be manufactured, calibrated, and supported by various equipment vendors, particularly suppliers of measurement instruments, and should be flexible enough to apply across a wide variety of display systems.

REFERENCES:
1. Desjardins, D. D., & Meyer, F. M., 2012, Military Display Performance Parameters. Paper presented at SPIE Defense, Security, and Sensing.

2. Ewart, R. B., & Harshbarger, J. H., 1975, Measurement of Flight Simulator Visual System Performance. Paper presented at the SPIE Simulators and Simulation.

3. Keller, P. A., 1997, Electronic Display Measurement: Concepts, Techniques, and Instrumentation, New York: Wiley.

4. Lloyd, C. J., 2012, Effects of Spatial Resolution and Antialiasing on Stereoacuity and Comfort. Proceedings of the AIAA Modeling and Simulation Technologies Conference, Minneapolis, MN.

5. Lloyd, C. J., Williams L., Pierce B., 2011, A Model of the Relative Effects of Key Task and Display Design Parameters on Training Task Performance, Proceedings of the IMAGE Society Annual Conference, Scottsdale, AZ.

6. Lloyd, C. J., Nigus, S., Ford, B. K., Linn T., 2010, Proposed Method of Measuring Display Systems for Training with Stimulated Night Vision Goggles. Proceedings of the IMAGE Society Annual Conference, Scottsdale, AZ.

7. Long, J.L., Lloyd, C. J., Beane, D.A., 2010, Practical Geometry Alignment Challenges in Flight Simulation Display Systems, Proceedings of the IMAGE Society Annual Conference, Scottsdale, AZ.

8. Lloyd, C. J., 2002, Quantifying Edge-Blended Display Quality: Correlation with Observer Judgments, Proceedings of the IMAGE Society Annual Conference, Scottsdale, AZ.

9. SID, 2012, Information Display Measurements Standard (IDMS), Version 1.03, Society for Information Display.

KEYWORDS: Validation; Effective Measures; Training Display; Automated Testing; Display System Calibration; Measurement System

DoD Notice:  
Between April 23 and May 22 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting May 23, 2014, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (14.2 Q&A) during the solicitation period for questions and answers, and other significant information, relevant to the SBIR 14.2 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or via the webmail link.