Data Science Driven Aircrew Performance Measurement and Proficiency System
Navy SBIR 2018.1 - Topic N181-026
NAVAIR - Ms. Donna Attick
Opens: January 8, 2018 - Closes: February 7, 2018 (8:00 PM ET)


TITLE: Data Science Driven Aircrew Performance Measurement and Proficiency System


TECHNOLOGY AREA(S): Air Platform, Human Systems, Weapons

ACQUISITION PROGRAM: PMA 205 Naval Aviation Training Systems

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a software technology to pre-process, fuse, and store data from multiple sources for human performance assessment and proficiency tracking during training, with the capability to parse and synchronize disparate data from live, virtual, and constructive (LVC) aviation training system sources such as range instrumentation, aircraft systems, virtual simulators, and constructive applications to output automated performance metrics. Develop a human-machine interface that provides visualization tools that facilitate data synthesis by human-in-the-loop users.

DESCRIPTION: Navy leadership has issued guidance to move from reactive decisions to proactive or predictive solutions that leverage data-driven analytics to aid decision-making and proficiency tracking.  Agreement across the Department of Defense on the need for quantitative, data-driven decisions is an important first step; however, implementing systems capable of collecting, storing, fusing, analyzing, interpreting, and safeguarding that information is a difficult challenge.  Leveraging advances in data science for training performance assessment is a critical domain where technology provides a means to increase accuracy and reduce workload.  Instructors do not currently have enough time to perform a rigorous and detailed performance evaluation of each flight.  Research has demonstrated that high workload can degrade the accuracy and effectiveness of subjective performance ratings and the subsequent feedback provided to trainees [Ref 2], thereby reducing the quality and quantity of training data that feeds back to decision-makers within the Naval Aviation Enterprise (NAE).

The current state of the practice for performance assessment relies heavily on subjective ratings, a manually intensive and time-consuming process.  A software tool that provides an automated mechanism to pre-process and fuse multiple data sources for human performance assessment and proficiency tracking in warfighting capabilities would alleviate this burden.  Specifically, development of automated computational methods can assist with the timely and continuous calculation of aircrew performance and proficiency and the identification of associated trends.  Technical objectives include the design and development of a capability that provides:
1) Data interfaces for consumption and processing of a range of disparate data sources used in LVC training system sources such as range instrumentation, aircraft systems (e.g., aircraft flight logs, radar, weapons, communication, imagery), virtual simulators (e.g., High Level Architecture, mission computer, instructor stations, range systems, acoustic processors), and constructive applications (e.g., semi-automated forces, system emulators);
2) An architecture and process for linking available data sources to tactical aircrew performance for data synthesis to inform performance assessments (semi-automated) and/or calculate automated performance metrics;
3) Scalable functionality to support individual, team, and multi-team aircrew compositions and mission sets; and
4) An intuitive human-machine interface that provides visualization tools to facilitate data synthesis by human-in-the-loop users and display automated data outputs.
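As an illustration of objective (1), the sketch below normalizes two hypothetical LVC feeds, range-instrumentation records with ISO-8601 timestamps and simulator records with epoch-millisecond timestamps, onto a common timebase and merges them into one time-ordered event stream. All field names and record formats here are assumptions for illustration only; a real implementation would need a dedicated adapter for each fleet data source.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified record schema; real LVC feeds (range instrumentation,
# simulator HLA traffic, aircraft flight logs) would each need an adapter
# that maps native fields onto this common form.
@dataclass
class FusedRecord:
    t: float          # seconds since epoch (UTC), the common timebase
    source: str       # e.g., "range", "sim", "aircraft"
    entity: str       # platform or aircrew identifier
    data: dict        # source-specific payload with normalized keys

def parse_range(rec: dict) -> FusedRecord:
    # Assumed range-instrumentation format: ISO-8601 UTC timestamp string.
    t = datetime.fromisoformat(rec["time_utc"]).replace(tzinfo=timezone.utc).timestamp()
    return FusedRecord(t, "range", rec["track_id"], {"lat": rec["lat"], "lon": rec["lon"]})

def parse_sim(rec: dict) -> FusedRecord:
    # Assumed simulator format: epoch-millisecond timestamp.
    return FusedRecord(rec["t_ms"] / 1000.0, "sim", rec["callsign"], {"alt_ft": rec["alt_ft"]})

def fuse(*streams) -> list:
    """Merge already-parsed streams into one time-ordered event list."""
    merged = [r for s in streams for r in s]
    merged.sort(key=lambda r: r.t)
    return merged
```

Once all sources share a timebase and schema, downstream metric computation and visualization can operate on a single ordered stream regardless of which system produced each record.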

The final product will enable after-action performance reviews and debriefs of training events with full visibility into the details of effects-chain execution, in order to identify errors in mission execution at all levels while reducing the time required for after-action performance reviews relative to present-day training practice.

The solution should include hardware only if existing systems are inadequate for the task, as there is a desire to avoid the need for additional hardware at fleet training sites.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances, in order to perform on advanced phases of this project as set forth by DSS and NAVAIR in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Design an architecture and process for linking available data sources to tactical aircrew performance in warfighting capabilities based on fleet tactical recommendations (i.e., Tactics, Techniques, and Procedures (TTP)) and mission-essential task references (e.g., Wing Training Manuals, Training & Readiness Matrices) that is flexible enough to incorporate future tactics and scalable from individual to multi-team performance.  Demonstrate the feasibility of implementing a software-based solution to process, parse, and fuse disparate data sources and types (e.g., aircraft data, sensor data, simulator data, video files, range instrumentation data, and voice communication recordings) for a single platform.  Design advanced data science approaches (e.g., machine learning, artificial intelligence, voice recognition, image processing) that produce automated and human-in-the-loop data outputs for performance assessment, facilitate feedback, and support longitudinal trend analysis computations.  Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance.  Phase I should include development of prototype plans for Phase II.
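A minimal sketch of one automated performance metric of the kind a Phase I design might prototype: scoring altitude-keeping against a briefed parameter. The briefed altitude and tolerance below are hypothetical stand-ins for values that a TTP or Wing Training Manual reference would actually supply.

```python
def altitude_deviation_score(samples, briefed_alt_ft, tolerance_ft=200.0):
    """Fraction of flight samples held within tolerance of the briefed altitude.

    `samples` is a list of (time_s, altitude_ft) pairs drawn from fused
    aircraft or simulator data; `briefed_alt_ft` and `tolerance_ft` are
    illustrative placeholders for reference-derived standards.
    """
    if not samples:
        return 0.0  # no data yields the lowest score rather than an error
    within = sum(1 for _, alt in samples if abs(alt - briefed_alt_ft) <= tolerance_ft)
    return within / len(samples)
```

A fielded system would compute many such reference-linked metrics per event and roll them up by individual, crew, and mission set.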

PHASE II: Develop and demonstrate a prototype software solution, based on the designed architecture and process, that fuses multiple data sources and types, produces automated and human-in-the-loop data outputs for performance assessment, facilitates feedback, and supports longitudinal trend analysis computations.  Evaluate the efficiencies and return-on-investment gains associated with semi-automated and/or automated data processing.  Demonstrate software scalability to multiple missions and/or multiple platforms.  Develop and evaluate the usability of a human-machine interface that provides visualization tools to facilitate data synthesis by human-in-the-loop users and displays automated data outputs.  Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance.
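The longitudinal trend computations could be as simple as a moving average over recent sorties; the sketch below is a minimal stand-in, assuming per-sortie scores on a 0-1 scale.

```python
from collections import deque

def rolling_proficiency(scores, window=3):
    """Moving average of the most recent `window` sortie scores.

    A minimal stand-in for longitudinal trend analysis: the deque drops the
    oldest score automatically once `window` sorties have accumulated.
    """
    buf = deque(maxlen=window)
    trend = []
    for s in scores:
        buf.append(s)
        trend.append(sum(buf) / len(buf))
    return trend
```

More capable approaches (exponential decay weighting, change-point detection) could slot in behind the same interface.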

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Conduct the testing and integration necessary to support transition to a fleet training site.  Implement any outstanding Risk Management Framework guidelines to ensure information assurance compliance; complete the process to seek a standalone Authority To Operate (ATO) and/or support a transition training site to incorporate the developed training solution into an existing ATO, depending on the transition customer’s desire.  Continue development to expand the architecture to new data sources and future reference sources on aircrew performance, and/or software scalability to multiple missions and/or multiple platforms.  Improvements in technology to collect detailed performance data on operators are applicable to all military and commercial systems where operator reliability is critical to mission success.  Successful technology development would be applicable to most military systems, where it would be possible to take advantage of the large quantities of data produced in training events by efficiently processing that data into meaningful performance metrics. Similar applications would be useful in the commercial aviation, space, and maritime industries.


REFERENCES:

1. Ault, F. “Report of the Air-to-Air Missile System Capability Review.” July-November 1968.

2. Bretz, R. D., Milkovich, G. T. & Read, W. “The current state of performance appraisal research and practice: Concerns, directions, and implications.” Journal of Management, 1992, 18(2), 321-352.

3. Brobst, W. D., Thompson, K. L. & Brown, A. C. “Air Wing Training Study: Modeling Aircrew Training for Acquiring and Maintaining Tactical Proficiency.” A Synthesis of CBA’s Work, October 2006.

4. Fan, J., Han, F. & Liu, H. “Challenges of Big Data Analysis.” National Science Review, Volume 1, Issue 2, 1 June 2014, pp. 293-314.

5. Griffin, G.R. & Shull, R.N. “Predicting F/A-18 Fleet Replacement Squadron Performance Using an Automated Battery of Performance-Based Tests.” Naval Aerospace Medical Research Laboratory, Naval Air Station, Pensacola, Florida, July 1990.

6. Horowitz, Stanley A., Hammon, Colin P. & Palmer, Paul R. “Relating Flying-Hour Activity to the Performance of Aircrews.” Institute for Defense Analyses, Alexandria, Virginia, December 1987.

7. Kahneman, D. “Attention and Effort.” Englewood Cliffs, NJ: Prentice-Hall, 1973, p. 246.

8. Ellett, Jennifer M. and Khalfan, Shaun. “The Transition Begins: DoD Risk Management Framework: An Overview.” CHIPS: The Department of the Navy’s Information Technology Magazine, April-June 2014.

KEYWORDS: Proficiency; Performance Assessment; Aircrew; Human Factors; Training; Debrief


These Navy topics are part of the overall DoD 2018.1 SBIR BAA. The DoD issued its 2018.1 SBIR BAA pre-release on November 29, 2017; the BAA opens to receive proposals on January 8, 2018, and closes February 7, 2018 at 8:00 PM ET.

Between November 29, 2017 and January 7, 2018 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 8, 2018, when DoD begins accepting proposals for this BAA. However, until January 24, 2018, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the open BAA period for questions, answers, and other significant information relevant to their SBIR/STTR topics of interest.

Topics Search Engine: Visit the DoD Topic Search Tool to find topics by keyword across all DoD Components participating in this BAA.

Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or by email.