Repurposing Computational Analyses of Tactics for Training Assessments
Navy STTR 2018.A - Topic N18A-T003
NAVAIR - Ms. Donna Attick
Opens: January 8, 2018 - Closes: February 7, 2018 (8:00 PM ET)


TITLE: Repurposing Computational Analyses of Tactics for Training Assessments


TECHNOLOGY AREA(S): Air Platform, Human Systems, Information Systems

ACQUISITION PROGRAM: PMA-276 H-1 USMC Light/Attack Helicopters

OBJECTIVE: Design and develop a software technology that leverages data science and advanced computational analyses of tactical data sources to improve training scenarios and assessments and make training more adaptive, efficient, and effective.

DESCRIPTION: Emerging warfare capabilities offer commanders a great many new tactical options.  However, they also increase the demands on decision-makers during operations.  The dynamic and complex nature of integrated warfare creates training challenges in preparing for those engagements.  As the complexity of Tactics, Techniques, and Procedures (TTPs) increases, testing them, in part via computational simulation and optimization, becomes necessary.  Such analyses systematically vary tactical applications of the warfare capability across a variety of threat scenarios, simulate and score each encounter, and generate a ranked list of the most successful tactics per threat.  The scenarios, measures, and knowledge generated in this type of work are rich and voluminous, providing opportunities to leverage data science.
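The vary-simulate-score-rank loop described above can be sketched as a parameter sweep over tactic options.  This is a minimal illustration only: the tactic parameters, threat names, and the placeholder scoring function are assumptions for the sketch, not part of any actual TTP analysis tool.

```python
import itertools
import random
from collections import defaultdict

# Hypothetical tactic parameters and threats; names are illustrative only.
TACTIC_GRID = {
    "approach_heading_deg": [0, 90, 180, 270],
    "altitude_band": ["low", "medium", "high"],
    "release_range_nm": [2.0, 4.0, 6.0],
}
THREATS = ["threat_a", "threat_b"]

def simulate_encounter(tactic, threat, rng):
    """Placeholder for a constructive simulation run; a real model would
    score survivability, weapon effectiveness, timeline, etc."""
    return rng.random()

def rank_tactics(n_runs=20, seed=1):
    """Systematically vary tactic parameters against each threat, average
    scores over repeated simulation runs, and rank tactics per threat."""
    rng = random.Random(seed)
    results = defaultdict(list)
    keys = list(TACTIC_GRID)
    for values in itertools.product(*TACTIC_GRID.values()):
        tactic = dict(zip(keys, values))
        for threat in THREATS:
            score = sum(simulate_encounter(tactic, threat, rng)
                        for _ in range(n_runs)) / n_runs
            results[threat].append((score, tactic))
    # Ranked list of the most successful tactics per threat (best first)
    return {t: sorted(runs, key=lambda x: x[0], reverse=True)
            for t, runs in results.items()}
```

Even this toy sweep (36 tactic combinations x 2 threats x 20 runs) hints at why the outputs are "rich and voluminous": every scored encounter is a data point that could later seed a training scenario.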

A software technology solution able to re-use analytic data outputs (e.g., mission analysis, TTP analysis, modeling and simulation testing, aircraft system data logs) to populate training content is desired.  Specifically, two means of re-using data are sought: 1) the capability to generate scenario libraries, and 2) the ability to improve integrated assessments of human tactical skills to make training more efficient and effective.  The technical approach and underlying data science methods integrated into a software solution should demonstrate a means to output instructionally sound scenarios that require minimal human-in-the-loop interaction while reducing, through automation, the time to prepare training scenarios compared to hand-coding initial conditions for semi-automated forces.  Further, the software solution should recommend training objectives and automate development of performance measures that complement training scenario outputs, ensuring that scenarios train the desired skills and provide a means to assess learners.
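One way to picture the re-use pathway is mapping each analysis record to scenario initial conditions paired with a performance measure, so the generated scenario both trains the identified skill and provides a way to assess it.  This is a minimal sketch under assumed record shapes: the field names, tolerance band, and objective wording are all hypothetical.

```python
# Illustrative analysis-output records; field names are assumptions,
# not a program data standard.
ANALYSIS_OUTPUT = [
    {"threat": "threat_a",
     "tactic": {"altitude_band": "low", "release_range_nm": 4.0},
     "score": 0.82},
    {"threat": "threat_a",
     "tactic": {"altitude_band": "high", "release_range_nm": 2.0},
     "score": 0.31},
]

def to_scenario(record, scenario_id):
    """Map one analysis record to semi-automated-forces initial conditions,
    a recommended training objective, and a matching performance measure."""
    tactic = record["tactic"]
    return {
        "id": scenario_id,
        "initial_conditions": {"threat": record["threat"], **tactic},
        "training_objective": (
            f"Execute {tactic['altitude_band']}-altitude attack "
            f"against {record['threat']}"),
        "performance_measure": {
            "name": "release_range_error_nm",
            "target": tactic["release_range_nm"],
            "tolerance": 0.5,  # assumed grading band
        },
    }

def build_library(records):
    # Highest-scoring tactics first, so the library leads with validated TTPs.
    ranked = sorted(records, key=lambda r: r["score"], reverse=True)
    return [to_scenario(r, f"scn-{i:03d}") for i, r in enumerate(ranked, 1)]
```

The design point is that scenario generation and assessment generation share one source record, which is what lets auto-built scenarios arrive with measures already aligned to the skills they exercise.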

PHASE I: Design methods and determine the feasibility of a software technology that can repurpose the output of data analyses (e.g., mission analysis, TTP analysis, modeling and simulation testing, aircraft system data logs) to generate recommendations for tactical training scenarios and assessments in complex warfare capabilities.  Demonstrate the feasibility of data science approaches for use in a software technology solution.  Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance.  In preparation for human subjects experiments in Phase II, research protocols and Institutional Review Board (IRB) applications should be developed and submitted.  The Phase I effort will include prototype plans to be developed under Phase II.

PHASE II: Develop prototype software technology that leverages data science approaches to repurpose the output of data analyses to support tactical training scenario and assessment (i.e., performance measures) generation in complex warfare capabilities.  Conduct human factors analyses to ensure the usability of the prototype software.  Conduct human subjects experiments that validate the training effects and benefits of the auto-generated scenario and performance measurement outputs of the software technology for a single use case.  Risk Management Framework guidelines should be considered and adhered to during development to support information assurance compliance.

PHASE III DUAL USE APPLICATIONS: Expand the development of the software technology to additional use cases and aviation platforms.  Demonstrate the reliability and validity of system outputs for effective tactical training scenario and assessment (i.e., performance measures) generation in complex warfare capabilities.  Complete the process to seek a standalone Authority To Operate (ATO) and/or support a transition training site in incorporating the developed training solution into an existing ATO, depending on the transition customer's preference.  Conduct test and integration activities with target transition data analysis outputs and training system inputs.  Improvements in technology to repurpose data analysis outputs are applicable to all military and commercial systems where system-generated logs (e.g., commercial aviation) are collected.  Further, technology developed under this STTR topic would be applicable to most military systems where data output in one stage of the acquisition process (e.g., modeling and simulation testing) can be re-used to reduce resources and/or schedule in later stages.  In the training environment, this type of technology also provides an opportunity to increase the effectiveness and fidelity of training scenarios while increasing instructional capabilities through relevant performance assessment tools.


REFERENCES:

1. Kitchin, R. "Big Data, New Epistemologies and Paradigm Shifts." Big Data & Society, April-June 2014, pp. 1-12.

2. Fan, J., Han, F., and Liu, H. "Challenges of Big Data Analysis." National Science Review, Vol. 1, Issue 2, June 2014, pp. 293-314.

3. "Top 50 Big Data Platforms and Big Data Analytics Software." Data Science Platform.

4. Labrinidis, A. and Jagadish, H. V. "Challenges and Opportunities with Big Data." Proceedings of the VLDB Endowment, Vol. 5, Issue 12, August 2012, pp. 2032-2033.

5. Risk Management Framework (RMF) for DoD Information Technology (IT).

6. Risk Management Framework:

KEYWORDS: Data Science; Training; Performance Assessment; Human Factors; Data Analytics; Training Development


These Navy Topics are part of the overall DoD 2018.A STTR BAA. The DoD issued its 2018.A STTR BAA pre-release on November 29, 2017; the BAA opens to receive proposals on January 8, 2018, and closes February 7, 2018 at 8:00 PM ET.

Between November 29, 2017 and January 7, 2018 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 8, 2018, when DoD begins accepting proposals for this BAA.
However, until January 24, 2018, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the Open BAA period for questions and answers and other significant information relevant to their SBIR/STTR topics of interest.

Topics Search Engine: Visit the DoD Topic Search Tool to find topics by keyword across all DoD Components participating in this BAA.

Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.

Help: If you have general questions about the DoD SBIR/STTR program, please contact the DoD SBIR/STTR Help Desk at 800-348-0787 or via email.