TITLE: Pattern Recognition Algorithms for Detection of Latent Errors in Combat System Software
TECHNOLOGY AREA(S): Battlespace, Electronics, Sensors
ACQUISITION PROGRAM: Program Executive Office Integrated Warfare System (PEO IWS) 1.0 – AEGIS Combat System
The technology within this
topic is restricted under the International Traffic in Arms Regulation (ITAR),
22 CFR Parts 120-130, which controls the export and import of defense-related
material and services, including export of sensitive technical data, or the
Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls
dual use items. Offerors must disclose any proposed use of foreign nationals
(FNs), their country(ies) of origin, the type of visa or work permit possessed,
and the statement of work (SOW) tasks intended for accomplishment by the FN(s)
in accordance with section 5.4.c.(8) of the Announcement. Offerors are advised
that foreign nationals proposed to perform on this topic may be restricted due
to the technical data under US Export Control Laws.
OBJECTIVE: Develop pattern recognition algorithms that identify and
characterize latent errors in AEGIS operational software prior to deployment.
DESCRIPTION: The software for
the AEGIS system is a critical component of ship and strike group
self-defense. Therefore, software quality is of utmost importance. Any
software defects (bugs) present in the AEGIS software can have mission-critical
impacts on the defense of the Navy’s Fleet assets. In order to field the best software
to the Fleet, AEGIS software must undergo thorough testing. Throughout the
software development process, AEGIS undergoes hours of testing, producing
terabytes of data. The testing is accomplished with two AEGIS labs and a
modeling and simulation suite concurrently providing Combat System data. The
current process of debugging includes the following steps: find the bug through
testing; conduct additional testing to determine the bug’s priority (mission
criticality and impact) and probability (chance of occurrence); root-cause the
bug to specific code areas; and fix the bug. After this process is completed,
it is repeated to verify that the bug has been eliminated and to determine
whether any changes might create another bug. These processes are necessary
because a missed bug that is fielded may degrade a ship’s combat capability or
create unintended loss of life or dangerous operational situations. While
commercial industry can field less-than-perfect software and simply respond to
user complaints on functionality through rapidly deployable upgrades, military
software such as AEGIS must have much higher fidelity in functionality to
avoid unintended impacts on human life.
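The find–prioritize–fix cycle above turns on ranking bugs by priority and probability. A minimal sketch of such a triage ranking is below; the `Defect` fields and bug IDs are illustrative assumptions, not part of any AEGIS tooling.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    """One tracked bug; fields mirror the priority/probability triage above."""
    ident: str
    priority: int       # mission criticality and impact: 1 (highest) to 5 (lowest)
    probability: float  # estimated chance of occurrence, 0.0 to 1.0

def triage_order(defects):
    """Rank defects so the most critical, most likely bugs are fixed first."""
    # Lower priority number = more mission-critical; ties broken by probability.
    return sorted(defects, key=lambda d: (d.priority, -d.probability))

bugs = [
    Defect("B-101", priority=3, probability=0.40),
    Defect("B-102", priority=1, probability=0.05),
    Defect("B-103", priority=1, probability=0.60),
]
print([d.ident for d in triage_order(bugs)])  # ['B-103', 'B-102', 'B-101']
```

The sort key keeps the two dimensions separate rather than multiplying them into a single risk score, so a low-probability but mission-critical bug is never outranked by a frequent nuisance bug.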
Like commercial debugging processes, the AEGIS testing process produces
terabytes of data that can point to possible issues, but AEGIS data is not
currently analyzed unless there is a visible error onboard the ship. Manual
analysis of all of the AEGIS data generated through testing is not
cost-effective, which drives the need for machine-learning algorithms to learn
system behavior and identify out-of-pattern behaviors. Additionally, unlike
commercial software upgrades, AEGIS upgrades go through longer approval and
certification timelines before they can be released. Consequently, reducing
these timelines through automated data analysis can significantly reduce the
cost and improve the performance of the AEGIS Combat System (ACS).
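As a toy stand-in for the machine-learning analysis envisioned here, the sketch below learns a statistical profile of nominal behavior from recorded test intervals and flags out-of-pattern ones. A production solution would use far richer models; every feature name and threshold here is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Pretend each row summarizes one recorded test interval (hypothetical features):
# [message_rate, track_update_latency_ms, dropped_messages]
nominal = rng.normal(loc=[100.0, 20.0, 0.5], scale=[5.0, 2.0, 0.3], size=(500, 3))

# "Learn" system behavior as per-feature mean and spread over the nominal data.
mu, sigma = nominal.mean(axis=0), nominal.std(axis=0)

def out_of_pattern(sample, threshold=4.0):
    """Flag a test interval whose features deviate strongly from learned behavior."""
    z = np.abs((sample - mu) / sigma)
    return bool(np.any(z > threshold))

print(out_of_pattern(np.array([101.0, 21.0, 0.6])))   # False: looks nominal
print(out_of_pattern(np.array([100.0, 95.0, 12.0])))  # True: latency spike, drops
```

The point of the sketch is the workflow, not the model: behavior is characterized from recorded test data alone, so no analyst has to watch for the anomaly in real time.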
The Navy seeks a technology that provides the capability to automatically find
and characterize bugs. This automation capability will analyze recorded data
from AEGIS to find latent errors and provide data-driven priority and
probability prior to AEGIS certification and deployment. This technology will
enable high priority bug detection and repairs before fielding the system and
enable the fielding of a combat system at full capability. As a result,
development and maintenance costs of the AEGIS software will be reduced.
Latent error detection will ensure that the best quality software is fielded to
the Warfighter and works every time. Testing can be very expensive; therefore,
the better the Navy becomes at finding and fixing software bugs, the less
testing will be required. This will result in a more capable upgrade, faster
deployment, and cost savings.
The software solution will analyze 300 terabytes of AEGIS data throughout the
development lifecycle of a baseline. To do so, it will use big data and
machine learning algorithms and technology to characterize the patterns of
AEGIS system behavior and identify out-of-process behaviors that lead to
system failure. This large-scale analysis will encompass all testing for the
baseline and will help AEGIS find and fix high-priority bugs by (1) finding
bugs that have been overlooked by systems analysts and (2) providing better
data on the probabilities and impacts of bugs.
This will be measured by comparing the number of defects found by the
technology to the number found by the Government analyst team through
traditional methods. The goal is for the software solution to increase the
number of high-priority bugs found by a minimum of 10%. The software solution
must also be able to identify whether defects have been fixed over the
lifetime of AEGIS software development and are no longer issues in the most
recent builds.
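The evaluation described above can be expressed directly; the defect IDs and build names below are made up solely to show the arithmetic of the 10% goal and the fixed-defect check.

```python
# High-priority defects found by the candidate tool vs. the Government analyst
# team; all IDs are hypothetical.
tool_found = {"B-101", "B-103", "B-107", "B-110", "B-112", "B-115"}
analyst_found = {"B-101", "B-103", "B-107", "B-110", "B-112"}

extra = tool_found - analyst_found            # bugs only the tool caught
improvement = len(extra) / len(analyst_found)
print(f"{improvement:.0%} more high-priority defects found")  # 20% ...
print("10% goal met:", improvement >= 0.10)                   # True

# Lifetime tracking: a defect is "no longer an issue" if it is absent from
# the most recent build's detections.
detections_by_build = {
    "baseline_9.1": {"B-101", "B-103"},
    "baseline_9.2": {"B-103"},
}
latest = detections_by_build["baseline_9.2"]
print("B-101 fixed:", "B-101" not in latest)  # True
```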
The Phase II effort will likely require secure access, and NAVSEA will process
the DD254 to support the contractor for personnel and facility certification
for secure access. The Phase I effort will not require access to classified
information. If need be, data of the same level of complexity as secured data
will be provided to support Phase I work. Phase II and Phase III will include
work with the AEGIS Prime contractor, Lockheed Martin.
Work produced in Phase II may become classified. Note: The prospective contractor(s)
must be U.S. Owned and Operated with no Foreign Influence as defined by DOD
5220.22-M, National Industrial Security Program Operating Manual, unless
acceptable mitigating procedures can and have been implemented and approved by
the Defense Security Service (DSS). The selected contractor and/or
subcontractor must be able to acquire and maintain a secret-level facility
clearance and Personnel Security Clearances, as set forth by DSS and NAVSEA,
in order to perform on advanced phases of this contract and to gain access to
classified information pertaining to the national defense of the United States
and its allies; this will be an inherent requirement. The selected company
will be required to safeguard classified material IAW DoD 5220.22-M during the
advanced phases of this contract.
PHASE I: Develop a concept
for pattern recognition algorithms that automatically reveal bugs in software.
Determine feasibility through modeling and analysis of data sets, and show
that the concept meets the requirements described in the Description. The
concept will show that it can feasibly analyze the outputs of AEGIS software
data extraction and find latent errors that may contribute to mission
failure. The AEGIS Program Office
will provide sample unclassified data for the topic. The Phase I Option, if
awarded, will include the initial design specifications and capabilities
description to build a prototype in Phase II. Develop a Phase II plan.
PHASE II: Based upon the
results of Phase I and the Phase II Statement of Work (SOW), design, develop,
and deliver a prototype of pattern recognition algorithms that automatically
find and characterize bugs in software. The prototype will provide analysis
tools that work with AEGIS software and demonstrate that it effectively finds
latent software errors and characterizes the priority and probability of those
errors. The demonstration will take place at a Government- or company-provided
facility. The company will prepare a Phase III development plan to transition
the technology for Navy production and potential commercial use.
It is probable that the work under this effort will be classified under Phase
II (see Description section for details).
PHASE III DUAL USE
APPLICATIONS: Support PEO IWS 1.0 in transitioning the technology for Navy use
to allow for further development, refinement, and testing. The implementation
will be a fully functional software tool that provides continuous analysis of
test data to ensure AEGIS works the first time and every time. Integration
will be through the development cycle for AEGIS at Lockheed Martin and the
Government.
Any software development company, including Apple, Microsoft, and Google, has
software bugs. These companies look to improve their software development
process to drive down costs and create software in the cheapest and most
effective ways possible. Similar to AEGIS, these companies all track their
bugs, and have comprehensive test plans that work through system capability to
define bugs. Therefore, there is significant commercial application for the
technology detailed in this topic.
REFERENCES:
1. Jones, Capers. “Software Defect Origins and Removal Methods.” International Function Point Users Group.
2. Rahman, Aedah Abd and Hasim, Nurdatillah. “Defect Management Life Cycle Process for Software Quality Improvement.” 2015 3rd International Conference on Artificial Intelligence, Modelling and Simulation (AIMS), 2-4 December 2015. http://ieeexplore.ieee.org/document/7604582
3. Mchale, John. “The Aegis Combat System’s continuous modernization.” Military Embedded Systems, 26 August.
KEYWORDS: Software Quality;
AEGIS Software; Testing for Software Bugs; Automation of Software Review;
Software Defect; Big Data Processing; Machine Learning
** TOPIC NOTICE **
These Navy Topics are part of the overall DoD 2018.1 SBIR BAA. The DoD issued its 2018.1 BAA SBIR pre-release on November 29, 2017, which opens to receive proposals on January 8, 2018, and closes February 7, 2018 at 8:00 PM ET.
Between November 29, 2017 and January 7, 2018 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 8, 2018 when DoD begins accepting proposals for this BAA.
However, until January 24, 2018, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the Open BAA period for questions and answers and other significant information relevant to their SBIR/STTR topics of interest.
Topics Search Engine: Visit the DoD Topic Search Tool at sbir.defensebusiness.org/topics/ to find topics by keyword across all DoD Components participating in this BAA.
Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.
Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or via email at email@example.com