Automated Analysis of Combat Systems Software
Navy SBIR 2015.1 - Topic N151-051
NAVSEA - Mr. Dean Putnam - dean.r.putnam@navy.mil
Opens: January 15, 2015 - Closes: February 25, 2015 6:00am ET

N151-051 TITLE: Automated Analysis of Combat Systems Software

TECHNOLOGY AREAS: Sensors, Electronics, Battlespace

ACQUISITION PROGRAM: PEO IWS 1.0, Integrated Combat Systems, AEGIS

OBJECTIVE: Develop an automated analysis tool for combat systems software system change request databases.

DESCRIPTION: Modern naval combat systems are continually operated, upgraded, and tested at sea and ashore. At any given time, there are many requests to debug, change, or update the underlying Computer Programs (CP). These requests are called Change Requests (CR), or Computer Program Change Requests (CPCR), and are tracked in combat system databases. The CRs are reviewed and evaluated by a board of experts to determine the relative necessity and priority of implementing these changes. This process involves classifying the type of change, assigning a relative priority to the change, and determining the impact to other systems if the change is implemented. It requires experts from different engineering communities, possessing various knowledge backgrounds, to understand the complexity, priority, and need of each CR. After various reviews by the board, the resulting prioritized lists often reflect subjective rankings rather than an objective, engineered recommendation to implement certain CPCRs [Ref 1].
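For illustration only, the kind of record such a tool would ingest might be sketched in Python as follows. All field names and category labels are hypothetical, since this topic does not specify the actual combat system database schema.

    from dataclasses import dataclass, field
    from enum import Enum

    class ChangeType(Enum):
        """The three kinds of request named above (hypothetical labels)."""
        DEBUG = "debug"
        CHANGE = "change"
        UPDATE = "update"

    @dataclass
    class ChangeRequest:
        """One CR as a review board might see it; all fields are illustrative."""
        cr_id: str
        description: str
        change_type: ChangeType            # classification assigned during review
        priority: int | None = None        # relative priority assigned by the board
        impacted_systems: list[str] = field(default_factory=list)  # cross-system impact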

The Navy needs a way to objectively prioritize CRs for implementation into the combat system. This topic seeks an automated tool to cull through hundreds of candidate engineering changes and recommend which should be selected for investment and incorporation into revisions of the Aegis system. The current method is subjective and manpower-intensive; it is fraught with uncertainty, prone to improper classification, and liable to overlook high-priority needs. The Navy has been unsuccessful in finding a solution to review, classify, and prioritize the CRs in a combat system database. A tool is needed that can replace the engineering experts and determine the relative importance and priority of all the CRs found within a combat system’s database.
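As a minimal sketch of what objective prioritization could mean, one candidate mechanism (not prescribed by this topic) is a weighted multi-criteria score computed for every CR in the database. The criteria names and weights below are hypothetical.

    def score_cr(features: dict[str, float], weights: dict[str, float]) -> float:
        """Weighted-sum score; a higher score argues for earlier implementation."""
        return sum(w * features.get(name, 0.0) for name, w in weights.items())

    def rank_crs(crs: dict[str, dict[str, float]],
                 weights: dict[str, float]) -> list[tuple[str, float]]:
        """Rank all candidate CRs by score, highest first."""
        return sorted(((cr_id, score_cr(f, weights)) for cr_id, f in crs.items()),
                      key=lambda pair: pair[1], reverse=True)

    # Hypothetical criteria: effort counts against a CR, impact counts for it.
    weights = {"mission_impact": 0.5, "workload_reduction": 0.3, "debug_effort": -0.2}
    ranked = rank_crs({"CR-001": {"mission_impact": 9, "debug_effort": 4},
                       "CR-002": {"mission_impact": 2, "workload_reduction": 1}},
                      weights)

Fixed weights make the ranking reproducible and auditable, which addresses the subjectivity concern, but choosing the weights is itself an engineering judgment; the learning approach discussed next could instead infer that judgment from past board decisions.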

The Navy is interested in exploring the field of knowledge engineering management as a mechanism for the design of a decision tool. An artificial intelligence (AI) system that learns to replicate the "expert panel" decisions of the past, and that can then be tested to see whether the software remains "teachable" for the future, would effectively automate the process and provide a knowledge database. Modern models, such as Model-based and Incremental Knowledge Engineering (MIKE) and heuristic classification [Ref 2], are commercially available and could be considered for use in developing a tool for screening CRs to assign a relative rank to each. The tool will need to evaluate a number of disparate criteria, including (1) the estimated amount of code debugging work required to implement the CR, (2) the reduction in watchstander workload resulting from CR implementation, and (3) the mission area impact if the CR is not implemented.
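A minimal sketch of the learning approach described above, assuming scikit-learn and entirely hypothetical training data: historical CRs are encoded with the three criteria just listed as features, and the priorities the expert board actually assigned serve as labels. A random forest stands in here for whatever classifier a real effort would select.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Feature columns mirror the three criteria above (hypothetical units):
    # [estimated debug effort, watchstander workload reduction, mission-area impact]
    X = np.array([[40.0, 0.1, 3.0],
                  [ 5.0, 0.8, 9.0],
                  [12.0, 0.4, 6.0],
                  [30.0, 0.2, 8.0]])   # in practice: hundreds of historical CRs
    y = np.array([3, 1, 2, 1])         # priority bands the expert board assigned

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                        random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    # Agreement with held-out board decisions indicates whether the tool has
    # "learned" the panel's judgment well enough to screen new CRs.
    print("agreement with past board decisions:", model.score(X_test, y_test))
    print("recommended priority for a new CR:", model.predict([[8.0, 0.6, 7.0]]))

Whether the system remains "teachable" could then be checked by retraining as new board decisions accumulate and observing whether held-out agreement holds up.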

PHASE I: The company will define and develop a concept for an automated analysis tool that meets the requirements stated in the topic description. The company will demonstrate the feasibility of the concept in meeting Navy needs and will establish that the concept can be developed into a useful product for the Navy. Feasibility will be established by testing and analytical modeling.

PHASE II: Based on the results of Phase I, the small business will develop a prototype analysis tool for evaluation. The prototype will be evaluated to determine its capability in meeting Navy requirements for the automated analysis tool. System performance will be demonstrated through prototype evaluation and modeling or analytical methods. Evaluation results will be used to refine the prototype into a design that will meet Navy requirements. The company will prepare a Phase III development plan to transition the technology to Navy use.

PHASE III: The company will be expected to support the Navy in transitioning the automated analysis tool technology for Navy use. The company will develop the automated analysis tool according to the Phase III development plan for evaluation to determine its effectiveness in an operationally relevant environment. The company will support the Navy for test and validation to certify and qualify the system for Navy use.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: Commercial software developers routinely test and upgrade their computer programs. How these companies determine which fixes to implement is normally proprietary information. However, a successful CR evaluation tool could be readily adopted in the commercial sector to prioritize fixes that have the greatest impact for their customers.

REFERENCES:
1. McConnell, David E., and Charles H. Sperry. "AEGIS Software Engineering Process Document." NSWC Dahlgren Division, 27 March 1995. Defense Technical Information Center. Accessed 4 April 2013. www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA295624

2. Studer, Rudi, V. Richard Benjamins, and Dieter Fensel. "Knowledge Engineering: Principles and Methods." University of Karlsruhe, 21 November 1997. CiteSeerX. Accessed 29 March 2013. http://www.it.iitb.ac.in/~palwencha/ES/Knowledge%20engineering%20-%20Principles%20and%20methods.pdf

KEYWORDS: Computer program change requests (CPCR); knowledge engineering management; watchstander workload; combat system database; Model-based and Incremental Knowledge Engineering (MIKE); heuristic classification

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between December 12, 2014 and January 14, 2015, you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 15, 2015, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (15.1 Q&A) during the solicitation period for questions, answers, and other significant information relevant to the SBIR 15.1 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or via the webmail link.

Official DoD SBIR FY-2015.1 Solicitation Site:
www.acq.osd.mil/osbp/sbir/solicitations/sbir20151/index.shtml