Attention-Based Vision for Autonomous Vehicles
Navy SBIR 2014.1 - Topic N141-076
ONR - Ms. Lore Anne Ponirakis
Opens: Dec 20, 2013 - Closes: Jan 22, 2014

N141-076 TITLE: Attention-Based Vision for Autonomous Vehicles

TECHNOLOGY AREAS: Ground/Sea Vehicles

OBJECTIVE: To develop an attention-based vision system that will facilitate higher-level reasoning in autonomous ground systems by enabling these systems to actively identify, observe, and focus on regions or objects of interest within complex scenes without losing situational awareness.

DESCRIPTION: Autonomous ground systems continue to promise a revolutionary technology that will enable smaller units to expand their area of operation while maintaining the same control over the battlespace by enhancing sustainment, Intelligence, Surveillance and Reconnaissance (ISR), and maneuver for distributed operations. However, this promise remains largely unfulfilled, with unmanned ground systems still relying heavily on human intervention in all but the most routine mission assignments. The biggest impediment to date has been the failure of these systems to actively perceive the world as a series of semantically labeled objects, events, and situations required for the generation and adaptation of goals, priorities, and plans. Computational limitations, coupled with the sheer complexity of the tactical environment, have resulted in highly specialized solutions that are brittle and do not scale from one situation to the next.

Achieving more autonomous capabilities will require the development of a software framework that can effectively and efficiently process electro-optical and possibly multi-modal sensor data to identify, observe, and focus on regions or objects of interest within complex scenes. This framework will need to understand the task objectives and the context of situations as they arise in order to intelligently select what and where to focus its attention without losing situational awareness. It must be able to maintain the temporal continuity of perceived objects despite motion of the sensor suite and/or motion of objects in the world, and it must be able to predict the motion of objects to anticipate where to look and to account for occlusion or discontinuities within an image sequence.
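The motion-prediction requirement above can be illustrated with a minimal constant-velocity Kalman filter over a single object's image position: the filter keeps predicting through frames where the detector returns nothing, so the system knows where to point its attention when the object should reappear. This is a sketch under stated assumptions only; the class name, noise parameters, and single-track scope are illustrative and not part of the topic, and a fielded framework would fuse multi-modal detections across many tracks.

```python
import numpy as np

class ConstantVelocityTracker:
    """Illustrative constant-velocity Kalman filter for one object's
    2-D image position. State is [x, y, vx, vy]; only position is
    observed. Parameter values are assumptions for the sketch."""

    def __init__(self, x, y, dt=1.0, q=1e-2, r=1.0):
        self.state = np.array([x, y, 0.0, 0.0])          # [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                        # state covariance
        self.F = np.eye(4)                               # constant-velocity model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                            # observe position only
        self.Q = np.eye(4) * q                           # process noise
        self.R = np.eye(2) * r                           # measurement noise

    def predict(self):
        """Propagate the track one frame; call even when the object is
        occluded, to anticipate where to look next."""
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]

    def update(self, zx, zy):
        """Correct the track with a new detection when the object is visible."""
        z = np.array([zx, zy])
        innovation = z - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.state = self.state + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

During occlusion the caller simply skips `update` and keeps calling `predict`, steering the attention window toward the extrapolated position until detections resume.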

PHASE I: Develop a conceptual design for a framework for an attention-based vision system that is capable of efficiently processing multi-modal sensor data to identify, observe, and focus on objects or regions of interest within complex scenes, and deliver a report on the desired approach for development in Phase II. The Phase I report will include: development of the reference model architecture and a performance specification and prediction for an attention-based vision framework; and analysis of the potential component technologies to populate this framework using sensor data captured from the ONR 30 autonomous ground vehicle test-beds. Sensor data may include Red-Green-Blue (RGB) and long-wave infrared (LWIR) images and associated stereo disparity, as well as automotive Light Detection and Ranging (LIDAR) and Radio Detection and Ranging (RADAR). The Phase I option, if awarded, could address preparation for building the prototype in Phase II.

PHASE II: Prototype an attention-based vision framework based on the reference model and performance specification defined in Phase I. This prototype framework will be loosely integrated into the ONR 30 Autonomy baseline system architecture to facilitate rapid assessment of candidate solutions and evaluation of component technologies. At the conclusion of Phase II, the prototype solution will be demonstrated on the ONR 30 autonomous ground vehicle test-beds to validate its performance in a relevant environment.

PHASE III: Conduct the systems engineering to fully integrate the candidate solution with the ONR 30 Autonomy test-bed platforms and associated system architecture (or another equivalent autonomous system) and optimize the component algorithms. Conduct a full system demonstration in a relevant environment and, if successful, transition to the desired future RSJPO Programs of Record, "Autonomous Mobility Applique System" or "Common Robotics System-Applique (CRS-A)". If S&T needs still exist, they will be considered for transition into the ONR 30 Core Autonomy portfolio for maturation to TRL 6.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The farming and mining industries continue to advance and leverage unmanned and autonomous systems for recurring tasks in relatively controlled environments. As these capabilities grow, safety will become more of a concern, and the ability to identify, observe, and track objects in the environment will reduce the risk of injury or accident. The automotive industry continues to significantly reduce the cost of sensors through widespread integration onto new vehicles for safety and automation features such as rear warning sensors, adaptive cruise control, and automated parking. As the industry moves toward driverless cars over the next five years or so, safety and an increased ability to sense and perceive the environment through low-cost sensors will be of paramount importance.

REFERENCES:

1. Mishra, A.K.; Aloimonos, Y.; Cheong, L.-F.; Kassim, A.A., "Active Visual Segmentation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 4, pp. 639-653, April 2012.

2. Munoz, D., "Inference Machines: Parsing Scenes via Iterated Predictions," CMU-RI-TR-13-XX.

3. Albus, J. S., "A Model of Computation and Representation in the Brain," Information Sciences, Vol. 180, Issue 9, pp. 1519-1554, January 2010.

4. Dickmanns, E., et al., "The Seeing Passenger Car 'VaMoRs-P'," International Symposium on Intelligent Vehicles '94, Paris, October 24-26, 1994.

KEYWORDS: Unmanned Systems; Machine Vision; Autonomy; Robotics; Perception; Focus

DoD Notice:  
Between November 20 and December 19 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting Dec 20, 2013, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (14.1 Q&A) during the solicitation period for questions, answers, and other significant information relevant to the SBIR 14.1 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457.