3D Visualization Capability for Fleet Remotely Operated Vehicles (ROVs)
Navy SBIR 2019.2 - Topic N192-099
NAVSEA - Mr. Dean Putnam - dean.r.putnam@navy.mil
Opens: May 31, 2019 - Closes: July 1, 2019 (8:00 PM ET)


TITLE: 3D Visualization Capability for Fleet Remotely Operated Vehicles (ROVs)


TECHNOLOGY AREA(S): Battlespace, Electronics, Sensors


ACQUISITION PROGRAM: Low Observable, No Collateral Damage (LO/NCD) Neutralization FNC


OBJECTIVE: Design and develop a 360-degree, three-dimensional (3D) visualization system with integrated virtual reality (VR) hardware and software for expeditionary fleet Remotely Operated Vehicles (ROVs) to enhance environmental situational awareness and underwater depth perception.


DESCRIPTION: Current underwater ROV camera systems, typically equipped with only a single front-mounted camera, provide just a two-dimensional (2D) view of the underwater environment. This is inadequate for fleet operators to effectively perform the range of tasks associated with countering underwater explosive threats such as naval mines, unexploded ordnance, and maritime improvised explosive devices (IEDs). Due to a lack of depth perception and a limited field of view, 2D systems lack the visualization and situational awareness capability to execute the fine spatial movements needed for (1) target inspection, characterization, and identification; and (2) system placement and orientation for diagnostic sensing, precision manipulation tasks, and tool placement. Mission risks associated with a lack of depth perception can be mitigated by leveraging advancements in 3D visualization technology, including VR systems.


A 360-degree, 3D visualization capability, integrated as a “plug-and-play” payload on the currently fielded Teledyne SeaBotix vLBV300 and ultimately on the Next Generation Explosive Ordnance Disposal (EOD) Underwater Response Vehicle in the acquisition pipeline, is required to provide EOD operators with greater overall situational awareness (SA) of both the target of interest and the surrounding environment. Additionally, a real-time visualization capability is needed to provide the depth perception for refined vehicle and manipulator movement and control when operating in close proximity to threat devices being inspected or when accomplishing required preparatory steps for neutralization.


State-of-the-art commercial underwater 3D technology currently available for purchase is cumbersome (large volume, high power), expensive, requires the transmission of large volumes of digital data, and demands extensive post-processing. Ruggedized commercial solutions are unsuitable in their current form factor for use on small ROVs operating from small rubber craft for Navy expeditionary missions in the near-shore undersea environment. The Navy is seeking low-cost 3D visualization solutions, with a production cost not exceeding $50,000, that meet the size, weight, and power (SWaP) constraints unique to inspection-class ROVs, and that overcome the limitations associated with both 2D camera display and the processing-intensive software burden of multiple-camera solutions.


The payload will be physically integrated onto the Teledyne SeaBotix vLBV300 ROV for the initial development concept, and ultimately onto a similar-size Next Generation EOD Underwater Response Vehicle once the Navy has down-selected from ongoing acquisition efforts. Integration must provide both a 360-degree, 3D visualization of the underwater environment and depth perception at the working end of the installed manipulator(s) for the ROV operator. As a threshold, the payload should weigh no more than 5 pounds in air and be neutrally buoyant in the water or within the buoyancy reserve of the specific ROV. The payload will use 12V or 28V DC power supplied by the vehicle, with a data bandwidth requirement of no more than one half of the available system bandwidth (approximately 20-200 Mb/s). The housing for the payload should be waterproof to a depth of 1,000 feet of seawater. Processing of camera imagery to provide a 360-degree, 3D visualization should be accomplished in real time so that the ROV operator sees no time lag in the picture.
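The bandwidth constraint above can be sanity-checked with a short calculation. A minimal sketch follows; the camera count, resolution, frame rate, and compression ratio are hypothetical assumptions for illustration only, while the 20-200 Mb/s system bandwidth and the one-half payload allocation come from the topic description:

```python
# Illustrative bandwidth-budget check for the visualization payload.
# Camera parameters and compression ratio below are ASSUMPTIONS, not
# topic requirements; the bandwidth figures are from the topic text.

def required_mbps(width, height, fps, bits_per_pixel, cameras, compression_ratio):
    """Estimate compressed video bandwidth for a multi-camera payload, in Mb/s."""
    raw_bps = width * height * bits_per_pixel * fps * cameras
    return raw_bps / compression_ratio / 1e6

def fits_budget(required, available_mbps, payload_fraction=0.5):
    """Topic constraint: the payload may use at most 1/2 of system bandwidth."""
    return required <= available_mbps * payload_fraction

# Hypothetical four-camera 720p rig with ~50:1 H.264-class compression.
need = required_mbps(1280, 720, fps=30, bits_per_pixel=24,
                     cameras=4, compression_ratio=50)
print(f"{need:.1f} Mb/s needed")              # ~53.1 Mb/s
print(fits_budget(need, available_mbps=200))  # True at the 200 Mb/s end
print(fits_budget(need, available_mbps=20))   # False at the 20 Mb/s end
```

Under these assumptions the payload fits the budget only at the high end of the stated 20-200 Mb/s range, which suggests on-vehicle stitching or heavier compression may be needed for low-bandwidth configurations.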


The development effort will require analysis of the system and software architecture of the Teledyne SeaBotix vLBV300 and of the available hardware/software trade space that would enable the development of a modular, plug-and-play 360-degree, 3D visualization system with VR capability that can be rapidly integrated within this architecture. The effort should also investigate integration of the technology into VR head-mounted displays in the event that organic ROV capabilities are deemed inadequate for integration of the 3D visualization/VR functionality. A critical aspect will be defining the focal point and lens requirements needed to provide depth perception for manipulation tasks. Analysis should also include an operational summary of the expected performance capabilities. Characterization of the design for robustness in terms of ROV motion in all three axes, at speeds between 0 and 3 knots, under the influence of current, and at different target object distances will be required. An initial characterization of the ability to provide depth perception in the range of 2-6 inches from the leading edge of the manipulator with a field of view of at least 120 degrees is required. Efforts will include summary considerations for ensuring system compliance with DoD cyber security policies and guidelines as articulated in DoD Instruction 8500.01 of 14 March 2014 for software integration onto remotely operated vehicles and human-supervised autonomous weapons systems, and an estimate of unit cost and maintenance cost for the payload to aid in transition planning.
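One candidate way to meet the close-range depth-perception requirement is a stereo camera pair (explored for underwater 3D reconstruction in Reference 4). A minimal sketch of the pinhole stereo relation Z = f·B/d follows, using an assumed focal length and camera baseline (neither specified by the topic) to gauge disparity and per-pixel depth resolution over the 2-6 inch working range:

```python
# Minimal pinhole-stereo depth sketch for the manipulator working range.
# Stereo vision is only ONE candidate approach; the focal length and
# baseline below are assumed values for illustration, not requirements.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: Z = f * B / d (depth in meters)."""
    return focal_px * baseline_m / disparity_px

def depth_resolution(focal_px, baseline_m, depth_m, disparity_step_px=1.0):
    """Depth change per one-pixel disparity step at depth Z: dZ ~ Z^2 / (f*B)."""
    return depth_m ** 2 * disparity_step_px / (focal_px * baseline_m)

INCH = 0.0254     # meters per inch
f_px = 700.0      # assumed focal length in pixels (wide-angle lens)
B = 0.06          # assumed 6 cm baseline between the two cameras

for inches in (2, 6):
    z = inches * INCH
    d = f_px * B / z                 # disparity at this working distance
    dz = depth_resolution(f_px, B, z)
    print(f"{inches} in: disparity {d:.0f} px, depth step {dz * 1000:.3f} mm/px")
```

Because depth error grows with the square of distance, sub-millimeter depth resolution at the 2-6 inch working range is plausible even with a modest baseline; the harder trade is pairing that baseline with the wide (120-degree-class) field of view the topic requires.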


PHASE I: Develop a conceptual design of a 360-degree, 3D visualization system with VR functionality (“payload”) that meets the requirements described in the Description. Demonstrate the feasibility of the concept through modeling and simulation. Develop a Phase II plan. The Phase I Option, if exercised, will include the initial design specifications and capabilities description to build a prototype solution in Phase II.


PHASE II: Develop two prototype systems to be validated against the objectives stated in the Description, suitable for a Next Generation EOD Underwater Response Vehicle. Develop the two prototype 360-degree, 3D visualization subsystems sequentially to support Navy testing and evaluation. Ensure that these prototypes enable 360-degree, 3D visualization with a VR capability for an ROV operator and one local observer. If necessary for initial demonstrations, system power and video data for the first prototype can be transmitted through an independent cabling system that is married to the ROV tether and terminated at an independent computer console. Based on lessons learned during the integration of the first prototype, design the second prototype as an integral subsystem of the ROV with no external cabling or computers, except for the VR headset. Test these prototypes in both controlled and operationally relevant underwater environments, in varying ambient light conditions ranging from bright sunlight in shallow water (e.g., <20 fsw) to no-light conditions at night with little to no lunar illumination, and in highly cluttered environments in the vicinity of targets of interest. Perform prototype testing and evaluation that characterizes the quality, consistency, and stability of the 3D imagery, along with a side-by-side comparison of manipulation tasks using legacy 2D imaging capabilities versus the 3D visualization capabilities.


PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the technology to Navy use. Optimize the design and performance of the 360-degree, 3D visualization system based on Phase II testing. Deliver three prototypes for a fleet operational demonstration, and any Navy verification and validation testing and evaluation.

Perform operational demonstrations that focus on the fleet operator’s ability to execute the fine spatial movements needed for target inspection, characterization, and identification; and for system placement and orientation for diagnostic sensing, precision manipulation tasks, and tool placement on an ROV.


This capability has dual-use potential, providing capabilities for EOD and for other DoD and non-DoD agencies that deal with unexploded ordnance remediation, maritime improvised explosive device response, post-incident salvage and recovery operations, post-blast forensic analysis, and other scientific applications.


If successful, a 360-degree, 3D visualization system with VR capability has broad application in the light work and observation class ROV market, not only for military applications discussed above, but for the oil and gas industry, environmental and maritime habitat inspection, and other commercial applications.



REFERENCES:

1. Carroll, James. “3D Vision System Observes Underwater Reef Habitat.” Vision Systems Design, 26 January 2015, pp. 1-3. https://www.vision-systems.com/articles/2015/01/3d-vision-system-observes-underwater-reef-habitat.html


2. Lin, Qingping and Kuo, Chengi. “On Applying Virtual Reality to Underwater Robot Tele-Operation and Pilot Training.” International Journal of Virtual Reality, Volume 5, Issue 1, 2015, pp. 71-91. https://hal.archives-ouvertes.fr/hal-01530598/


3. Domingues, Christophe, Essabbah, Mouna, Cheaib, Nader, Otmane, Samir, and Dinis, Alain. “Human-Robot-Interfaces Based on Mixed Reality for Underwater Robot Teleoperation.” IFAC Proceedings Volumes, Volume 45, Issue 27, 2012, pp. 212-215. https://www.sciencedirect.com/science/article/pii/S1474667016312307


4. Bruno, F., Bianco, G., Barone, S., and Razionale, A. V. “Experimentation of Structured Light and Stereo Vision for Underwater 3D Reconstruction.” ISPRS Journal of Photogrammetry and Remote Sensing, Volume 66, Issue 4, 11 July 2011, pp. 508-518. http://www.academia.edu/16777368/Experimentation_of_structured_light_and_stereo_vision_for_underwater_3D_reconstruction


5. Department of Defense Instruction 8500.01, “Cybersecurity,” 14 March 2014. https://fas.org/irp/doddir/dod/i8500_01.pdf


KEYWORDS: 3D Visualization; Virtual Reality Display; Underwater Depth Perception; Enhanced Situational Awareness; Remotely Operated Vehicles; Underwater Explosive Threats



Steven Murphy

Rich Arrieta
NOTICE: The data above is for casual reference only. The official DoD/Navy topic description and BAA information are available on FedBizOpps at www.fbo.gov/index?s=opportunity&mode=form&id=0a3eac1d27ab54cfe57a0339b3f863d8&tab=core&_cview=0

These Navy Topics are part of the overall DoD 2019.2 SBIR BAA. The DoD issued its 2019.2 BAA SBIR pre-release on May 2, 2019, which opens to receive proposals on May 31, 2019, and closes July 1, 2019 at 8:00 PM ET.

Between May 2, 2019 and May 30, 2019 you may communicate directly with the Topic Authors (TPOCs) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting May 31, 2019, when DoD begins accepting proposals for this BAA.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or via email at sbirhelpdesk@u.group