Optically-Aided, Non-Global Positioning System (GPS) for Aircraft Navigation Over Water
Navy SBIR 2019.1 - Topic N191-003
NAVAIR - Ms. Donna Attick - donna.attick@navy.mil
Opens: January 8, 2019 - Closes: February 6, 2019 (8:00 PM ET)

N191-003

TITLE: Optically-Aided, Non-Global Positioning System (GPS) for Aircraft Navigation Over Water

 

TECHNOLOGY AREA(S): Air Platform, Information Systems

ACQUISITION PROGRAM: PMA266 Navy and Marine Corps Multi-Mission Tactical UAS

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Design and develop a capability using optically-sensed features of the environment and ocean as external references for augmenting aircraft navigation when flying over water without the use of the Global Positioning System (GPS).

DESCRIPTION: The concept of using optical sensors for navigation has been exploited extensively in the past. Existing visual navigation solutions have been fielded in multiple weapon and military aircraft applications, but they are limited to use over land and require detailed knowledge of the terrain: features identified in radar or camera images are correlated with terrain or map features to estimate position. In addition, horizontal positioning from down-facing cameras is becoming the industry standard on high-quality commercial unmanned aerial vehicles (UAVs), especially for indoor use where satellite navigation is not available.

The current state of the art in vision-based navigation does not work over water, where any detected features are not stationary and no terrain information exists for feature matching. In the commercial UAV example, this leads to low-altitude, hovering aircraft drifting with the same speed and direction as the water current below. New innovations are required for using visually detected, non-stationary features between image frames, such as wind-driven waves/wavelets in the case of a nadir- or forward-looking imager, and/or clouds/cloud formations in the case of a forward-looking imager with an excess field of view capturing features at the distant horizon. Furthermore, all image content likely needs to be considered by such new innovations. Visual solutions are often complemented by other optical remote sensing technologies that provide detailed range information, such as LiDAR (Light Detection and Ranging).
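As a rough illustration of the down-facing camera principle discussed above (not part of the topic requirements), the sketch below estimates the pixel shift between two consecutive synthetic frames by an exhaustive sum-of-squared-differences search, then scales that shift to an apparent ground velocity with a pinhole-camera model. All numbers (frame size, altitude, focal length, frame rate) are invented for illustration; over water, the "features" being registered would themselves be moving, which is exactly the gap this topic seeks to close.

```python
# Estimate the frame-to-frame pixel shift of a nadir-looking camera by
# brute-force sum-of-squared-differences (SSD) search, then convert the
# shift to an apparent ground velocity. All constants are illustrative.

def estimate_shift(f1, f2, max_shift=3):
    """Return the (dy, dx) shift that best aligns f2 to f1 (minimum SSD)."""
    h, w = len(f1), len(f1[0])
    best, best_shift = float("inf"), (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        ssd += (f1[y][x] - f2[y2][x2]) ** 2
                        n += 1
            ssd /= n  # normalize by overlap size
            if ssd < best:
                best, best_shift = ssd, (dy, dx)
    return best_shift

def blob_frame(cy, cx, size=12):
    """Synthetic frame: a 3x3 bright 'feature' centered at (cy, cx)."""
    return [[1.0 if abs(y - cy) <= 1 and abs(x - cx) <= 1 else 0.0
             for x in range(size)] for y in range(size)]

f1 = blob_frame(5, 5)
f2 = blob_frame(6, 7)              # scene moved 1 px down, 2 px right
dy, dx = estimate_shift(f1, f2)

# Pinhole model: ground meters per pixel = altitude / focal length (px).
altitude_m, focal_px, dt_s = 100.0, 500.0, 1.0 / 30.0
vx = dx * (altitude_m / focal_px) / dt_s   # m/s along image x
vy = dy * (altitude_m / focal_px) / dt_s   # m/s along image y
```

Note the velocity recovered this way is relative to the imaged surface; over moving water it would include the wave/current motion, which is why the unaugmented commercial approach drifts.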

Without any external aiding, the positioning errors of all modern inertial navigation systems grow over time and quickly render the data unusable. Through new innovations using slowly time-varying, sea-based features as external references, aircraft navigation systems can be augmented to potentially bound the growing positioning errors during any lengthy aircraft mission. This will lead to aircraft systems less reliant on satellite navigation or radio frequency beacons when transiting longer distances across bodies of water. Any such capability can be further expanded with existing land-based visual navigation techniques or emerging ship-relative feature tracking systems to form a comprehensive solution over land, at sea, and in close proximity to ships. Any effort should demonstrate that the inertial positioning errors are bounded over a 60-minute flight envelope for an aircraft translating at speed; for example, a fixed-wing aircraft traveling at a forward speed sufficient to avoid stall, or a rotary-wing aircraft translating at a speed much greater than that of the translating ocean waves below. Reported parameters should consider the speed of the aircraft, the speed of ocean features, the type of sensors used, the features being tracked, the probable inertial errors without any additional aiding, and the proposed solution with the same inertial system augmented with a non-GPS optical approach. The inertial system should have positioning errors growing without bound for a flight of a minimum of 60 minutes, or shorter if the errors become larger than 1 nautical mile in any direction.
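The bounded-versus-unbounded error distinction above can be made concrete with a deliberately simplified one-dimensional toy (again, not part of the topic requirements): a dead-reckoned position driven by a biased "inertial" velocity accumulates error without bound, while the same velocity corrected by an external (e.g., optically derived) velocity measurement through a simple bias-estimating filter settles to a small, bounded error. The bias magnitude, filter gain, and airspeed are all made-up illustrative values, and the optical measurement is taken as ideal.

```python
# Toy 1-D comparison: unaided dead reckoning vs. velocity-aided navigation.
# The inertial velocity carries a constant bias; the aided filter estimates
# that bias from an external velocity fix and removes it, bounding the
# position error. All constants are illustrative.

dt = 1.0          # time step, s
true_v = 50.0     # aircraft ground speed, m/s
bias = 0.5        # constant inertial velocity bias, m/s
gain = 0.2        # first-order filter gain for the bias estimate

pos_true = pos_ins = pos_aided = 0.0
bias_est = 0.0
for _ in range(3600):                 # one hour at 1 Hz
    v_ins = true_v + bias             # biased inertial velocity
    v_opt = true_v                    # ideal external (optical) velocity fix
    pos_true += true_v * dt
    pos_ins += v_ins * dt             # unaided: integrates the bias forever
    bias_est += gain * ((v_ins - v_opt) - bias_est)
    pos_aided += (v_ins - bias_est) * dt

err_ins = abs(pos_ins - pos_true)     # grows linearly: ~0.5 m/s * 3600 s
err_aided = abs(pos_aided - pos_true) # bounded by the filter's settling
```

After an hour the unaided error is about 1.8 km, while the aided error is only the few meters accumulated before the bias estimate converges; a real solution must of course contend with noisy measurements, moving ocean features, and full 3-D error states.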

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can be and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret-level facility clearance and Personnel Security Clearances in order to perform on advanced phases of this project, as set forth by DSS and NAVAIR, and to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Conceptually develop one or more optically-based solutions that show the feasibility of a new capability in using external (e.g., ocean, sky, and/or any temporary features of opportunity) characteristics to augment aircraft navigation technologies. Provide documentation that demonstrates the suitability of the design for typical aircraft operations and mission environments, and the potential impacts to use without GPS aiding. Aircraft operational and mission environment information will be provided to Phase I performers. Perform a proof of concept demonstration to show the scientific and technical merit, along with a Technology Readiness Level (TRL)/Manufacturing Readiness Level (MRL) assessment. The Phase I effort will include prototype plans to be developed under Phase II.

PHASE II: Develop the optically-based concept into a prototype, perform testing and demonstrate performance of the prototype in a representative flight environment over water with varying sea and atmospheric conditions; aircraft operational and mission environment information will be provided to Phase II performers. Perform tests that demonstrate and validate the superiority of the optically-aided navigation compared to traditional aircraft navigation without external aiding (i.e., using only on-board aircraft systems, such as only air data and inertial). Show the feasibility of aircraft integration. Update the TRL/MRL assessment based on prototype advancements and test results.

Work in Phase II may become classified. Please see note in Description paragraph.

PHASE III DUAL USE APPLICATIONS: Identify requirements for transitioning to U.S. Navy aircraft with support of any appropriate PMA. Expand the prototype solution to satisfy the identified hardware and software requirements for applications across the U.S. Navy fleet of aircraft, which may be manned, unmanned, fixed-wing, or rotary-wing platforms. Perform final testing of a fleet-representative solution for at-sea aircraft navigation. Where appropriate, further integrate the developed concepts with other navigation solutions to form a more comprehensive capability, covering both land and sea environments, for aircraft operating in regions of the globe where GPS is degraded or unavailable. The general technology can be applied in new and emerging ways for commercial applications in both the large and small aircraft industries (e.g., small UAVs operating over rivers and streams without drift).

REFERENCES:

1. Chahl, J., Rosser, K., and Mizutani, A. “Bioinspired Optical Sensors for Unmanned Aerial Systems.” Bioinspiration, Biomimetics, and Bioreplication. SPIE Proceedings, 2011. https://spie.org/Publications/Proceedings/Paper/10.1117/12.880703

2. Chao, H., Gu, Y., Gross, J., Guo, G., Fravolini, M., and Napolitano, M. “A Comparative Study of Optical Flow and Traditional Sensors in UAV Navigation.” 2013 American Control Conference, Washington DC, IEEE. https://ieeexplore.ieee.org/document/6580428/

3. Chao, H., Gu, Y., and Napolitano, M. “A Survey of Optical Flow Techniques for Robotics Navigation Applications.” Journal of Intelligent and Robotic Systems, 2014, pp. 361-372. https://link.springer.com/article/10.1007/s10846-013-9923-6

4. Chao, H., Gu, Y., and Napolitano, M. “A Survey of Optical Flow for UAV Navigation Applications.” 2013 International Conference on Unmanned Aircraft Systems, Atlanta, IEEE. https://ieeexplore.ieee.org/abstract/document/6564752/

5. Chao, H., Gu, Y., Gross, J., Rhudy, M., and Napolitano, M. “Flight-Test Evaluation of Navigation Information in Wide-Field Optical Flow.” Journal of Aerospace Information Systems, 2016, 13(11), pp. 419-432. https://arc.aiaa.org/doi/10.2514/1.I010482

6. Rhudy, M.B., et al. “Unmanned Aerial Vehicle Navigation Using Wide-Field Optical Flow and Inertial Sensors.” Journal of Robotics, Volume 2015, Article ID 251379, 1-12. https://web.statler.wvu.edu/~irl/Unmanned%20Aerial%20Vehicle%20Navigation%20Using%20Wide-Field%20Optical%20Flow%20and%20Inertial%20Sensors.pdf

7. Rhudy, M., Chao, H., and Gu, Y. “Wide-field Optical Flow Aided Inertial Navigation for Unmanned Aerial Vehicles.” 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago. https://ieeexplore.ieee.org/document/6942631/citations?part=1

8. Trittler, M., Rothermel, T., & Fichter, W. “Autopilot for Landing Small Fixed-Wing Unmanned Aerial Vehicles with Optical Sensors.” Journal of Guidance Control and Dynamics, 2016, 39(9), pp. 2011-2021. https://www.researchgate.net/publication/305925697_Autopilot_for_Landing_Small_Fixed-Wing_Unmanned_Aerial_Vehicles_with_Optical_Sensors?_sg=5LKoAjFrO4ccPixP4oLcb8VnUcFVTx_ceplrQubvohrjhRLgiyKpvM4gAeuQ1cmJ93zPmlofLCOTbd33hkVzwKGeMZN2

9. Zhang, J., and Singh, S. “Visual-Lidar Odometry and Mapping: Low-drift, Robust, and Fast.” 2015 International Conference on Robotics and Automation, Seattle. http://www.frc.ri.cmu.edu/~jizhang03/Publications/ICRA_2015.pdf

KEYWORDS: GPS Denied; Navigation; UAS; Visually-Aided Inertial Navigation System (INS); Optical Flow; Visual-Lidar Odometry; Unmanned Aerial Systems

 

** TOPIC NOTICE **

These Navy Topics are part of the overall DoD 2019.1 SBIR BAA. The DoD issued its 2019.1 SBIR BAA pre-release on November 28, 2018; the BAA opens to receive proposals on January 8, 2019, and closes February 6, 2019 at 8:00 PM ET.

Between November 28, 2018 and January 7, 2019 you may communicate directly with the Topic Authors (TPOCs) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 8, 2019, when DoD begins accepting proposals for this BAA.

However, until January 23, 2019, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the open BAA period for questions and answers and other significant information relevant to their SBIR/STTR topics of interest.

Topics Search Engine: Visit the DoD Topic Search Tool at sbir.defensebusiness.org/topics/ to find topics by keyword across all DoD Components participating in this BAA.

Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or via email at sbirhelp@bytecubed.com.