Spatial Data Comparison for Markerless Augmented Reality (AR) Anchoring
Navy SBIR 2020.1 - Topic N201-019
NAVAIR - Ms. Donna Attick - donna.attick@navy.mil
Opens: January 14, 2020 - Closes: February 26, 2020 (8:00 PM ET)

N201-019

TITLE: Spatial Data Comparison for Markerless Augmented Reality (AR) Anchoring


TECHNOLOGY AREA(S): Human Systems, Information Systems

ACQUISITION PROGRAM: PMA251 Aircraft Launch & Recovery Equipment (ALRE)

OBJECTIVE: Develop a software solution to localize an augmented reality (AR) headset user within a space by comparing spatial mapping data collected live from the headset with scanned/modeled data collected at an earlier time and stored on the device. The proposed solution should work with an existing, commercially available AR headset.
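
For illustration only: once the live spatial map has been registered to the stored scan/model (yielding a rigid transform between the two frames), localizing the user reduces to re-expressing the headset pose in the stored model's frame. A minimal sketch in Python/NumPy follows; the transform convention and all names are hypothetical, not part of this topic.

    import numpy as np

    def localize_device(T_model_to_live, T_device_in_live):
        """Re-express the headset pose in the stored model's frame.

        T_model_to_live:  4x4 rigid transform obtained by registering the
                          stored scan/CAD point cloud to the live spatial map.
        T_device_in_live: 4x4 headset pose reported by the AR runtime in
                          its own live tracking frame.
        """
        # Invert the model->live mapping and compose it with the device
        # pose to obtain the device pose relative to the equipment model.
        return np.linalg.inv(T_model_to_live) @ T_device_in_live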

DESCRIPTION: The Navy and Marine Corps currently have several efforts underway applying AR technology to provide maintainer guidance, improve maintenance-action success rates, and reduce repair times. Many current commercial off-the-shelf (COTS) AR hologram anchoring solutions use the device's onboard camera and fiducial marker detection to localize a user in space and overlay instructions, animations, warnings, schematics, and technical data, but these solutions are limited by the chosen device's camera quality and computational power. Target-based solutions also mandate that a physical marker be placed on the piece of equipment to be detected, which is unacceptable in a number of maintenance environments. More powerful image and object recognition technology exists that forgoes the need for a fiducial marker, but these solutions are heavily dependent on uploading government data to proprietary cloud services, which severely restricts their utilization due to both government data sensitivity and cyber limitations on internet access.

The need exists for a method of anchoring holographic overlays in space without internet/cloud access or physical markers. Ideally, this solution would extract a bounding box of a piece of equipment using key feature set comparisons between a computer-aided design (CAD) model/3D scan file and real-time spatial mapping room scans generated by the chosen AR device. A system for aligning the live spatial mapping data with the stored digital twin would allow precise and accurate placement of holographic overlays anywhere within the user's scanned region without any dependency on camera functionality. Such a solution would need to achieve 90% precision and accuracy for equipment of varying size and complexity (from 1 to 500 cubic feet) while causing very limited performance degradation on the device (maintaining 60 frames per second). The solution must also be capable of identifying the same piece of equipment in multiple and varying room spaces (i.e., identification independent of the space itself and of the equipment's location within the space). Integration of the solution with the Unity Game Engine or other common AR application authoring tools is required, and full documentation of all Application Programming Interfaces (APIs) is expected to allow for future government developer use and interfacing.
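
As an illustrative sketch of the comparison described above (not a prescribed implementation), the following Python snippet uses the open-source Open3D library to align a point cloud derived from a stored CAD model/3D scan against a live spatial-mapping scan via FPFH key-feature matching with RANSAC, refine the pose with ICP, and extract the equipment's bounding box in the live frame. The file names and tuning values are assumptions.

    import open3d as o3d

    VOXEL = 0.05  # downsampling resolution in meters (tuning assumption)

    def preprocess(pcd):
        """Downsample, estimate normals, and compute FPFH key features."""
        down = pcd.voxel_down_sample(VOXEL)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * VOXEL, max_nn=30))
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down,
            o3d.geometry.KDTreeSearchParamHybrid(radius=5 * VOXEL, max_nn=100))
        return down, fpfh

    # Stored digital twin (CAD-derived or prior 3D scan) and live room scan;
    # both file names are placeholders.
    model = o3d.io.read_point_cloud("equipment_model.ply")
    room = o3d.io.read_point_cloud("live_spatial_map.ply")

    model_down, model_fpfh = preprocess(model)
    room_down, room_fpfh = preprocess(room)

    # Global (initialization-free) alignment: match FPFH features, prune
    # bad correspondences, and fit a rigid transform with RANSAC.
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        model_down, room_down, model_fpfh, room_fpfh,
        True,           # mutual filter on feature correspondences
        1.5 * VOXEL,    # max correspondence distance
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        ransac_n=3,
        checkers=[
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * VOXEL),
        ],
        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Local refinement with point-to-plane ICP for a tighter fit.
    fine = o3d.pipelines.registration.registration_icp(
        model_down, room_down, 0.5 * VOXEL, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())

    # Transform the model into the live frame and take its bounding box;
    # holographic overlays can then be anchored relative to this box.
    model.transform(fine.transformation)
    bbox = model.get_oriented_bounding_box()
    print("equipment pose (model -> live):\n", fine.transformation)
    print("bounding box center:", bbox.center)

A deployable solution would run this comparison on-device against the headset's native spatial-mapping mesh in real time rather than against saved files, and would require substantially more optimization to hold 60 frames per second.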

PHASE I: Design, develop, and demonstrate the feasibility of a software solution that meets the requirements provided in the Description. Produce a high-level software design and use a simplified example of the methodology as a proof of concept. The Phase I effort will include prototype plans to be developed under Phase II.

PHASE II: Build and demonstrate a prototype system for a chosen AR headset, and test it in both interior and exterior environments to highlight capability in lighting conditions ranging from bright sunlight to darkness, in all weather conditions.

PHASE III DUAL USE APPLICATIONS: Further develop the solution on the chosen AR device. Transition it as Support Equipment within other Navy-developed applications.

Development of a software tool for environment/model alignment and hologram anchoring that forgoes the need for an off-premises cloud backend will be marketable to aircraft, automobile, and heavy equipment manufacturers, along with all other companies that have sensitive/proprietary models they do not wish to cover with fiducial markers or upload to a private entity's servers.

REFERENCES:

1. Liu, L., Li, H., & Gruteser, M. "Edge Assisted Real-time Object Detection for Mobile Augmented Reality." Proceedings of the 25th Annual International Conference on Mobile Computing and Networking, 2019. doi:10.1145/3300061.3300116. www.winlab.rutgers.edu/~luyang/papers/mobicom19_augmented_reality.pdf

2. Dow, E. M., Farr, E. M., Gildein, M. E., II, & Vaughan, M. J. "Augmented Reality Model Comparison and Deviation Detection." U.S. Patent No. US 10,169,384 B2. Washington, DC: U.S. Patent and Trademark Office, 2019. https://www.researchgate.net/profile/Eli_Dow/publication/330090603_Augmented_Reality_Model_Comparison_and_Deviation_Detection/links/5c2ce07192851c22a3554b5c/Augmented-Reality-Model-Comparison-and-Deviation-Detection.pdf?origin=publication_detail

KEYWORDS: Augmented Reality; Mixed Reality; Spatial Mapping; Fiducial Marker; Hologram