Technology


How it works

Luftronix Fused Flow is the subject of a patent application pending with the US Patent and Trademark Office.

Luftronix scans large, complex 3D objects using non-GPS aerial imaging, with drones flying autonomously in dense airspace.

With the help of the highly precise Fused Flow positioning system, Luftronix drones autonomously manage their flight paths and capture accurate location information at all times.

Using Fused Flow navigation, Luftronix drones can scan most commercial aircraft in less than 30 minutes, capturing a precise map of the surface and stitching the captured images onto that surface map.


Fused Flow

Optical Flow and Fused Flow™.

Luftronix Fused Flow uses optical input as its primary source of navigation. It achieves its superior precision by measuring the displacement of patterns found naturally in the environment and by calibrating these measurements against auxiliary input sources. Fused Flow pairs a high frame rate (60 Hz) with highly efficient position-computation algorithms to deliver real-time location information, which is essential for navigating safely in fully autonomous mode.
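As a rough illustration of the underlying idea (not Luftronix's actual implementation), the sketch below estimates per-frame displacement from tracked surface features and converts pixel motion into metres using altitude and focal length. The OpenCV-based tracking, function names and thresholds are assumptions made for this example only.

```python
# Illustrative sketch: per-frame displacement from natural surface texture.
# Not the Fused Flow algorithm; names and parameters are assumptions.
import cv2
import numpy as np

def frame_displacement_m(prev_gray, curr_gray, altitude_m, focal_px):
    """Estimate camera translation (metres) between two grayscale frames."""
    # Track naturally occurring surface features between the two frames.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=8)
    if prev_pts is None:
        return np.zeros(2)  # no usable texture, e.g. temporary view obstruction
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   prev_pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    flow_px = (curr_pts[good] - prev_pts[good]).reshape(-1, 2)
    # Median pixel displacement is robust to a few bad feature tracks.
    dx_px, dy_px = np.median(flow_px, axis=0)
    # Pinhole model: metres per pixel on the surface = altitude / focal length.
    m_per_px = altitude_m / focal_px
    return np.array([dx_px, dy_px]) * m_per_px

# At 60 Hz each call covers roughly 16.7 ms of motion; a position estimate
# is obtained by accumulating these per-frame displacements and correcting
# the running total with auxiliary sensors (IMU, altimeter, heading).
```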

Optical Flow is based on James Gibson's observations of how humans and animals interpret relative motion as they travel through the world. It was later extended into a mathematical framework for interpreting discrete image displacements, and the principal algorithms were analyzed by Barron, Fleet and Beauchemin in 1994. Today it is a well-established method for using electro-optical input to determine the rate of displacement over time. By itself, however, Optical Flow accumulates more error than is acceptable and is therefore useful only for short sections of a mission.

Luftronix augments this displacement information with additional sources of input to form the Fused Flow™ model of displacement. The model incorporates further relevant factors, such as the angle of the camera to the surface, elevation, heading, and possible temporary view obstruction, into a complete model of motion. The result is a more precise optical flow algorithm that still operates in real time. Fused Flow™ uses a sample consensus method to implement a robust algorithm that compares markers in two subsequently captured images and determines the vector of movement between the two samples of the surface, after taking into account factors such as altitude, angle and rotation, all of which may have changed between the two samples.
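The sketch below shows one way a sample consensus estimate of inter-frame motion can be computed, using OpenCV's RANSAC-based similarity fit as a stand-in for the proprietary Fused Flow™ algorithm. The feature detector, matcher, thresholds and returned quantities are illustrative assumptions, not the actual implementation.

```python
# Hedged sketch of a sample-consensus motion estimate between two frames.
import cv2
import numpy as np

def consensus_motion(prev_gray, curr_gray):
    """Return (translation_px, rotation_deg, scale) between two frames, or None."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # e.g. temporary view obstruction
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 3:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Sample consensus (RANSAC): keep only the marker pairs that agree on a
    # single rotation + scale + translation, rejecting outlier matches.
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC,
                                             ransacReprojThreshold=3.0)
    if M is None:
        return None
    scale = np.hypot(M[0, 0], M[0, 1])                        # altitude change appears as scale
    rotation_deg = np.degrees(np.arctan2(M[1, 0], M[0, 0]))   # heading change between samples
    translation_px = M[:, 2]                                  # the displacement vector itself
    return translation_px, rotation_deg, scale
```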

Location information is used to annotate the data collected during flights: every image or other data point is automatically tagged with the exact position at which it was captured. Using this technique, a typical scan of a commercial aircraft is completed in under one hour.
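A minimal sketch of such position annotation is shown below, assuming a simple JSON sidecar file per image. The record fields and file layout are illustrative only, not Luftronix's actual data format.

```python
# Illustrative sketch: tagging each captured image with its position.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class CapturedImage:
    filename: str
    timestamp: float     # seconds since epoch
    x_m: float           # position in the scan reference frame, metres
    y_m: float
    z_m: float
    heading_deg: float

def annotate(filename, position, heading_deg):
    """Record where an image was captured, alongside the image itself."""
    x, y, z = position
    record = CapturedImage(filename, time.time(), x, y, z, heading_deg)
    with open(filename + ".json", "w") as f:
        json.dump(asdict(record), f, indent=2)

# Example usage with made-up values:
# annotate("frame_000123.png", (4.2, 1.7, 3.0), heading_deg=92.5)
```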