SLAM vehicle tracking

A larger-scale visual-inertial SLAM tracking case using a car, with comparisons against ARCore, AR Engine, ARKit, and Intel RealSense.

We used a rig with three identical Huawei Mate 20 smartphones (running Spectacular AI VIO tracking, Google ARCore, and Huawei AR Engine, respectively), one iPhone 11 Pro (running Apple ARKit), and an Intel RealSense stereo VIO tracking device mounted outside the car.

The hardware running Spectacular AI, ARCore, and AR Engine is identical (though ARCore and AR Engine also have access to vendor-specific APIs), while ARKit and RealSense run on their own, different sensors and hardware.

Spectacular AI uses the camera and IMU alone, which also shows when the car moves straight at constant speed: monocular VIO loses observability, since the metric scale cannot be determined without acceleration. The error corrects itself once the car turns (the same effect is visible in ARKit). Complementary information, such as GNSS, stereo cameras, or assumptions about vehicle speed, can of course resolve this issue.
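The observability loss can be illustrated with a small numerical sketch (an illustration of the general principle, not code from this project): under zero acceleration, the accelerometer's specific force reading is identical for a trajectory and any scaled copy of it, so the IMU contributes no scale information to monocular vision; during a turn, centripetal acceleration depends on the metric scale, which restores observability. The trajectories and gravity convention below are assumptions for the demo.

```python
import numpy as np

def imu_specific_force(positions, dt, gravity=np.array([0.0, 0.0, -9.81])):
    """Accelerometer specific force f = a - g for a level (identity-orientation)
    trajectory, with acceleration a from the second finite difference."""
    accel = np.diff(positions, n=2, axis=0) / dt**2
    return accel - gravity

dt = 0.01
t = np.arange(0.0, 1.0, dt)

# Straight, constant-speed trajectory, and the same path scaled by 2
# (a scale change that monocular vision alone cannot distinguish).
straight = np.stack([5.0 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
f1 = imu_specific_force(straight, dt)
f2 = imu_specific_force(2.0 * straight, dt)
print(np.allclose(f1, f2))   # identical readings: scale is unobservable

# A turning trajectory: centripetal acceleration scales with the path,
# so the two scales now produce different accelerometer readings.
turn = np.stack([10.0 * np.cos(t), 10.0 * np.sin(t), np.zeros_like(t)], axis=1)
g1 = imu_specific_force(turn, dt)
g2 = imu_specific_force(2.0 * turn, dt)
print(np.allclose(g1, g2))   # readings differ: scale becomes observable
```

This is why the drift appears only on straight constant-speed segments and corrects itself after a turn.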

*(Figure: tracked output paths from multiple test cases.)*

Above is a more comprehensive set of output paths from additional test cases. The GNSS/GPS tracks represent the ground truth, and within each case all results are visualised at the same scale (scaled individually per case; the scale bar denotes 100 m).
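Comparing tracks from different devices against GNSS ground truth typically requires estimating a similarity transform (scale, rotation, translation) between the trajectories, which is also where per-case scaling comes from. A minimal sketch of the standard Umeyama least-squares alignment (an assumption about the evaluation method, not the exact procedure used here):

```python
import numpy as np

def umeyama_alignment(est, gt):
    """Least-squares similarity transform (s, R, t) such that gt ~ s * R @ est + t.
    est, gt: (n, d) arrays of matching trajectory points."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # Cross-covariance between ground truth and estimate, then its SVD.
    U, D, Vt = np.linalg.svd(G.T @ E / len(est))
    S = np.eye(est.shape[1])
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[-1, -1] = -1.0           # guard against a reflection
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / E.var(axis=0).sum()
    t = mu_g - s * R @ mu_e
    return s, R, t

# Synthetic example: a 2D track warped by a known similarity transform.
rng = np.random.default_rng(0)
est = rng.normal(size=(100, 2))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
gt = 2.0 * est @ R_true.T + np.array([1.0, -3.0])

s, R, t = umeyama_alignment(est, gt)
aligned = s * est @ R.T + t
print(np.allclose(aligned, gt))  # True: the transform is recovered exactly
```

After alignment, per-point errors between `aligned` and the GNSS track give metrics such as absolute trajectory error; the recovered `s` is the per-case scale factor.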