SLAM for full 6D VR/AR


This tutorial will show you how to use SLAM tracking for full six-degrees-of-freedom navigation within a VR/AR environment. By using SLAM you can turn your device, smartphone or tablet, into a 6D navigation device for moving around freely in a VR/AR scene. You are not limited to the rotation/orientation-only tracking of conventional approaches, but can also move left/right, up/down, and forwards/backwards.

This tutorial also shows you how to achieve a good user experience

  • if the tracking gets lost (by "bridging" this event with gyro support) and
  • when the tracking re-initializes/picks up again (by keeping the virtual objects at the correct position and not making them "jump" within the scene).

More technically speaking, this means that your SLAM tracking coordinate system will not be reset when the tracking is lost, giving the user an overall better experience. Additionally, the gyroscope is used while tracking is lost, so you can keep interacting with the scene until tracking picks up again.
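A minimal, SDK-independent sketch of this bridging behavior (all type and function names here, such as bridgePose, are hypothetical illustrations, not Metaio API):

```cpp
#include <cassert>

// Minimal 6DoF pose: rotation as a quaternion (w, x, y, z) plus a translation.
struct Quat { double w, x, y, z; };
struct Vec3 { double x, y, z; };
struct Pose { Quat rotation; Vec3 translation; };

// While SLAM tracking is lost, keep the last known translation (so the
// coordinate system is not reset and objects do not jump) and substitute the
// gyroscope's orientation, so the scene keeps responding to device rotation.
Pose bridgePose(bool slamIsTracking, const Pose& slamPose,
                const Pose& lastGoodPose, const Quat& gyroOrientation)
{
    if (slamIsTracking)
        return slamPose;                 // normal case: trust the SLAM pose
    Pose bridged = lastGoodPose;         // translation frozen at last good value
    bridged.rotation = gyroOrientation;  // rotation keeps following the gyro
    return bridged;
}
```

Once SLAM re-initializes, the real helper additionally aligns the new map with the old coordinate system, which this sketch does not attempt.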

What you will need

In order to run the demo you will need to execute the following steps:

  • Download the latest Metaio SDK (version 5.5.1 or higher) and install it
  • Download the package
  • Extract the file to the folder _iOS/Example_SDK within your SDK install path
  • Open the project file: Demo.xcodeproj
  • Attach an iOS device and run the application


Now, we will take a look at the code.

The most important part of this application is the header file named MapTransitionHelper.h. The following steps are needed to use this helper class in your own applications:

  • Include this header file in the class you want to equip with the SLAM-based transition functionality for bridging the cases where tracking gets lost
  • Create an instance of that class in the constructor


Create your models and add them to COS 0, which is the camera coordinate system. It is important to put the models on COS 0 because we will get the tracking values in every frame for COS 1 (the actual tracking coordinate system of SLAM) and pass them to the MapTransitionHelper class, in order to get the rotation and translation values from the fused camera pose. This new fused pose will be used to update the rotation and translation of the models manually.
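To make the manual update concrete: a point that is fixed in the SLAM world frame must be re-expressed in camera coordinates every frame as p_cam = R * p_world + t, using the rotation R and translation t of the fused camera-from-world pose. A self-contained sketch (the Mat3 and Vec3 types are stand-ins, not Metaio classes):

```cpp
#include <array>
#include <cassert>

// Row-major 3x3 rotation matrix and a plain 3D vector as simple stand-ins.
using Mat3 = std::array<std::array<double, 3>, 3>;
struct Vec3 { double x, y, z; };

// Re-express a world-space point in camera coordinates: p_cam = R * p_world + t.
Vec3 worldToCamera(const Mat3& R, const Vec3& t, const Vec3& p)
{
    return {
        R[0][0] * p.x + R[0][1] * p.y + R[0][2] * p.z + t.x,
        R[1][0] * p.x + R[1][1] * p.y + R[1][2] * p.z + t.y,
        R[2][0] * p.x + R[2][1] * p.y + R[2][2] * p.z + t.z,
    };
}
```

This is exactly the role the rotation and translation returned by the MapTransitionHelper play when you update the models attached to COS 0.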

You can find the corresponding sample code inside the method createModel of the sample.

Next, create an update function (see the method update() in the sample) that is executed every frame. Use the helper functions to get the latest rotation and translation, and apply them to your model:

// Get the rotation of the "fused" camera pose
metaio::Rotation newRotation = mapTransitionHelper.getRotationCameraFromWorld();
// Get the translation of the "fused" camera pose
metaio::Vector3d newTranslation = mapTransitionHelper.getTranslationCameraFromWorld();
// Apply the scale, rotation and translation to the model
// (m_model stands for the metaio::IGeometry* created in createModel)
m_model->setScale(m_modelScale);
m_model->setRotation(newRotation);
m_model->setTranslation(newTranslation);

Tracking configuration

The tracking configuration in this tutorial and sample application uses a few special switches:

  • ContinueLostTrackingWithOrientationSensor: true and
  • KeepPoseForNumberOfFrames: 165
  • GravityAssistance: replaceZAxis

The first two parameters automatically use the orientation sensor/gyroscope of your device in cases where the SLAM tracking gets lost.

The third one stabilizes tracking by using the gravity sensor of your device; however, it also requires the 3D content to be aligned with gravity, which is the case in this scenario.

You can see the provided sample tracking configuration below:

<?xml version="1.0"?>
<TrackingData>
  <Sensors>
    <Sensor Type="SLAMSensorSource">
      <SensorID>SLAM</SensorID>
      <Parameters>
        <Initialization>
            <BaseLineAngleThreshold>2</BaseLineAngleThreshold>
        </Initialization>
        <MinNumberOfObservations>2</MinNumberOfObservations>
        <SimilarityThreshold>0.7</SimilarityThreshold>
      </Parameters>
      <SensorCOS>
        <SensorCosID>world</SensorCosID>
        <Parameters>
            <MinTriangulationAngle>2</MinTriangulationAngle>
        </Parameters>
      </SensorCOS>
    </Sensor>
  </Sensors>
  <Connections>
    <COS>
      <Name>COS0</Name>
      <Fuser Type="SmoothingFuser">
        <Parameters>
          <AlphaRotation>0.5</AlphaRotation>
          <AlphaTranslation>0.8</AlphaTranslation>
          <GammaRotation>0.5</GammaRotation>
          <GammaTranslation>0.8</GammaTranslation>
          <KeepPoseForNumberOfFrames>165</KeepPoseForNumberOfFrames>
          <ContinueLostTrackingWithOrientationSensor>true</ContinueLostTrackingWithOrientationSensor>
          <GravityAssistance>replaceZAxis</GravityAssistance>
        </Parameters>
      </Fuser>
      <SensorSource>
        <SensorID>SLAM</SensorID>
        <SensorCosID>world</SensorCosID>
      </SensorSource>
    </COS>
  </Connections>
</TrackingData>
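For intuition on the Alpha values in the SmoothingFuser: they behave like exponential-smoothing weights, blending each new measurement with the previous filtered value. This is a conceptual sketch assuming simple exponential smoothing; the fuser's exact internals are not documented in this tutorial.

```cpp
#include <cassert>

// Exponential smoothing: higher alpha follows new measurements faster,
// lower alpha smooths more aggressively (at the cost of lag during fast motion).
double smooth(double previous, double measurement, double alpha)
{
    return alpha * measurement + (1.0 - alpha) * previous;
}
```

Under this reading, AlphaTranslation = 0.8 lets the translation react quickly, while AlphaRotation = 0.5 damps rotation changes more strongly.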

We hope you enjoyed this tutorial and can come up with fantastic AR scenarios.

Your Metaio Team

This tutorial teaches you

Using SLAM for full 6D (rotation and translation) navigation within a VR/AR scene

Key SDK functions

Using the metaio::MapTransitionHelper class from the sample
