Turning an AR project into an MR (ZapBox) experience

This topic will take you through the steps necessary to turn your existing AR project into an MR project tracked from ZapBox pointcodes.

This process is a temporary solution while we work on improving ZapBox support within ZapWorks Studio.


Preparation

Before we begin there are a few things you’ll need to prepare:

  1. The ZapBox controllers and world .zpt tracking file, which can be downloaded here - zapbox_controllers_and_world.zpt (21.8 KB). This allows tracking from the pointcodes and controllers within Studio.

  2. The project we'll be using in this example is the one covered in the Step by Step Streaming Video in the 3D View article, which can be downloaded here - SBS Video.zpp (1.0 MB), though feel free to follow along using a project of your own.

Project

Please refer to the screenshots below each step for clarification on the actions required.

  1. Open your AR project in Studio. In the project we're converting, we have a 3D model of a TV, a video player and a script node within a group, tracking from a target image.

  2. Drag the controllers and world .zpt tracking file into the import box of the Media Library. It should appear as an entry in the Media Library once successfully imported.

  3. Create a new group by right-clicking on the root node and name it 'world'. This group will represent the world space mapped out by the pointcodes and will be used later on.

  4. Drag the content you want tracked to the pointcodes into the world group. In this project the content sits within the TV Group node, so we drag that into the world group node. This will cause the content to jump on screen, but don't worry; it will appear correctly when we run the experience in ZapBox.

  5. The previous target in the Hierarchy can now be deleted by right-clicking on it. We'll be using the ZapBox target file we imported into the Media Library in a script later on; the Zappar platform does not currently support more than one target file per project.

  6. Create a new blank script node from the right-click menu of the root node. This script node will house the code used to identify the pointcodes and place the content on them.

  7. With the new script node selected in the Hierarchy, click and drag the world group node into the script editor window and select 'Insert local variable'. This gives us a reference to the node which we will use later on.

  8. Create a new Z.TargetFinder (reference code below) and set its source to the controllers and world target file by dragging the file from the Media Library into the script editor window, between the parentheses, and selecting 'Insert media file reference'.

    // Set the source by inserting the media file reference between the parentheses
    var targetFinder = Z.TargetFinder();

Your script should now look something like this:
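As a rough sketch (the node reference Studio generates depends on your node names, so symbol.nodes.world below is an assumption, and the media file reference is shown as a placeholder):

    // Reference to the 'world' group, generated by 'Insert local variable'
    var world = symbol.nodes.world;

    // Target finder; insert the ZapBox media file reference between the parentheses
    var targetFinder = Z.TargetFinder(/* media file reference */);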

  9. Create a newinstance event handler function for the target finder, which will be passed the detected target instance as an argument (reference code below). Please see our General Principles page for more information on event handler functions.

    targetFinder.on("newinstance", function(instance) {

    });

  10. Within the newinstance event handler function, add an if statement that checks the instance's identifier() to see if it equals 'map' (reference code below). The map defines the space laid out by the pointcodes.

    if (instance.identifier() === "map") {

    }

  11. Inside the if statement, set the world node (using its variable name in the script) relativeTo the instance identified as the map (see reference code below).

The instance needs to be cast as an any type to avoid throwing a type error; this is necessary at this point in time.

    world.relativeTo(<any>instance);
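Putting steps 7 to 11 together, the whole script should look something like the sketch below. Again, the world variable name and the media file reference depend on your own project, so treat these as assumptions:

    // Reference to the 'world' group ('Insert local variable' in step 7)
    var world = symbol.nodes.world;

    // Target finder sourced from the ZapBox controllers and world zpt
    // ('Insert media file reference' in step 8; shown as a placeholder here)
    var targetFinder = Z.TargetFinder(/* media file reference */);

    targetFinder.on("newinstance", function(instance) {
        // The 'map' instance represents the space laid out by the pointcodes
        if (instance.identifier() === "map") {
            // Cast to any to avoid a type error (see note above)
            world.relativeTo(<any>instance);
        }
    });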

  12. To preview our experience more easily in Studio, we'll create a new group, call it 'world-sim', and place our world group within it.

  13. Set the relativeTo property of the world-sim group to Z.Camera and position it back from the screen along the Z-axis. How far you'll want to move it depends on the scale of your content, but anything between -1 and -5 should suffice.
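If you'd rather set this offset in script, something like the line below should also work, assuming you've inserted the world-sim group into a script as a local variable named worldSim (the -3 is just an example value in the suggested range):

    // worldSim: assumed local variable reference to the world-sim group
    // Move the group 3 units back from the camera along the Z-axis
    worldSim.position([0, 0, -3]);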

  14. The world-sim group can now be positioned, scaled and rotated without risk of altering how the content appears when mapped to the pointcodes within ZapBox. This is because only the world group has its relativeTo property set to the pointcode map, so the transforms of the world-sim group are ignored when the experience runs on a device.

  15. Remove the old tracking file from the Media Library (by right-clicking on it). It's no longer used, and removing it will reduce the download size of our zap.

  16. Preview or publish the project and scan the code within ZapBox by selecting the 'Scan Code' option.

Make sure you’ve calibrated your lens and built your map before scanning the code.

Below is a screen grab from a device running ZapBox after scanning our preview code. We set out a simple '5 dice pattern' of pointcodes for the demonstration.


Further Considerations

  1. To position content at the center of the pointcodes, make sure all content transforms are zeroed out ([0,0,0]), as well as those of the group(s) the content is placed within (not including the world-sim group). In the example we covered, the TV group is placed back and to the left a little, and is also rotated slightly, which you can see reflected in the screenshot above.
  2. The scale of content is such that a plane with a scale of [1,1,1] is 2 meters wide and tall in the real world (1 meter in each direction from the center point). You can use this as a guide when scaling your content.
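As a worked example of that second point: since a scale of 1 spans 2 meters, you can divide a desired real-world size in meters by 2 to get the scale value. The helper below is purely hypothetical and assumes content that, like the built-in plane, spans 2 units at a scale of 1:

    // Hypothetical helper: a scale of 1 spans 2 m, so divide meters by 2
    function scaleForMeters(meters: number): number {
        return meters / 2;
    }

    // e.g. a TV intended to be 1.2 m wide needs a scale of 0.6 on that axis
    var tvWidthScale = scaleForMeters(1.2);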

Thanks for this, worked nicely.

In order to make it a truly MR experience, what is the process for making it stereoscopic so it can be used in the headset?


Thank you! Now I just have to wait for my ZapBox to arrive… (it was supposed to be here by now)

Hey Alan,

We’ve got the following section on our docs which discusses enabling headset mode within Studio experiences - https://docs.zap.works/studio/headsets/overview/

If you need any help with it let us know :slight_smile:

Mark

Hi Mark

It took me a while to get back to this, but I had no luck sorting it; it's just not as easy as I hoped. Do you think there will ever be an end-to-end tutorial on how to get a model in and build a UI, with buttons to be used with the controllers, all in the headset? Bits here and there aren't great. Will we ever see any proper ZapBox tutorials?

Hi Alan,

You will indeed see more in-depth and complete tutorials and documentation; unfortunately, we can't confirm exactly when that will be.

We’re first looking to improve our ZapBox support within Studio to make it easier and more visible :slight_smile:

Do keep an eye out here and on the docs :slight_smile:

Thanks,
Mark