Instant Tracking WebAR Beta - Example content and API discussion


Hi guys!

I’ve just posted an introduction to our new Instant Tracking feature - read that post first to understand why this one exists!

This post is to share all the internal details of our current InstantTracker API and to share an example project for you to play with. We’d also love to hear your thoughts on the current API and any other ideas you have for it.

Instant Tracking is exposed through a simple API via the Z.InstantTracker object which, like all Zappar objects, you construct by calling Z.InstantTracker() in a script.

This is the API exposed by that InstantTracker object:

class InstantTracker {
    // Get / set whether tracking is enabled
    // True by default - to save resources, be sure to call enabled(false)
    // when it is no longer required (for example if your InstantTracker is
    // in a sub-symbol as part of a larger experience)
    enabled(): boolean;
    enabled(v: boolean): this;

    // The anchor managed by this Instant Tracker
    // Set the relativeTo property of a content group to this object
    anchor(): Node;

    // Set or reset the anchor transform relative to the camera
    // Pass in the camera-relative position for the anchor, and one
    // of the orientation options (described later)
    setAnchorTransformFromCamera(
        position: number[],
        transformOrientation: InstantTracker.TransformOrientation
    ): this;
}

The transformOrientation parameter to setAnchorTransformFromCamera allows you to ensure that content placed relative to the anchor initially faces in a known direction. Here are the current options for that parameter:

enum InstantTracker.TransformOrientation {
    // "Ground-like" options - the z axis points upwards in the
    // world (away from gravity)
    z_up_y_heading,
    // ...

    // "World-like" options - the y axis points upwards in the world
    // ...

    // Don't change the world-relative orientation when setting
    // the position part of the transform
    unchanged,
}
The ground-like options all have the z axis pointing upwards in the world (away from gravity), and the world-like ones have their y axis pointing upwards. The option names describe how the orientation around that “up axis” is determined.

The demo content uses z_up_y_heading during the initial placement gesture, and then unchanged for subsequent updates, but feel free to edit the script in the example to get a feel for the other options.

If you hold your finger down, the demo effectively calls setAnchorTransformFromCamera every frame, so by holding down for the initial placement and moving both your finger and the device around you'll be able to see these options in action.

Here’s the example scene for you to play with:
Instant Tracking Forum Demo.zpp (3.9 MB)

The original model is from The Cleveland Museum of Art on Sketchfab. It’s open access so we can freely share it in example content without needing attribution, but we’re happy to give them a shout out anyway!

All of the important stuff happens in the script node, which is quite heavily commented.

A few points to note:

  • The CustomAnchor in the hierarchy means Studio’s tools for view manipulation can be used when editing the content. It isn’t needed by the InstantTracker at all (the script immediately resets the relativeTo of the tracked group to the InstantTracker anchor).
  • The InstantTracker API is not yet finalized, so it has not been added to Studio's TypeScript definitions. As a result, if you just write Z.InstantTracker(), Studio will highlight a type error. You need to tell Studio you know more than it does by using (Z as any) to opt out of the type checking when accessing these members.
  • The TypeScript definitions are also used to determine the minimum version required when publishing an experience, so until they are in place the experience will still load on older versions of Zappar without InstantTracker support. We've therefore added manual checks that the relevant API is present, along with a scene-side message if InstantTracker is not supported. Once the API is promoted to stable this won't be needed.

I’ll follow up with some more thoughts and open questions on the API, but that should be enough to get you started.

Have fun!

Introducing Instant Tracking

Thanks, @simon, this sounds awesome.

I haven’t yet played with the demo, but I was wondering if I could get a simple 1 or 2 sentence “plain english” explanation of what exactly this does and how it might be used. For example… “Instant tracking allows you to [blah blah] without the need to [bleh bleh], thereby enabling [example use cases].”

Providing a high-level “bird’s eye” synopsis at the start of such a post would really help me gauge its relative importance to current and future projects so that I can determine what priority to place on getting up to speed on the feature.

Thanks much!


I made a separate post with the blah blah blah bits! Will add a link into the first post too.


Awesome sauce! Exactly what I was looking for! Thanks! :+1:


Thank you Simon

I have been trying to use world tracking in Studio, but it hasn't been exactly what I'm looking for in terms of what I envision for my 3D models. With what you have described and shown with Instant Tracking, I am now excited to try this.

Also, just a question: once the content (for example a 3D model) is placed onto a horizontal surface and stays in place, would it be possible to have an option to resize the objects once placed?

Thank you


It would also be really cool if the scaling could be automatic based on a reference/trigger image.


Yup, manual rescaling would just need some changes to the content scripts, and so could be implemented with the current Instant Tracking beta. It would just need some sort of UI connected up to the “scale” property of the content group. Two-finger pinch to zoom might seem like the obvious way to go but users might place one finger slightly before the other, which currently would trigger the action to move the anchor to the new tap point. It just needs a bit of thinking about the best user experience here.

Scaling / positioning based on a reference image is definitely on the roadmap. We use the term “extended tracking” to mean the combination of target detection / tracking with Instant or World Tracking to update the content when the target isn’t in view. As soon as Instant Tracking can cope with the anchor point moving out of view, then we’ll look at enabling extended tracking.


I have a problem


You are using the app; @simon said it only works with the beta WebAR.



I am currently experimenting with Instant tracking functionality :slight_smile: It looks very promising.
My question is - Can I place and track more than one object at the same time?


Hi @w.biedron,

The current version is limited to a single anchor per instant tracker. Future versions will work a bit differently and won’t require the anchor point to remain in the camera view. Once those underlying updates are in place we will add multiple anchors into the API.

I wouldn’t really recommend this from a performance point-of-view, but it is possible to use multiple independent InstantTracker objects in your scene with the current beta versions to get multiple anchors.

Hope that helps!