Instant Tracking WebAR Beta - Example content and API discussion

Hi guys!

I’ve just posted an introduction to our new Instant Tracking feature - read that post first to understand why this one exists!

This post is to share all the internal details of our current InstantTracker API and to share an example project for you to play with. We’d also love to hear your thoughts on the current API and any other ideas you have for it.

Instant Tracking is exposed through a simple API via the Z.InstantTracker object, which, like all Zappar objects, you can construct by calling Z.InstantTracker() in a script.

This is the API exposed by that InstantTracker object:

class InstantTracker {
    // Get / set whether tracking is enabled
    // True by default - to save resources be sure to call enabled(false)
    // when no longer required (for example if your InstantTracker is in
    // a sub-symbol as part of a larger experience)
    enabled() : boolean;
    enabled(v: boolean): this;

    // The anchor managed by this Instant Tracker
    // Set the relativeTo property of a content group to this object
    anchor() : Node;

    // Set or reset the anchor transform relative to the camera
    // Pass in the camera-relative position for the anchor, and one
    // of the orientation options (described later)
    setAnchorTransformFromCamera(
        position: number[],
        transformOrientation: InstantTracker.TransformOrientation
    ): this;
}
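
To give a feel for how those members fit together, here's a minimal sketch of a Studio script using them. The node name ContentGroup is just a placeholder for whatever group holds your content, and the (Z as any) cast is explained further down this post.

// Minimal sketch: construct a tracker and attach a content group to its anchor.
// "ContentGroup" is a placeholder node name, not part of the API.
const tracker = (Z as any).InstantTracker();

// Content placed relative to the tracker's anchor will follow the tracked point
const contentGroup = symbol.nodes.ContentGroup;
contentGroup.relativeTo(tracker.anchor());

// To save resources, switch the tracker off when it's no longer needed:
// tracker.enabled(false);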

The transformOrientation parameter to setAnchorTransformFromCamera allows you to ensure that content placed relative to the anchor initially faces in a known direction. Here are the current options for that parameter:

enum InstantTracker.TransformOrientation {
    // “Ground-like” options
    "ground",
    "z_up_y_away_from_user",
    "z_up_y_heading",

    // “World-like” options
    "world",
    "y_up_minusz_away_from_user",
    "y_up_minusz_heading",

    // Don’t change the world-relative orientation when setting
    // the position part of the transform
    "unchanged"
}

The ground-like options all have the z axis pointing upwards in the world (away from gravity), and the world-like ones have their y axis pointing upwards. The option names describe how the orientation around that “up axis” is determined.

The demo content uses z_up_y_heading during the initial placement gesture, and then unchanged for subsequent updates, but feel free to edit the script in the example to get a feel for the other options.

If you hold your finger down, that will effectively call setAnchorTransformFromCamera every frame, so by holding down for the initial placement and moving both your finger and the device around, you'll be able to see these options in action.
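
As a rough sketch of that gesture (assuming a full-screen "TapArea" node wired up to pointer events, and that the option names from the enum above are passed as plain strings - both assumptions on my part, so check against the demo script):

const tracker = (Z as any).InstantTracker();
const tapArea = symbol.nodes.TapArea; // placeholder full-screen tap target

let placed = false;

tapArea.on("pointerdown", () => {
    // First placement gives the content a known orientation; later taps keep
    // the existing orientation and only move the anchor position.
    // [0, 0, -3] places the anchor a few units in front of the camera
    // (the exact axis convention is an assumption here).
    const orientation = placed ? "unchanged" : "z_up_y_heading";
    tracker.setAnchorTransformFromCamera([0, 0, -3], orientation);
    placed = true;
});

// The demo repeats the call each frame while the finger is held down, which is
// what lets you see the different orientation options in action.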

Here’s the example scene for you to play with:
Instant Tracking Forum Demo.zpp (3.9 MB)

The original model is from The Cleveland Museum of Art on Sketchfab. It’s open access so we can freely share it in example content without needing attribution, but we’re happy to give them a shout out anyway!

All of the important stuff happens in the script node, which is quite heavily commented.

A few points to note:

  • The CustomAnchor in the hierarchy means Studio’s tools for view manipulation can be used when editing the content. It isn’t needed by the InstantTracker at all (the script immediately resets the relativeTo of the tracked group to the InstantTracker anchor).
  • The InstantTracker API is not yet finalized, so it has not been added to Studio’s TypeScript definitions yet. Therefore if you just write Z.InstantTracker(), Studio will highlight an error. You need to tell Studio you know more than it does, using (Z as any) to opt out of the type checking when accessing those members.
  • The TypeScript definitions are also used to determine the minimum version required when publishing an experience, so until they are in place the experience will still load on older versions of Zappar that lack InstantTracker support. We’ve therefore added some manual checks that the relevant API is present, and a scene-side message if InstantTracker is not supported. Once the API is promoted to stable, this won’t be needed. There’s a short sketch of both the cast and the check just after this list.
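
Here's roughly what those last two points look like in a script - a hedged sketch, with NotSupportedMessage standing in for whatever scene node you use to tell the user about the missing support:

// Opt out of type checking for the not-yet-published API
const ZAny = Z as any;

// Manual feature check so the scene still behaves on older Zappar versions
if (typeof ZAny.InstantTracker === "function") {
    const tracker = ZAny.InstantTracker();
    // ... attach your content group to tracker.anchor() as usual
} else {
    // Placeholder node name - show some in-scene "not supported" message
    symbol.nodes.NotSupportedMessage.visible(true);
}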

I’ll follow up with some more thoughts and open questions on the API, but that should be enough to get you started.

Have fun!

Thanks, @simon, this sounds awesome.

I haven’t yet played with the demo, but I was wondering if I could get a simple 1 or 2 sentence “plain english” explanation of what exactly this does and how it might be used. For example… “Instant tracking allows you to [blah blah] without the need to [bleh bleh], thereby enabling [example use cases].”

Providing a high-level “bird’s eye” synopsis at the start of such a post would really help me gauge its relative importance to current and future projects so that I can determine what priority to place on getting up to speed on the feature.

Thanks much!

I made a separate post with the blah blah blah bits! Will add a link into the first post too.

Awesome sauce! Exactly what I was looking for! Thanks! :+1:

Thank you Simon

I have been trying to use world tracking in Studio, but it has not been exactly what I have been looking for in terms of what I envision for my 3D models. With what you have described and shown with Instant Tracking, I am now excited to try this.

Also, just a question: once the content (for example a 3D model) is placed onto the horizontal surface and stays tracked, would it be possible to have an option to resize the objects after placement?

Thank you

It would also be really cool if the scaling could be automatic based on a reference/trigger image.

Yup, manual rescaling would just need some changes to the content scripts, and so could be implemented with the current Instant Tracking beta - essentially some sort of UI connected up to the “scale” property of the content group. Two-finger pinch to zoom might seem like the obvious way to go, but users might place one finger slightly before the other, which currently would trigger the action to move the anchor to the new tap point. It just needs a bit of thinking about the best user experience here.
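
As a very rough sketch of that kind of hook (assuming two placeholder button nodes, ScaleUp and ScaleDown, wired up to pointer events, and that the group's scale accessor takes and returns an [x, y, z] array):

const contentGroup = symbol.nodes.ContentGroup; // the tracked content group

function scaleBy(factor: number): void {
    const s = contentGroup.scale();
    contentGroup.scale([s[0] * factor, s[1] * factor, s[2] * factor]);
}

// Placeholder UI nodes - any tap target away from the placement area would do
symbol.nodes.ScaleUp.on("pointerdown", () => scaleBy(1.1));
symbol.nodes.ScaleDown.on("pointerdown", () => scaleBy(0.9));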

Scaling / positioning based on a reference image is definitely on the roadmap. We use the term “extended tracking” to mean the combination of target detection / tracking with Instant or World Tracking to update the content when the target isn’t in view. As soon as Instant Tracking can cope with the anchor point moving out of view, then we’ll look at enabling extended tracking.

I have a problem

You are using the app. @simon said it only works with the beta WebAR:
https://beta.zappar.app/

Steve

Hello.
I am currently experimenting with the Instant Tracking functionality :slight_smile: It looks very promising.
My question is: can I place and track more than one object at the same time?

Hi @w.biedron,

The current version is limited to a single anchor per instant tracker. Future versions will work a bit differently and won’t require the anchor point to remain in the camera view. Once those underlying updates are in place we will add multiple anchors into the API.

That said, it is possible to use multiple independent InstantTracker objects in your scene with the current beta versions to get multiple anchors, although I wouldn’t really recommend it from a performance point of view.
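
If you do go down that route, the workaround is essentially two independent trackers, each with its own anchor and content group (node names here are placeholders):

const trackerA = (Z as any).InstantTracker();
const trackerB = (Z as any).InstantTracker();

symbol.nodes.ContentGroupA.relativeTo(trackerA.anchor());
symbol.nodes.ContentGroupB.relativeTo(trackerB.anchor());

// Each anchor is then placed with its own setAnchorTransformFromCamera call,
// e.g. from two separate taps - expect a performance cost from running both.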

Hope that helps!

I am on an iPhone XR, and I also get this same error.

“This version of the Zappar platform does not support Instant Tracking”

EDIT: My mistake. You have to edit the URL to use the domain beta.zappar.app instead of the usual web.zappar.com.

My biggest problem with the demo project is that I can’t see the 3D model in the viewport. If I change the state to “tracking”, I can see it, but then I can’t change any properties anymore (even if I switch the state back or do anything else). Am I missing something? I’m new to Zappar, but so far everything else has been pretty self-explanatory.

Hi @sascha,

Once you select the tracking state, you can click back on “base” in the list of controllers to set properties that are not overridden by the other controllers.

See the docs page here for more info:

Simon

Thank you, that was it. The Controllers window was very unfortunately sized in height, so I thought “mode” was the top-most item…

Hi - when will this be out of beta?
Right now when you use it, you have to first go to “beta.zappar…”, tap launch, then scan a zapcode, and then the experience begins…

Hi @wil,

To make this available on the main site, we’ll need to add the API to Studio’s TypeScript definitions so it is associated with a minimum “platform version” level, and then ensure that platform version is available across both the Android and iOS native apps and the web.zappar.com site. I’m afraid I can’t give a specific date for when this will happen, but I can hopefully fix the practical issue of how to share content using Instant Tracking until then.

It’s possible to link to a specific project on the beta site - you’ll need to add a query string parameter, so the URL looks a bit like this: https://beta.zappar.app/?zid=.

You can get the deep-link ID either by looking at the URL you get redirected to from a QR code or deep link trigger, or from the “Actions” dropdown in the project overview on ZapWorks:

Instant Tracking is also available through our Universal AR SDKs. In that case, WebAR projects are always accessed with a direct per-project URL, you have complete control of the landing page layout, and you can generate QR triggers in the ZapWorks CMS that redirect to the full URL directly.

Hope that helps!

Hi Simon,

Does this mean you can only trigger Instant Tracking from a Zapcode?

I have an existing project with a QR code (that the client is printing onto a box - so can’t be changed) but when I scan it with my camera then launch the experience, it gets stuck on ‘unlocking’.
How can I redirect it to the beta site instead and still launch from the QR code?

Thanks

Hi Holly,

No, our Instant World Tracking can be triggered with a QR code.

Please can you drop us an email to support@zappar.com with the QR code and as much detail as possible about the project and we can help out. We normally only offer email support to Pro and Enterprise subscribers but I feel like we need a bit more information here, so my team can help out directly over email.

Thanks,

Tom

@tom.defraine
Hi Tom,

I’ve contacted support, sending them my QR code and project, and they’ve pretty much said I have to go through the beta.zappar.app URL rather than the proper one.

Is this the case? And if so, is there a way to scan the QR code and have it automatically redirect to the beta URL instead?
Instant Tracking seems to provide the best tracking compared to image-based and world tracking, so I would prefer to use it. The only barrier seems to be the URL.

Support emails take about 4 days to get a reply, and I’ve got a client chasing me for an end-of-week delivery, so I thought I’d ask on here in the hope of a quicker response.

Many Thanks