Track Zapcode without image tracker


Is it possible to create an AR experience with the Zapcode only, without an image tracker?



Hi @aumentapp, thanks for the question.

It partly depends on which ZapWorks tool you’re using and the kind of experience you’re creating.

If you’re using Designer, you can only make image-tracked experiences. If you do not specify a target image, it will track to the Zapcode by default, but that’s not an approach we recommend: there’s no call-to-action for the user, the scaling will be off, and the Zapcode’s high contrast fundamentally makes it a poor tracking image. For reference on what does make a good tracking image, you can check out our guide.

If you’re using Studio, yes, that is possible. Face-tracked and world-tracked experiences track to a user’s face or to a place specified in the world around the user, respectively.

More generally, if you do not set up a tracking image in Studio, the experience will default to being screen relative. So when triggered, the experience plays out on the user’s screen without being anchored to anything in the world. Our 3D Photo Feature project is an example of a screen relative experience, for reference.

Hope that helps!


Thanks for the answer. Very helpful.
I am trying to create a project like this:
I think there are two ways:

The second option didn’t work because ZapWorks Designer and Studio don’t recognize the image as a tracker.

What would be the best way to create a project for TV shows?

Thank you.


You’re very welcome.

The experience you’ve noted there is not using a tracking image. That banner image is just there for branding and doesn’t serve any function in accessing the AR content; the only active part is the Zapcode.

So in this example, the Zapcode is scanned, the experience appears, and it is screen relative. This is only possible in ZapWorks Studio: if you do not specify a tracking image and it’s not a world-tracked or face-tracked experience, the content will appear relative to the position of the phone screen. That’s why, in the video linked from the Twitter post you shared, the experience stays stable in the same position when the user moves the phone around; it’s not tracked to any image or any space in the real world.

As recommended on Twitter, creating a screen relative experience would be the way to replicate the example you’re talking about. This tutorial uses that same tracking approach, so it should cover the key things you need to know to create an experience like this yourself.




Thanks, James.
That’s what I did.
Check it out: :grin: