Using Zapcodes in a Museum(-like) Space

I’m an interactive media developer (new to Zapworks but not XR) and am looking at a project with AR attached to museum exhibits.

Based on Zapworks’ capabilities, it seems like using Zapcodes that are physically on the exhibits could work.

Just a couple questions / concerns:

  1. Is there any issue with having a number of zapcodes in the same physical space? How does the app prioritize between them?
  2. Can zapcodes themselves be used as the anchor location for AR experiences, or do I need an additional tracking image? I did a quick test using my zapcode as the source for a tracked image, and it seemed to work as both the launcher and the tracker. Is that a good way to do this?
  3. My client would probably prefer that the experiences not be portable, as they are designed to complement a physical exhibit. Any recommendations to help ensure the experiences are only used while actually looking at the exhibits?

Thanks for your thoughts.

As far as I understand it, you can have as many zapcode triggers as you wish, and each zapcode triggers its own separate experience. If you mean having multiple codes visible on screen while scanning, I assume the Zappar app will simply scan whichever it recognizes first, so just make sure there's enough clear space around each code that visitors can scan the one they intended.

Edit: If you intend to use only one trigger, but want zapcodes as multiple tracking images in the same experience, I highly recommend checking out this thread on multiple tracking images. It's not officially supported by Zapworks Studio (if that happens to be the way you're designing the experience) and requires some scripting, but it is technically possible; there's a rough sketch of the idea below. I will say that if your tracking images are too similar (as I worry plain zapcodes would be), you might trigger the wrong element by accident.
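
For illustration, here is a minimal sketch of tracking two separate target images in one experience, using Zappar's Universal AR SDK for three.js rather than Studio. The `.zpt` file names and the placeholder content are hypothetical; treat this as a sketch of the idea, not a drop-in solution:

```ts
import * as THREE from "three";
import * as ZapparThree from "@zappar/zappar-threejs";

const renderer = new THREE.WebGLRenderer();
document.body.appendChild(renderer.domElement);

// Zappar needs the GL context before the camera is created
ZapparThree.glContextSet(renderer.getContext());

const camera = new ZapparThree.Camera();
const scene = new THREE.Scene();
scene.background = camera.backgroundTexture;

// Ask the user for camera permission, then start the feed
ZapparThree.permissionRequestUI().then((granted) => {
  if (granted) camera.start();
  else ZapparThree.permissionDeniedUI();
});

// One ImageTracker per exhibit target (hypothetical .zpt files trained in Zapworks)
const exhibitATracker = new ZapparThree.ImageTrackerLoader().load("exhibit-a.zpt");
const exhibitBTracker = new ZapparThree.ImageTrackerLoader().load("exhibit-b.zpt");

// Each anchor group follows its own target's pose
const exhibitAGroup = new ZapparThree.ImageAnchorGroup(camera, exhibitATracker);
const exhibitBGroup = new ZapparThree.ImageAnchorGroup(camera, exhibitBTracker);
scene.add(exhibitAGroup, exhibitBGroup);

// Distinct placeholder content so you can tell which target was recognized
exhibitAGroup.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshBasicMaterial({ color: 0xff0000 }),
));
exhibitBGroup.add(new THREE.Mesh(
  new THREE.SphereGeometry(0.5),
  new THREE.MeshBasicMaterial({ color: 0x0000ff }),
));

function render() {
  camera.updateFrame(renderer); // feed the latest camera frame to Zappar
  renderer.render(scene, camera);
  requestAnimationFrame(render);
}
requestAnimationFrame(render);
```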

First, I'll recommend checking out the documentation on What Makes A Good Tracking Image if you haven't read through it already. It provides some helpful tips for making sure your experience isn't buggy or shaky, since a poor tracking image can cause jitters as Zappar tries to anchor everything to it. The short version: make sure there's enough visual information for Zappar to latch onto, and know that the tracking image is 'seen' by Zappar as, essentially, a black and white image. You need enough contrast between your lights and darks; hue/color has no impact there. A quick way to preview that is sketched below.
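
One way to sanity-check a candidate image is to desaturate it yourself and eyeball the contrast. This is only a rough approximation of whatever preprocessing Zappar actually applies (an assumption on my part), but it quickly catches images that rely entirely on hue:

```ts
// View a candidate tracking image roughly the way a tracker "sees" it:
// luminance only, no color information.
function toGrayscalePreview(img: HTMLImageElement): HTMLCanvasElement {
  const canvas = document.createElement("canvas");
  canvas.width = img.naturalWidth;
  canvas.height = img.naturalHeight;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(img, 0, 0);

  const data = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const px = data.data;
  for (let i = 0; i < px.length; i += 4) {
    // Standard Rec. 601 luminance weights
    const y = 0.299 * px[i] + 0.587 * px[i + 1] + 0.114 * px[i + 2];
    px[i] = px[i + 1] = px[i + 2] = y;
  }
  ctx.putImageData(data, 0, 0);
  return canvas;
}

// Usage: load the image, then inspect the grayscale version for contrast
const img = new Image();
img.onload = () => document.body.appendChild(toGrayscalePreview(img));
img.src = "candidate-target.png"; // hypothetical filename
```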

That being said, your tracking image can be anything you like, so if it works in testing, I don't see why the zapcode alone can't serve as the tracking image. I'm not sure what type of exhibit you're working with, but you also don't have to track an image at all: world/instant tracking is an option as well (sketched below).
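
Here's what instant (world) tracking looks like with the same Universal AR SDK for three.js, again as a hedged sketch rather than a definitive recipe. It assumes the renderer/camera/scene setup from the image-tracking snippet above:

```ts
// Content is anchored to a point in the room instead of a printed image.
const instantTracker = new ZapparThree.InstantWorldTracker();
const instantGroup = new ZapparThree.InstantWorldAnchorGroup(camera, instantTracker);
scene.add(instantGroup);

instantGroup.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshBasicMaterial({ color: 0x00ff00 }),
));

// Let the visitor tap to place the content in the room
let placed = false;
renderer.domElement.addEventListener("click", () => { placed = true; });

function renderInstant() {
  if (!placed) {
    // Until placement, keep the anchor floating ~3 m in front of the camera
    instantTracker.setAnchorPoseFromCameraOffset(0, 0, -3);
  }
  camera.updateFrame(renderer);
  renderer.render(scene, camera);
  requestAnimationFrame(renderInstant);
}
requestAnimationFrame(renderInstant);
```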

While the Zappar app does log experiences you've scanned so they can be reopened later, the tracked content won't appear unless the tracking image is actually in view, so that alone is a limiting factor. Granted, someone could theoretically take a photo of your tracking image and take the experience with them, but I'm not sure how many people would actually bother. You can also make that gating explicit by only showing content while the target is visible (sketch below).
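
In the Universal AR SDK, the ImageTracker exposes visibility events you can use to hide content the moment the physical target leaves the camera view. A two-line sketch, building on the multi-target snippet above:

```ts
// Assumes `exhibitATracker` and `exhibitAGroup` from the multi-target sketch.
// Hide the exhibit content whenever the physical target is out of view.
exhibitATracker.onVisible.bind(() => { exhibitAGroup.visible = true; });
exhibitATracker.onNotVisible.bind(() => { exhibitAGroup.visible = false; });
```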

I'm not sure what can be done with tracking a user's location, as that likely involves scripting beyond my personal skill level, so I hope someone else can chime in on whether it's a possibility. My own focus is WebAR, and I haven't done it there myself, so apologies for not being much help on that front.
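
That said, a WebAR experience is ultimately a web page, and the standard browser Geolocation API is available on HTTPS pages, so a coarse proximity gate seems at least plausible. A sketch, with entirely hypothetical coordinates and radius:

```ts
// Coarse location gate using the standard browser Geolocation API.
// The museum coordinates and the 200 m radius are hypothetical placeholders.
const MUSEUM = { lat: 51.5007, lon: -0.1246 };
const MAX_DISTANCE_M = 200;

// Haversine distance between two lat/lon points, in metres
function distanceMeters(aLat: number, aLon: number, bLat: number, bLon: number): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const h = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

navigator.geolocation.getCurrentPosition((pos) => {
  const d = distanceMeters(pos.coords.latitude, pos.coords.longitude, MUSEUM.lat, MUSEUM.lon);
  if (d > MAX_DISTANCE_M) {
    // e.g. hide the AR content and show a "visit the exhibit" message instead
    document.body.textContent = "This experience is only available at the exhibit.";
  }
});
```

Bear in mind GPS accuracy indoors is poor, so something like this can only keep the experience near the building rather than at a specific exhibit; the visibility gating above is the finer-grained control.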


Hope that makes sense and has at least some value to you!
-Atlas

Thanks @biroatlas, yes, that makes sense. Your replies pretty much confirmed what I’ve determined from my initial review and tests.
