Zappar with VR Headset

Since you decided to close this topic: Augmented reality with headset

… I have to open a new one.

How can I use the Zappar app in headset mode and target multiple “things”?

I hope you can help me with how to do this specifically.

Thank you

Hi There,

The reason I closed the topic was to avoid the same discussion taking place across duplicate threads. Apologies if your question wasn’t fully addressed in the linked topic.

Can you clarify what you mean by “targeting multiple things”? Once you scan a code and enter headset mode you’ll only be able to track content from a single tracking image or, in the case of ZapBox, a map defined by pointcodes.

Cheers,
Mark

What I want is to use the app in headset mode (with one camera stream for each eye). Can I do that?

Hi @rafaviana.26,

What you want to do isn’t supported that easily at the moment. There is a VR headset mode that shows the camera stream in stereo, but it won’t be a proper stereo image (both eyes see the world from the point of view of the single camera), so moving around large distances is a bit uncomfortable.

ZapBox is our video see-through stereo AR solution, but that still works best in smaller areas where you move slowly around the content on the floor or table in front of you, rather than exploring larger spaces. With ZapBox we render the camera image on the “world plane”, so if the world is flat (e.g. a table) then it looks correct in stereo and is more comfortable to use.

If you want to try it anyway, you can enter headset mode from a ZapWorks Studio scene. At the moment, if you want to track multiple targets independently you’ll need to use a zapcode tracking method. Add the “circle-likely-vert” zapcode target from the media library (if you’re putting the codes on vertical walls).

You’ll then need to construct a Z.TargetFinder in a script node with that media file as the source, listen for “newinstance” events for each new code it sees, and associate whatever content you want with that instance (listening for “seen” and “notseen” events on the instance to show and hide the content). This is similar to how the ZapBox demos are set up, so it’s worth taking a look at those for inspiration; there’s also a rough sketch below.

One more caveat - right now the zapcode tracking only tracks a single code at a time, but if you’ve got single codes on opposite walls that might be OK for you.
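
In case a concrete shape helps, here’s a minimal sketch of that script-node pattern. The event names (“newinstance”, “seen”, “notseen”) are the ones described above; the exact TargetFinder construction call, the media file reference, and the node name `contentGroup` are placeholders, so check the ZapBox demo projects for the exact working form in your Studio version.

```typescript
// Minimal sketch (ZapWorks Studio script node).
// Placeholder references: the media file lookup and the node name
// "contentGroup" are assumptions - adapt them to your own project.

// Build a TargetFinder with the "circle-likely-vert" zapcode media file
// as its source. The exact construction call may differ; see the
// ZapBox demos for the working form.
const targetFinder = Z.TargetFinder(symbol.mediaFile("circle-likely-vert.zpt"));

// "newinstance" fires once for each distinct zapcode the finder starts tracking.
targetFinder.on("newinstance", (instance) => {

    // Associate whatever content you want with this instance. Here we just
    // grab a hypothetical group from the hierarchy; you could instead keep a
    // map from instance to content if each code should show different content.
    const content = symbol.nodes.contentGroup;

    // Show the content while this code is being tracked...
    instance.on("seen", () => {
        content.visible(true);
    });

    // ...and hide it again when the code is lost.
    instance.on("notseen", () => {
        content.visible(false);
    });
});
```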

So it’s not a use case we’ve really optimised for, but it is possible in a fashion.

Hope that helps a bit.


Which is the best VR headset for this app? I know Google and Sony provide quality products of this kind. What other options are there?