We’d love to be able to show panoramic / 360 shots in WebAR, but at the moment the only way to initialise the AR experience is with image tracking, and as soon as the target is lost, the AR camera freezes, which is logical.
World tracking isn’t available in WebAR, but is Zap able to access gyro information?
Ultimately we’d like to be able to keep tracking the phone’s approximate orientation (that is, which way the device is pointing in 3D space, not “portrait vs landscape”) even after the target image is lost. That way we could use an image target to start an AR experience, reveal a VR360-style video or image that surrounds the viewer, and let the user look around it with their phone.
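For what it’s worth, browsers do expose device orientation independently of any tracking SDK via the standard `DeviceOrientationEvent` API, so one fallback (if Zappar doesn’t surface gyro data itself) might look roughly like the sketch below. It converts the event’s `alpha`/`beta`/`gamma` angles (intrinsic Z-X'-Y'' order, per the W3C DeviceOrientation spec) into a quaternion you could feed to a 360 camera. The `camera.setRotation` call is hypothetical, just a stand-in for whatever your renderer uses, and note that iOS 13+ requires a `DeviceOrientationEvent.requestPermission()` call from a user gesture first.

```javascript
// Sketch: reading browser gyro/orientation data for a look-around 360 view.
// Assumes deviceorientation events are permitted (iOS 13+ needs
// DeviceOrientationEvent.requestPermission() from a user gesture).

const DEG = Math.PI / 180;

// Hamilton product: a * b
function qMul(a, b) {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

// Quaternion for a rotation of `deg` degrees about a single axis.
function qAxis(axis, deg) {
  const half = (deg * DEG) / 2;
  const s = Math.sin(half);
  return {
    w: Math.cos(half),
    x: axis === 'x' ? s : 0,
    y: axis === 'y' ? s : 0,
    z: axis === 'z' ? s : 0,
  };
}

// deviceorientation angles are intrinsic Z-X'-Y'': alpha about z,
// then beta about x', then gamma about y''.
function orientationToQuaternion(alpha, beta, gamma) {
  return qMul(qMul(qAxis('z', alpha), qAxis('x', beta)), qAxis('y', gamma));
}

// In a browser, feed each event into the (hypothetical) 360 camera:
if (typeof window !== 'undefined') {
  window.addEventListener('deviceorientation', (e) => {
    const q = orientationToQuaternion(e.alpha || 0, e.beta || 0, e.gamma || 0);
    // camera.setRotation(q); // hypothetical renderer API
  });
}
```

This wouldn’t give world tracking (no positional data, and it drifts relative to the image target’s frame), but for a viewer-centred 360 video that only needs approximate orientation after target loss, it might be enough.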
I guess this comes down to whether or not ARKit/ARCore gyro data is available to WebAR… is it?