Hi everyone,
We’ve been hard at work on our new and improved World Tracking for WebAR implementation for a while now, and I’m really pleased to share a first look with our great forum community, just in time to see in the New Year!
This new implementation removes a number of the big limitations of our previous Instant World Tracking approach: you can now look away from the placement position and the content will continue to track, and we’ve also removed the requirement for a textured horizontal surface to place content.
Here’s a video of this first look build in action on a typically cloudy and wet winter’s day at the glorious British seaside.
You can try it out yourself by scanning the QR Code, or visiting this link on your smartphone.
The source for this demo is available in the zip below, which also contains the standalone build of the Universal AR SDK for Three.js with the updated World Tracking implementation. This is exposed via the same InstantWorldTracker API as before: simply set the pose of the anchor with a camera-relative position (and optional orientation), and the library will aim to keep the content anchored at that position in the world as the user moves. In the demo, we reset this every frame during “Placement” mode, and then simply stop calling the setAnchorPoseFromCameraOffset function once the user has placed the content. Other UX choices are possible with the same API.
RobotWorldTrackingDemo.zip (3.8 MB)
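As a rough sketch of that “tap to place” flow: the tracker object below is a stub standing in for the real InstantWorldTracker from the SDK (only setAnchorPoseFromCameraOffset mirrors the actual API; everything else, including the 5-unit offset and handler names, is illustrative):

```javascript
// Stub standing in for the SDK's InstantWorldTracker - in a real project
// you'd construct one from @zappar/zappar-threejs instead.
const tracker = {
  anchorOffset: null,
  setAnchorPoseFromCameraOffset(x, y, z) {
    // In the real SDK this re-anchors the content at a camera-relative pose.
    this.anchorOffset = { x, y, z };
  },
};

let placed = false; // flips to true when the user taps to place

function onFrame() {
  if (!placed) {
    // "Placement" mode: keep resetting the anchor so the content sits
    // a fixed distance in front of the camera every frame.
    tracker.setAnchorPoseFromCameraOffset(0, 0, -5);
  }
  // Once placed, we simply stop calling setAnchorPoseFromCameraOffset;
  // the library then keeps the content anchored at that world position.
}

function onPlaceButtonTap() {
  placed = true;
}
```

In a Three.js project you’d attach your content to the tracker’s anchor group and call the per-frame logic from your render loop; the key point is just that the camera-offset pose is only set while in placement mode.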
We’re sharing this now because it’s in a state where we believe it is useful, and it has a number of clear advantages over the previous implementation. There are still some caveats: it works best in static, well-textured environments with smooth camera motion. Moving the device position without rotating works best at the start (much like the general side-to-side motion that works best for initializing native ARKit and ARCore experiences). This implementation also makes portal-type experiences possible, but as they often encourage fast rotational motion they are likely to suffer more from scale drift than outside-in content, where the user is primarily focusing on content in front of them.
Zappar is officially closed this week for the holidays, so we’ll have much more to say about this early in 2022. One of the first tasks will be making this implementation available across more of our Universal AR SDKs and distribution types (such as making beta libraries available via npm), and looking at an update to the ZapWorks Studio WebAR sites too.
I’d like to thank the awesome Zappar team for all their work to make this first release possible. This is just the first milestone on our roadmap to a more complete World Tracking for WebAR implementation. We’ve got performance and quality improvements to come early in 2022, we’ll be working on specific improvements for portal-type experiences, and we’ll also be looking at API additions now that we have the required groundwork in place, supporting other use cases such as multiple anchors and extended tracking (combining World Tracking and Image Tracking, for example).
Wishing everyone all the best for the New Year - we’re really excited by what we’ve got in store for 2022!