UAR and World Tracking

Is this possible yet? I tested out Instant Tracking but it doesn't achieve the result I'm after. I want to use UAR via the Unity SDK to create a world tracking app that lets me fix AR content to a table.

Hi @mhussar,

I'd recommend checking out the thread below and making use of our beta version of instant world tracking for Unity.

Hope this helps.

George

Thanks so much George. I’ll do as you suggest.
Cheers,

Michael


Hi George,
I did as you suggested and rebuilt a project. I have to say the performance difference between World Tracking via ZapWorks Studio and Instant World Tracking via UAR (PlayCanvas, Unity and Three.js; I tested all three) is night and day. World Tracking via ZapWorks Studio is excellent and rock-solid, but Instant World Tracking via UAR is, quite frankly, unusable for production. I even tested the robot example created by the Zappar team and it is not stable. I'm not sure if I'm doing something wrong (I can't imagine I am, since it's pretty straightforward), but the AR content does not stay fixed in position or scale. If you move your mobile phone camera, it "drags" along with it. It does not stay in place like it does in the ZapWorks Studio World Tracking implementation. Perhaps this is just as good as it gets for now with UAR? Please let me know what I'm missing here.
Thanks!

Hi @mhussar,

Thanks for the feedback.

Full world tracking in ZapWorks Studio is built on ARKit and ARCore, which is fantastic for native applications. With this implementation the device builds up a pretty good knowledge of the environment, but it requires the phone "wiggle" at the start to first map the space.

Instant world tracking (WebAR) uses a slightly different, bespoke implementation that enables us to support environment tracking in the web browser. We call this "instant" because it doesn't require the phone wiggle; instead it does its best to track to a patch of the surface in front of the camera when the anchor is placed. There are a few caveats that come with instant world tracking because of this. Firstly, we generally recommend looking at a textured surface, so the anchor has something to "stick" to. Secondly, keeping the anchor in the camera view (where you can) helps improve stability.
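To illustrate the typical flow, here's a rough sketch of a tap-to-place setup using the Three.js flavour of Universal AR. This is a sketch only: the names come from the standard @zappar/zappar-threejs package and may differ slightly in the beta, and the permission-request UI is omitted for brevity.

```js
import * as THREE from 'three';
import * as ZapparThree from '@zappar/zappar-threejs';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// The Zappar camera renders the device camera feed behind your content
ZapparThree.glContextSet(renderer.getContext());
const camera = new ZapparThree.Camera();

const scene = new THREE.Scene();
scene.background = camera.backgroundTexture;

// Anything parented to this group is positioned relative to the anchor
const tracker = new ZapparThree.InstantWorldTracker();
const trackerGroup = new ZapparThree.InstantWorldAnchorGroup(camera, tracker);
scene.add(trackerGroup);

trackerGroup.add(new THREE.Mesh(
  new THREE.BoxGeometry(0.2, 0.2, 0.2),
  new THREE.MeshBasicMaterial({ color: 0xff0000 }),
));

camera.start();

// Until the user taps, re-place the anchor ~2 units in front of the camera
// every frame; after the tap the anchor stays put and tracking takes over.
let placed = false;
window.addEventListener('click', () => { placed = true; });

renderer.setAnimationLoop(() => {
  camera.updateFrame(renderer);
  if (!placed) tracker.setAnchorPoseFromCameraOffset(0, 0, -2);
  renderer.render(scene, camera);
});
```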

That said, our beta version of instant world tracking should improve on these points and make them less of a requirement for maintaining good tracking stability (see Simon's post below for more info).

I’ve attached an example of instant world tracking, working well on a textured surface with some pretty swift camera movements. You’ll notice that there is some drift but all in all it’s pretty stable.

When positioning content in your experience, it is extremely important that you position anchor-first, not content-first. This means you should start by adjusting the Instant Tracker's setAnchorPoseFromCameraOffset(x, y, z) parameters, rather than offsetting the x, y or z positions of individual pieces of content (e.g. a mesh).
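To make that distinction concrete, continuing with the hypothetical Three.js setup sketched above:

```js
// Anchor-first (recommended): move the anchor itself, and keep your content
// at (or very close to) the anchor group's origin.
tracker.setAnchorPoseFromCameraOffset(0, -0.5, -2);

// Content-first (avoid): pushing the mesh away from the anchor origin instead,
// e.g. mesh.position.set(0, -0.5, -2), makes any drift in the anchor pose
// far more noticeable because the content sweeps through a larger arc.
```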

It may also be helpful to make use of axes helpers when developing your experience. These helpers visualise the three axes (x, y, z), usually with a colour-coded, gizmo-type appearance. Axes helpers tend to be more useful when developing with web technologies, as tools like Unity and PlayCanvas already have visual aids built in.
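For example, with Three.js you could drop a THREE.AxesHelper into the anchor group from the sketch above while developing, then remove it before release:

```js
// Draws the three axes at the anchor origin (x = red, y = green, z = blue)
trackerGroup.add(new THREE.AxesHelper(0.5));
```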

Could you share the example you've put together so we can give it a try and see if there's anything obvious you might be running into?

George
