World Tracking only at walls possible?

Hi, first post here since I'm looking into Web AR world tracking.

Is it possible to use “wall” tracking with the Universal AR SDK?
Is it possible to only(!) use “wall” tracking, preventing content from being placed on the “floor”?

Where is the documentation about such things? I couldn't find an API reference for A-Frame; I only see things in the sample projects. For example this code:

If I want to know more about “placement-mode”, where would I find the info?

After watching the “Introducing new and improved world tracking for WebAR” webinar a few minutes ago, I can answer my own first two questions:

  1. At the moment, plane detection isn't possible (it's on the roadmap), so neither a “wall” nor the “ground” can be detected.
  2. Not possible because of 1.

It seems that wall tracking within ZapWorks is not yet possible, yeah. However, a workaround could be to keep your content at a fixed height above the ground and just have the users line up the anchor with the bottom of the wall. George talks more about that workaround in this thread:


Hi @abiro, thanks for the reply.
Do you think it's possible to create all of the shown logic / visuals with Universal AR for WebAR? George says that the demo is in the Zappar app.

EDIT: Just downloaded the .zpp
I'm new to Zappar and the Zappar app. I could extract the contents of the .zpp by renaming it to .zip, but I can't access the code of the sample. Do you provide the logic for the sample in the video? Thanks

And one thing I don't understand: yesterday in the webinar it was mentioned that ground / plane detection is currently not implemented in Web AR and is on the roadmap.

But in the video it seems like the dots on the bottom detect the ground plane? I like what I see in the video and would like to get that working in Universal AR (A-Frame). Is the code / sample available somewhere? Thanks

Unfortunately, once you get beyond Studio, it goes a bit over my head. I imagine it's somewhat possible to recreate the experience in the aforementioned thread in Universal AR, but I just don't know how to do that :frowning: Sorry I can't be of more help!


Hi @abiro, thanks for your answer and the initial hint to that other thread.
I'm new to the forum: it says ZapWorks Champion, but I think you're a developer like me? Not an official ZapWorks dev?
So I hope a ZapWorks dev can help and answer. That's why I also tagged @George :slight_smile:

Ah! The title “ZapWorks Champion” is given to folks who are really active on the forum and generally knowledgeable about ZapWorks. It just happens that my area of expertise is Studio / image tracking. But correct, I'm not hired by Zappar as a developer.

The actual badge description is: “Awarded to the most active users of the ZapWorks Forum, who go above and beyond in helping out fellow ZapWorks users, providing awesome project solutions and being a friendly face that users can count on!”


Hi @rlonau,

Great questions, happy to help.

“Wall tracking” is not currently possible in any of our tools (there is a workaround, mentioned later).

Although our Studio world tracking implementation is built upon ARKit and ARCore, we don't expose their horizontal plane tracking as we find it can be quite inconsistent across devices.

For Universal AR and its instant world tracking variant, we don't currently detect the ground plane, so plane detection (horizontal or vertical) isn't possible there either.

In the past we've seen faked “wall tracking” experiences built by placing an indicator on the floor during the placement state. This indicator can be matched up with the bottom of a wall (where it meets the floor), and your content can be positioned as if it were placed on the wall above that indicator. When the place button is tapped, the content becomes visible and, from the user's perspective, appears tracked to the wall.
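The positioning in that workaround boils down to a fixed vertical offset from the placed floor anchor. As a rough sketch (the function name and numbers here are made up for illustration; this is not SDK code):

```javascript
// Sketch of the “fake wall tracking” workaround: the anchor is placed on
// the floor where the wall meets it, and the content is offset upward
// (local +Y) so it appears attached to the wall above the anchor.

function wallContentPosition(anchor, heightAboveFloor) {
  // Content sits directly above the anchor at the desired wall height.
  return { x: anchor.x, y: anchor.y + heightAboveFloor, z: anchor.z };
}

// Example: a picture frame whose centre should hang 1.5 m up the wall,
// with the anchor placed at the wall/floor junction 2 m in front of us.
const anchor = { x: 0, y: 0, z: -2 };
const framePos = wallContentPosition(anchor, 1.5);
// framePos is { x: 0, y: 1.5, z: -2 }
```

In A-Frame terms this is simply the content entity getting a `position` like `0 1.5 0` relative to the tracker anchor.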

Our previous implementation of instant tracking wasn't well suited to this workaround, as the anchor point needed to be in the camera view at all times. The new beta version of the instant tracker should handle it much better, since the content stays much more stable without the anchor in view. That said, we'd still recommend asking the end user to stand far enough back that the bottom of the wall and the anchor point are visible in the camera view, to improve tracking stability.

Here’s an example of that “faked” workaround in action:

And a QR code if you’d like to test:

As you can see, it's not amazing, but it might be OK for some specific use cases. Definitely worth testing thoroughly if you want to use it in a live project.

Hope this helps.



Hi @George,

thanks for all the info! I wanted to try it out with the given QR code, but I get the error shown after tapping the “LAUNCH” button. Tested on an iPhone X (Safari and Chrome), over both Wi-Fi and a mobile connection, both of which work fine with all other apps. Any idea what's wrong?

Basically I like what I see in the video and want to have this as a Universal AR (a-frame) project to build upon.

  1. Is the use-case shown in the video possible in Universal AR (beta)?

  2. Is this code somehow available? As a tutorial or github project?


Hi @rlonau,

Looks like we’ve found a bug with our new beta site. I’ve reported it to the team. In the meantime, here’s a version of the same project running with the stable implementation of instant world tracking:

(Since the example above doesn't use the new and improved instant world tracking for WebAR, the anchor should be in the camera view at all times to maintain stability.)

In regards to the other questions:

1. Is the use-case shown in the video possible in Universal AR (beta)? Yes, simply clone the beta branch of A-Frame and set up the instant tracker with the content positioned correctly (above the placed anchor).

2. Is this code somehow available? As a tutorial or github project? Not currently; this was just a quick demo to showcase the idea.
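For anyone finding this thread later, the setup described above might look roughly like this in A-Frame with the Universal AR SDK. This is an unofficial sketch based on the SDK's `zappar-instant` component and its `placement-mode` property; the sizes and IDs are invented, so check the current Universal AR docs for the exact API:

```html
<!-- Sketch only: assumes the @zappar/zappar-aframe SDK is loaded. -->
<a-scene>
  <!-- Camera driven by Zappar's tracking -->
  <a-camera zappar-camera></a-camera>

  <!-- Instant world tracker; while placement-mode is true the anchor
       follows a point in front of the camera until the user places it -->
  <a-entity zappar-instant="placement-mode: true;" id="tracker">
    <!-- Content offset upward so it sits on the “wall” above the
         floor anchor (1.5 m in this example) -->
    <a-box position="0 1.5 0" width="1.5" height="0.9" depth="0.05"></a-box>
  </a-entity>
</a-scene>
```

Tapping a “place” button would then set `placement-mode` to `false` on the tracker entity to pin the anchor in the world.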

Hope this helps.



Hi @George,
thanks for the demo and the answers to my questions. I've tried it out and also created my own demo:

I attended the last webinar and wanted to ask about “scale”. Right now it's not possible to get the correct scale / size. E.g. the TV in your example is 1.5 m wide in real life, but in current Web AR the size varies depending on the view / angle at which the user holds their device when they “tap to place” the content, right?

In the webinar slides, “scale” was mentioned under both “limitations” and the “roadmap”. My impression from the roadmap slide was that it's pretty difficult to get this working on the web? Can you explain a bit and give an estimate of IF and WHEN this could potentially be available?

Hi @rlonau,

Thanks for getting back and sharing the demo.

You're correct. Getting real-world scale is quite difficult to achieve, especially in the browser (with all its quirks). The easiest way to do this is by using something we call extended tracking. Extended tracking requires a printed code / target of a known size. This way we know the size of something in the real world and can scale the digital content accordingly. The content will then world track to the environment when the target / code isn't in the camera view, allowing you to develop projects like the above. This feature is on our roadmap and will probably be our first shot at getting real-world scale of digital content working in the browser.
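To illustrate why a known-size target solves the scale problem, here's a sketch of the underlying arithmetic (all names and numbers are invented for illustration; this is not SDK code):

```javascript
// A printed target of known physical size fixes the metres-per-unit scale
// of the tracked scene; that scale can then be applied to any content that
// must appear at a specific real-world size.

function metresPerUnit(targetRealWidthM, targetTrackedWidthUnits) {
  // How many real-world metres one scene unit represents.
  return targetRealWidthM / targetTrackedWidthUnits;
}

function scaleForRealWidth(contentRealWidthM, contentModelWidthUnits, mpu) {
  // Uniform scale factor that makes the model render at its real size.
  return contentRealWidthM / (contentModelWidthUnits * mpu);
}

// A 0.2 m printed code reported as 0.4 scene units wide:
const mpu = metresPerUnit(0.2, 0.4); // 0.5 m per unit
// A TV model authored 2 units wide that should appear 1.5 m wide:
const tvScale = scaleForRealWidth(1.5, 2, mpu); // 1.5
```

Without the known-size target, `mpu` is unknown, which is why the content's apparent size currently varies with where the user taps to place it.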

This functionality is already implemented in our Studio tool if you wanted to give it a go, see here:

Unfortunately, I don't have a timeframe for when this will land in our Universal AR SDKs, I'm afraid. It's on our radar and we'll definitely look into it this year; however, improving the tracking stability of the beta world tracking is the higher priority at this point. Once that's out of the way we'll have a better timescale we can share.

I’ll be sure to update this thread if / when we have some updates.