Hello and welcome!
While tracking to “air” is a bit tricky, I wonder if this is a situation that extended tracking could solve? There isn’t a subsymbol for it within Studio, but there’s a thread about it here. That said, if your issue is the tracking image being too far away, extended tracking may not be an option for you, since it still requires a tracking image to start from.
As far as I understand it, all experiences within ZapWorks require some sort of target/anchor, usually an image, a textured ground plane, or someone’s face. You could maybe create a screen-relative experience instead, but then the object would be locked to a portion of the user’s screen regardless of the backdrop. UI elements like buttons are typically screen relative, and in theory you could build an entire experience using only screen-relative elements, but that wouldn’t really serve the purpose of AR software and would likely be better built with a different tool. Just a thought, though.
All tracking options in ZapWorks require a fair amount of detail for the tracker to latch onto, so anchoring an experience to the air (which effectively means anchoring it to nothing) may not be possible. I’m more than happy to be proved wrong, though.
For further reading, you could check out these Docs pages: