Hey all, we've been finding some problems with an experience. When viewing a target on a screen, the buttons in the AR space seem buggy and need multiple taps to activate, sometimes not working at all, whereas when we printed the target out on paper the experience ran smoothly every time. Could this be down to the type of screen it's viewed on?
OK, are you saying that you point your device at a tracking image on a computer screen, and that's when you have this problem, but when you print out the tracking image it works?
I've had that problem as well, but found that the size of the image on the screen helps. I would have to set the image to landscape rather than portrait. Also, I find my iPhones work better than my Android phones at times.
I've had a similar issue to David's: I scan the Zappar code on a Retina screen (15-inch MacBook Pro), then point the device at the target on the same screen, with the target displayed full screen. The experience activates, but the play button in the experience is non-responsive even after many presses; it might work on the 20th press.
We tried this on both an Android phone and an iPhone. After multiple failed attempts we printed the experience out on A4 paper, and the previously unresponsive button worked on the first press, on the same Android phone and iPhone.
Would it be possible for you to share your projects with us so that we can take a closer look, please?
Even though you're not experiencing the issue when viewing the experience in the app, it's still worth taking a look at what might be happening in the desktop preview. I suspect there may be a priority issue where the interactive object is being blocked by a different object, due to the angle at which the experience is viewed in the screen preview.
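To illustrate what I mean by a priority issue: here's a minimal, hypothetical sketch (not Zappar's actual API, and all names are made up) of how tap hit-testing typically works. The tap goes to the front-most object whose bounds contain the tap point, so an overlay sitting nearer the camera can silently swallow taps meant for the button behind it:

```python
# Hypothetical hit-test sketch: a tap is delivered to the front-most
# object containing the tap point, so an overlay in front of the play
# button intercepts the tap and the button never fires.

def contains(bounds, point):
    """Check whether a 2D point falls inside an axis-aligned bounds box."""
    (x0, y0, x1, y1) = bounds
    (x, y) = point
    return x0 <= x <= x1 and y0 <= y <= y1

def front_most_hit(objects, tap_point):
    """Return the object nearest the camera whose bounds contain tap_point."""
    hits = [o for o in objects if contains(o["bounds"], tap_point)]
    return min(hits, key=lambda o: o["depth"], default=None)

scene = [
    {"name": "playButton", "depth": 2.0, "bounds": (0.4, 0.4, 0.6, 0.6)},
    # A transparent full-screen overlay sitting closer to the camera:
    {"name": "overlay",    "depth": 1.0, "bounds": (0.0, 0.0, 1.0, 1.0)},
]

hit = front_most_hit(scene, (0.5, 0.5))
print(hit["name"])  # the overlay wins, so the button never receives the tap
```

If viewing the target at a steep angle changes which object the tap ray intersects first, that would explain why the button sometimes responds and sometimes doesn't.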