I haven’t been able to find this anywhere on the board. I’m trying to find out whether the Zappar platform can support interactions where people ‘touch’ the actual experience objects behind the screen. Essentially, if I have a 2D or 3D object displayed after scanning the zapcode, can I build something where I physically ‘interact’ with that object by putting my finger into the camera view?
I thought the ‘Intersect’ functionality would do it, but that seems to detect intersections between two virtual objects on the screen, not between the virtual and real worlds. Thank you ahead of time.