Engagement with finger behind the camera

I haven’t been able to find this on the board anywhere. I am trying to see if the Zappar platform can create interactions with people ‘touching’ the actual experience objects behind the screen. So essentially: after scanning the zapcode, if I have a 2D or 3D object, can I build something where I can physically ‘interact’ with the object by getting my finger into the camera view?

I thought the ‘Intersect’ functionality would do it, but that seems to intersect two virtual objects on the screen, not the virtual and real worlds. Thank you ahead of time.
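For context on why ‘Intersect’ can’t see a real finger: intersection tests like this typically compare the bounding volumes of two objects in the scene’s own coordinate space, so both participants must be virtual. A minimal generic sketch of such a test (this is illustrative TypeScript, not the actual Zappar API — the `AABB` type and `intersects` function are hypothetical names):

```typescript
// Generic axis-aligned bounding-box (AABB) intersection sketch.
// Both boxes live in virtual scene space -- a real-world finger
// never appears here, which is why this kind of test can't detect it.
interface AABB {
  min: [number, number, number];
  max: [number, number, number];
}

function intersects(a: AABB, b: AABB): boolean {
  // The boxes overlap only if they overlap on all three axes at once.
  for (let axis = 0; axis < 3; axis++) {
    if (a.max[axis] < b.min[axis] || b.max[axis] < a.min[axis]) {
      return false;
    }
  }
  return true;
}

// Two unit cubes, the second offset by 0.5 on x: they overlap.
const cubeA: AABB = { min: [0, 0, 0], max: [1, 1, 1] };
const cubeB: AABB = { min: [0.5, 0, 0], max: [1.5, 1, 1] };
console.log(intersects(cubeA, cubeB)); // true
```

Detecting a physical finger would instead require some form of hand tracking on the camera feed, which is a different capability altogether.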

We don’t currently have any code to detect hands in the camera view, so this is not possible.

The nearest thing is the controllers in ZapBox, which do allow you to physically interact with content rendered in the world. See the video here:

That’s only supported in the ZapBox app (not Zappar or the embed components) at the moment. There are details of how to build content for it in this post:

Hope that helps!

That is very cool. Thank you for the reply.