Will Zapworks integrate ARKit capabilities for iOS?


#1

I guess it would remove the need for positional trackers, but would still draw benefits from being able to use the marker controllers / other object markers. Is this in the pipeline?


#2

Good question. It’s certainly something we’re thinking about.

For the “normal” Zappar app, which doesn’t use a fish-eye adapter, we could wrap the ARKit tracking directly with a similar JavaScript API for people to access in Studio. One downside is that it would be the first time we had a feature supported on only one of the platforms.
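As a purely hypothetical sketch (none of these names are real Zapworks Studio APIs), a JavaScript-style wrapper over ARKit world tracking might mirror the event-driven shape of the existing tracking APIs, with anchors reported as they are found:

```typescript
// Hypothetical sketch only: WorldTracker, WorldAnchor and the event names
// are invented for illustration, not real Studio APIs.

type Matrix4 = number[]; // 16 numbers, column-major pose matrix

interface WorldAnchor {
  id: string;
  pose: Matrix4; // camera-relative transform supplied by the native tracker
}

// Minimal event dispatcher standing in for a native ARKit bridge.
class WorldTracker {
  private foundCallbacks: Array<(a: WorldAnchor) => void> = [];

  onAnchorFound(cb: (a: WorldAnchor) => void): void {
    this.foundCallbacks.push(cb);
  }

  // Would be called from the native side when ARKit reports a new anchor.
  _emitFound(a: WorldAnchor): void {
    this.foundCallbacks.forEach((cb) => cb(a));
  }
}

// Usage: attach content when a surface anchor appears.
const tracker = new WorldTracker();
const seen: string[] = [];
tracker.onAnchorFound((a) => seen.push(a.id));
tracker._emitFound({ id: "floor-0", pose: new Array(16).fill(0) });
console.log(seen); // logs ["floor-0"]
```

The appeal of this shape is that scripts written against image-tracking events would feel familiar when targeting world tracking.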

For ZapBox we have the fish-eye lens adapter, which I suspect would prevent ARKit’s tracking from functioning correctly.

So nothing to announce right now in terms of commitments, but it is on our radar :slight_smile:


#3

Hi Simon, I’m interested to hear what you think about ARCore’s development for Android. Could ZapBox use ARKit for iOS and ARCore for Android? From my limited understanding, the fish eye lens adapter helps with tracking the pointcodes. Would ARKit and ARCore eliminate the need for pointcodes and fish eye lens?


#4

We could certainly make use of ARCore on Android. It will take some investigation to establish whether there’s a common feature set supported across both ARKit and ARCore, but it seems Google designed their API with compatibility in mind.

In terms of ZapBox, the fish-eye lens is important for increasing the field of view for interaction with the controllers, and also for enlarging the camera view we can render in stereo - without the fish-eye lens it feels a bit letterboxed. Neither ARKit nor ARCore would work with that lens (I expect), so for ZapBox in the short term pointcodes will still be needed when using the fish-eye lens.

Longer term we could investigate our own implementation of the underlying technique used in ARKit and ARCore (visual-inertial odometry), so we can also incorporate the fish-eye calibration into it. We still have much to do on the original ZapBox proposition though, so we’re still focussed on that at the moment.
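To give a flavour of the visual-inertial idea, here is a deliberately simplified 1D sketch (illustrative only, not ZapBox code): integrating accelerometer readings gives a fast but drifting position estimate, and occasional visual position fixes pull the estimate back towards the truth:

```typescript
// Toy 1D visual-inertial fusion. Real systems work in 6DoF with proper
// filtering (e.g. an EKF); this just shows the predict/correct structure.

function fuse(
  accel: number[], // accelerometer samples (m/s^2), one per timestep
  visualFix: (t: number) => number | null, // visual position, when available
  dt: number
): number[] {
  let pos = 0;
  let vel = 0;
  const alpha = 0.8; // weight given to the inertial estimate vs the visual fix
  const out: number[] = [];
  for (let t = 0; t < accel.length; t++) {
    vel += accel[t] * dt;
    pos += vel * dt; // inertial prediction (drifts over time)
    const fix = visualFix(t);
    if (fix !== null) {
      pos = alpha * pos + (1 - alpha) * fix; // visual correction
    }
    out.push(pos);
  }
  return out;
}

// With one acceleration impulse and no visual fixes, position drifts freely:
console.log(fuse([1, 0, 0], () => null, 1)); // [1, 2, 3]
```

The relevance to ZapBox is that an in-house implementation of this predict/correct loop could bake the fish-eye calibration into the visual side, which off-the-shelf ARKit/ARCore tracking cannot do.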


#5

Someone using ARCore https://www.youtube.com/watch?time_continue=29&v=J9RL3SYDcpk


#6

Thanks for the link Alan, nice to see another mini-golf implementation. What’s lacking with ARCore / ARKit is any form of physical controller interaction, so in that demo the club essentially attaches itself to the phone, and you have to move the phone to hit the ball. A neat idea to essentially turn the phone into the tracked controller, but I think the ZapBox approach of independent physical controllers opens more possibilities.


#7

The ZB approach is much better indeed!

It would be interesting to see whether tracked controllers can be integrated into ARCore/ARKit, as they are focused on being markerless.


#8

Thanks!

Yeh indeed. As I said above, it’s actually the fish-eye lens that means we probably can’t use it directly in ZapBox anyway, but our own visual-inertial odometry implementation would be an interesting project. Even then, the world markers may still be helpful in combination with it, as ARKit, ARCore, and visual odometry in general all depend on image texture to work. Glass or plain-coloured tables don’t work well, for example.

One final issue with the current ARCore is that there’s no API for getting at the camera frame on the CPU, which we would need in order to do pointcode detection for controllers. I posted a feature request about that here:
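To illustrate why CPU-side frame access matters, here is a hypothetical sketch; the `CameraFrame` shape and `countDarkPixels` helper are invented for illustration, and counting dark pixels is only a crude stand-in for the first stage of anything like real pointcode detection:

```typescript
// Invented frame type: ARCore does not currently expose frames like this,
// which is exactly the point of the feature request.

interface CameraFrame {
  width: number;
  height: number;
  luma: Uint8Array; // one greyscale byte per pixel
}

// Trivial stand-in for a detector's first pass: with raw pixel access, we
// can scan for dark regions that might belong to a marker's black rings.
function countDarkPixels(frame: CameraFrame, threshold = 64): number {
  let n = 0;
  for (let i = 0; i < frame.luma.length; i++) {
    if (frame.luma[i] < threshold) n++;
  }
  return n;
}

const frame: CameraFrame = {
  width: 4,
  height: 1,
  luma: new Uint8Array([0, 200, 30, 255]),
};
console.log(countDarkPixels(frame)); // 2 dark pixels
```

Without an API that hands frames to application code like this, any custom marker detection has to stay on the native side or be skipped entirely.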