I guess it would remove the need for positional trackers, but would still draw benefits from being able to use the marker controllers / other object markers. Is this in the pipeline?
Good question. It’s certainly something we’re thinking about.
For ZapBox we have the fish-eye lens adapter, which I suspect would prevent ARKit's tracking from functioning correctly.
So there's nothing to announce right now in terms of commitments, but it is on our radar.
Hi Simon, I’m interested to hear what you think about ARCore’s development for Android. Could ZapBox use ARKit for iOS and ARCore for Android? From my limited understanding, the fish eye lens adapter helps with tracking the pointcodes. Would ARKit and ARCore eliminate the need for pointcodes and fish eye lens?
We could certainly make use of ARCore on Android. It will take some investigation to establish whether there's a common feature set supported across both ARKit and ARCore, but it seems Google designed their API with compatibility in mind.
In terms of ZapBox, the fish-eye lens is important to increase the field of view for interaction with the controllers, and also to enlarge the camera view we can render in stereo - without the fish-eye lens it feels a bit letterboxed. Neither ARKit nor ARCore would work with that lens (I expect), so for ZapBox in the short term pointcodes will still be needed when using the fish-eye lens.
Longer term we could investigate our own implementation of the underlying techniques used in ARKit and ARCore (visual-inertial odometry) so we can also incorporate the fish-eye calibration into them. We still have much to do on the original ZapBox proposition though, so we're focussed on that at the moment.
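For context on why the lens matters to tracking, a common fish-eye model is the equidistant projection, where image radius grows linearly with the incidence angle (r = f·θ), instead of the pinhole model's r = f·tan θ that standard trackers assume. A toy sketch of the two models (illustrative focal length of 1, not Zappar's actual calibration):

```python
import math

def pinhole_radius(theta, f=1.0):
    """Pinhole projection: r = f * tan(theta). Blows up as theta -> 90 deg,
    so this model cannot represent a very wide field of view."""
    return f * math.tan(theta)

def fisheye_radius(theta, f=1.0):
    """Equidistant fish-eye projection: r = f * theta. Image radius stays
    finite even at 90 degrees off-axis, which is how the lens packs a
    wider field of view onto the same sensor."""
    return f * theta

# At 80 degrees off-axis the pinhole radius is roughly four times larger,
# i.e. a pinhole-calibrated tracker badly mis-models fish-eye pixels.
theta = math.radians(80)
print(pinhole_radius(theta) / fisheye_radius(theta))
```

This is why a tracker that bakes in a pinhole camera model can't simply be pointed at fish-eye footage; the calibration has to be part of the odometry itself.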
Someone using ARCore: https://www.youtube.com/watch?time_continue=29&v=J9RL3SYDcpk
Thanks for the link Alan, nice to see another mini-golf implementation. What's lacking with ARCore / ARKit is any form of physical controller interaction, so in that demo the club essentially attaches itself to the phone, and you have to move the phone to hit the ball. A neat idea, turning the phone itself into the tracked controller, but I think the ZapBox approach of independent physical controllers opens more possibilities.
The ZB approach is much better indeed!
It would be interesting to see if tracked controllers can be integrated into ARCore/ARKit, as they are focused on being markerless.
Yeh indeed. As I said above, it's actually the fish-eye lens that means we probably can't use them directly in ZapBox anyway, but our own visual-inertial odometry implementation would be an interesting project. Even with that, the world markers may still be helpful in combination, as how well ARKit / ARCore / visual odometry in general works depends on image texture. Glass or plain-coloured tables don't work well, for example.
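As a rough illustration of that texture dependence: feature-based trackers need strong image gradients to find trackable corners, and a plain surface gives almost none. A toy sketch using synthetic one-dimensional "image rows" (a crude proxy, not a real tracker):

```python
def gradient_energy(pixels):
    """Sum of absolute intensity differences between neighbouring pixels --
    a crude proxy for how much trackable texture an image row contains."""
    return sum(abs(b - a) for a, b in zip(pixels, pixels[1:]))

textured = [10, 200, 30, 180, 25, 210, 15]      # busy, wood-grain-like row
plain = [128, 128, 129, 128, 128, 127, 128]     # flat, plain-coloured table

# The textured row has orders of magnitude more gradient energy,
# so a feature tracker has far more to lock onto.
print(gradient_energy(textured), gradient_energy(plain))
```

Real systems use 2-D corner detectors rather than this 1-D proxy, but the failure mode is the same: glass and uniformly coloured surfaces simply don't produce enough gradient for features to latch onto.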
One final issue with the current ARCore is that there's no API for accessing the camera frame on the CPU, which we'd need in order to run pointcode detection for the controllers. I posted a feature request about that here:
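To show what CPU frame access would enable, here is a hypothetical sketch: the frame buffer layout and both functions are illustrative stand-ins, not a real ARCore or Zappar API. Given a grayscale (luma) buffer, a pointcode detector would start from low-level scans like these:

```python
def mean_luma(frame, width, height):
    """Average brightness of a grayscale (luma) frame buffer,
    e.g. for deciding a binarisation threshold."""
    return sum(frame) / (width * height)

def find_dark_regions(frame, width, height, threshold=64):
    """Return (x, y) coordinates of pixels darker than the threshold --
    the kind of scan a marker detector would seed candidate regions from."""
    return [(i % width, i // width)
            for i, v in enumerate(frame) if v < threshold]

# Tiny 3x3 frame with one dark pixel in the centre.
frame = [255] * 9
frame[4] = 0
print(find_dark_regions(frame, 3, 3))
```

Without an API exposing the frame on the CPU, none of this per-pixel work is possible, however the marker detection itself is implemented.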
As this is the only topic discussing ARKit / ARCore, let's use it.
I’m using Studio, and my clients would love to have experiences that allow the user to walk in the real world, while seeing AR content added on top of it.
For example, being able to walk around a table while seeing a 3D character dancing on top of it, another character climbing a wall behind me, and a spider walking on the ceiling (plane detection).
It would no longer need any target image; the user would be free to roam.
Do you think this could happen in Zappar Studio in 2018?
ARKit and ARCore integration is something we’ve been looking into, though we don’t currently have any further information to share on this functionality just yet.
We'll post any updates on the forum, so keep an eye out for more info.
All the best,
Thank you Seb.