ZapBox Android Roadmap

The ZapBox 2.1 Android release included a solid batch of improvements and enlarged the set of well-supported devices that deliver a 60 fps camera feed with a short exposure mode. Read more about it in ZapBox 2.1 for Android - Release Notes.

On devices like the Samsung Galaxy S8 and the Google Pixel 2, the experience is now close to that offered on iOS devices.

There is, however, more to do, both to reach true parity with iOS on well-supported devices and to bring more devices into that well-supported list. Here are the headlines for the updates still to come.

Settings Overlay UI

The iOS app has a “cog” button that opens a more detailed settings overlay UI, originally covering only the camera but since extended to offer a few other options relating to tracking and rendering.

This settings screen was intended as a temporary measure to help out during development, but in practice it has proved useful in certain situations, such as setting the manual camera focus position.

We’re working on something similar on the Android side. Exposing a UI to enable tweaking is actually more important on Android as there are far more differences between devices. Some options may give a great experience on some devices but not work everywhere, so in those cases having a toggle in a settings UI will let users experiment to find what works best for their device.

The long-term goal is to make the best options the default where we can, but given all the device differences on Android we’ll need to expose some tweakable options first. By collecting user feedback we can then set device-specific defaults to allow more users on more devices to get a good experience from the first time they open the app.

Gyro Filtering

There is one feature in 2.1 that is enabled on iOS but not yet on Android: using the gyroscope sensor to help constrain the tracking estimate for the world pointcodes. Using this sensor data makes world tracking smoother and more accurate, especially when few world pointcodes are visible or when they are viewed close to straight on.

The reason this feature is not yet enabled on Android is (guess what!) differences between devices. Not all Android camera pipelines supply timestamps with the camera frames, and even when they do there is no guarantee that those timestamps can be directly compared with gyroscope sensor readings.
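For Camera2 devices there is at least a documented way to check this: the SENSOR_INFO_TIMESTAMP_SOURCE characteristic reports whether frame timestamps share a clock with the platform sensors. A minimal sketch of that check (assuming a Camera2 device; the legacy API offers no equivalent, and some devices are known to misreport their sensor timebases anyway):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Returns true when camera frame timestamps nominally use the same clock
// as SensorEvent.timestamp (SystemClock.elapsedRealtimeNanos), so gyro
// readings can be matched directly against camera frames.
fun cameraTimestampsComparable(context: Context, cameraId: String): Boolean {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val characteristics = manager.getCameraCharacteristics(cameraId)
    val source = characteristics.get(CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE)
    // TIMESTAMP_SOURCE_UNKNOWN means the frames are stamped from an
    // unspecified clock and cannot be compared to sensor readings.
    return source == CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME
}
```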

However, we do expect that in many cases the timestamps will be available and comparable. Once there's a settings UI we can expose this as an option that can be toggled by users.

That will then deliver genuine parity for ZapBox tracking between iOS and at least some Android devices.

Better Exposure and Focus Control

Just as on iOS, we will expose a manual focus slider in the settings UI to replace the focus-lock toolbar button. That will remove the hassle of having to re-lock the focus every time the camera is started.

The short exposure mode will be replaced with manual exposure and ISO sliders in the settings UI. That will give users much more control and also avoid the hassle of resetting the mode every time the camera is started.
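For devices that expose the official controls, the Camera2 keys behind these sliders are well defined. A rough sketch of wiring slider values through (the builder is assumed to come from an existing preview capture setup, and in a real app the values should be clamped to the ranges the device reports):

```kotlin
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

// Sketch: apply slider values via the standard Camera2 manual controls.
fun applyManualControls(
    previewBuilder: CaptureRequest.Builder,
    focusDioptres: Float,   // 0 = infinity, larger values focus closer
    exposureNs: Long,       // e.g. 4_000_000 for a short 4 ms exposure
    iso: Int                // e.g. 400
) {
    // Disable autofocus and set a fixed focus distance.
    previewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF)
    previewBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, focusDioptres)
    // Disable auto-exposure and set exposure time and sensitivity directly.
    previewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF)
    previewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureNs)
    previewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, iso)
}
```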

In the longer term we will implement our own auto-exposure algorithm that prioritizes low blur over low pixel noise. This will be the default mode and should give a great out-of-the-box experience for new users with supported devices.
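To illustrate the priority (this is only a sketch of the idea, not the actual algorithm, and the 5 ms cap and target brightness are assumed figures): fold any needed gain into ISO first, and only lengthen the exposure, up to a hard cap, once ISO is exhausted:

```kotlin
// Illustrative blur-prioritizing auto-exposure step: accept pixel noise
// (higher ISO) in exchange for a bounded exposure time and low blur.
const val MAX_EXPOSURE_NS = 5_000_000L  // assumed cap, ~5 ms

fun autoExposeStep(
    meanBrightness: Double,   // measured from the current frame, 0..255
    exposureNs: Long,
    iso: Int,
    isoRange: IntRange        // the device's reported sensitivity range
): Pair<Long, Int> {
    val target = 128.0
    val gain = target / meanBrightness.coerceAtLeast(1.0)
    // Fold as much of the needed gain as possible into ISO.
    val newIso = (iso * gain).toInt().coerceIn(isoRange.first, isoRange.last)
    // Whatever gain ISO could not absorb falls to the exposure time,
    // which is never allowed past the blur cap.
    val residual = gain * iso / newIso
    val newExposure = (exposureNs * residual).toLong().coerceIn(100_000L, MAX_EXPOSURE_NS)
    return newExposure to newIso
}
```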

Support for High-Speed Capture Modes

The “vrmode” camera got us to 60 fps on most high-end Samsung devices from the last three years or so, but it was exposed through the legacy camera API and is unavailable on the newest Samsung Note 9 and S9 devices.

All hope is not lost for getting 60 fps from these devices. The current Android Camera2 API has a separate concept of a “high speed” capture mode, really aimed at slow-motion recording use cases with very high frame rates. However, we have noticed that many devices will deliver the preview camera stream at 60 fps in this mode, even when no video is being recorded. If you have a modern high-end phone, try the slow-motion mode in your main camera app – if the preview looks smoother than in the normal mode, even before you start recording, your device is likely one of those.
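Programmatically, the equivalent check is straightforward. A sketch: query whether the camera advertises the constrained high-speed capability and, if so, which size and frame-rate combinations it supports:

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraMetadata
import android.util.Range
import android.util.Size

// Sketch: list the size/fps combinations a camera supports in the
// Camera2 constrained high-speed capture mode (empty if unsupported).
fun highSpeedModes(characteristics: CameraCharacteristics): Map<Size, Array<Range<Int>>> {
    val caps = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
    val supported = caps?.contains(
        CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO) == true
    if (!supported) return emptyMap()
    val map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)!!
    return map.highSpeedVideoSizes.associate { it to map.getHighSpeedVideoFpsRangesFor(it) }
}
```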

Previously we were unable to make use of this mode as it doesn't support CPU callbacks for running our tracking code on the images. However, the “vrmode” integration involved adding the ability to read back frames provided directly as GPU textures, which puts this option back on the table.

The documentation states that the high-speed mode does not support any manual control, but at least on the Note 9 these controls do still work (another great example of the sort of device-specific behavior that makes Android so frustrating!). With the settings UI, users will be able to experiment with which undocumented combinations of options are supported on their device.
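For the curious, here is a hedged sketch of what such an experiment looks like: a constrained high-speed preview session at 60 fps with manual exposure keys set on top. Whether the manual keys actually take effect is entirely device-dependent, and the surface and handler plumbing is assumed to exist:

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.os.Handler
import android.util.Range
import android.view.Surface

// Sketch: 60 fps high-speed preview with (undocumented) manual exposure.
fun startHighSpeedPreview(device: CameraDevice, surface: Surface, handler: Handler) {
    device.createConstrainedHighSpeedCaptureSession(listOf(surface),
        object : CameraCaptureSession.StateCallback() {
            override fun onConfigured(session: CameraCaptureSession) {
                val hsSession = session as CameraConstrainedHighSpeedCaptureSession
                val builder = device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                builder.addTarget(surface)
                builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(60, 60))
                // Undocumented territory: manual exposure inside high-speed
                // mode. Works on some devices (e.g. the Note 9), not others.
                builder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF)
                builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 4_000_000L) // 4 ms
                val burst = hsSession.createHighSpeedRequestList(builder.build())
                hsSession.setRepeatingBurst(burst, null, handler)
            }
            override fun onConfigureFailed(session: CameraCaptureSession) { /* fall back */ }
        }, handler)
}
```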

Tweaks for Legacy Cameras

There are unfortunately still many devices (mainly old ones or current lower-end ones) that don’t expose manual camera controls through the official Camera2 API.

ZapBox really does need manual camera control to offer good tracking at the moment. We are working on improving tracking of blurred codes in general, which should improve the situation somewhat, but manual control of the exposure time will always offer a better ZapBox experience.

All hope is not lost for devices that don’t expose the official manual API – often they do still have a “manual” mode in their default camera app that allows setting the ISO and exposure time. On those devices it may be possible for us to achieve the same control in the ZapBox app by setting the right combination of undocumented and hidden camera parameters.
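As a flavor of what that probing involves, here is a sketch against the legacy API. The "iso" and "iso-values" key names are conventions seen on some devices rather than any documented contract, so everything here should be treated as a guess to be verified per device:

```kotlin
import android.hardware.Camera  // the legacy (pre-Camera2) API
import android.util.Log

// Sketch: probe vendor-specific parameters on the legacy camera API.
@Suppress("DEPRECATION")
fun tryLegacyIso(camera: Camera, iso: String): Boolean {
    val params = camera.parameters
    // Dump every key/value pair the device exposes; vendor extensions
    // show up here even though they are absent from the official docs.
    Log.d("ZapBox", params.flatten())
    // "iso-values" is a vendor convention, not a documented key.
    val supported = params.get("iso-values")?.split(",") ?: return false
    if (iso !in supported) return false
    params.set("iso", iso)
    camera.parameters = params  // may silently ignore unsupported values
    return true
}
```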

This is another case where a settings UI will be required as it will take some trial-and-error to find the right options for specific devices.

Reducing Latency

Android actually offers better control of latency than iOS due to Google’s heavy investment in VR. There are direct front-buffer rendering modes that can be used in the compositor, and ways to reduce the number of rendered frames held in intermediate buffers when not using that mode. Enabling these features will require a significant amount of work in the ZapBox app, but should offer a noticeable and worthwhile latency reduction.
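As one concrete example of the kind of work involved (a sketch, not necessarily the approach we will take): with the EGL_KHR_mutable_render_buffer extension, a window surface can be switched into single-buffered mode so rendering goes straight to the front buffer. The surface's EGLConfig must have been chosen with EGL_MUTABLE_RENDER_BUFFER_BIT_KHR in its surface type for this to succeed:

```kotlin
import android.opengl.EGL14
import android.opengl.EGLDisplay
import android.opengl.EGLSurface

// Sketch: request single-buffered (front-buffer) rendering on a window
// surface when the mutable render buffer extension is available.
fun enableFrontBufferRendering(display: EGLDisplay, surface: EGLSurface): Boolean {
    val extensions = EGL14.eglQueryString(display, EGL14.EGL_EXTENSIONS) ?: return false
    if ("EGL_KHR_mutable_render_buffer" !in extensions) return false
    return EGL14.eglSurfaceAttrib(
        display, surface, EGL14.EGL_RENDER_BUFFER, EGL14.EGL_SINGLE_BUFFER)
}
```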

We also have more general plans to try out using gyro data to correct for the inevitable latency in the camera pipeline – so if you rotate quickly, we can effectively rotate the camera rendering in the other direction to compensate. This is again an area where Android has the potential to offer lower latency than iOS as there are new low-latency sensor APIs designed specifically around minimizing motion-to-photon latency in pure orientation-only VR cases.
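The basic correction is simple in principle. A sketch, assuming a per-device latency estimate; the small-angle, per-axis treatment here is illustrative only:

```kotlin
import android.opengl.Matrix

// Sketch of gyro-based latency compensation: rotate the camera-feed quad
// by the opposite of the rotation that happened while the frame was in
// the pipeline. latencySeconds is an assumed per-device estimate.
fun latencyCompensation(
    gyroRadPerSec: FloatArray,  // (x, y, z) angular velocity from the gyro
    latencySeconds: Float,
    outMatrix: FloatArray       // 4x4 column-major, applied to the quad
) {
    // Rotation accumulated during the pipeline latency, per axis.
    val rx = gyroRadPerSec[0] * latencySeconds
    val ry = gyroRadPerSec[1] * latencySeconds
    val rz = gyroRadPerSec[2] * latencySeconds
    Matrix.setIdentityM(outMatrix, 0)
    // Apply the inverse rotation (small-angle approximation).
    Matrix.rotateM(outMatrix, 0, -Math.toDegrees(rx.toDouble()).toFloat(), 1f, 0f, 0f)
    Matrix.rotateM(outMatrix, 0, -Math.toDegrees(ry.toDouble()).toFloat(), 0f, 1f, 0f)
    Matrix.rotateM(outMatrix, 0, -Math.toDegrees(rz.toDouble()).toFloat(), 0f, 0f, 1f)
}
```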

Conclusion

It’s been a much more difficult road than anticipated to get the Android build to where it is now. The 2.1 app finally offers a good experience across a relatively wide range of common Android devices and I’m pretty happy with it in general. I won’t be completely happy with it until the camera changes above are in place and we can genuinely say we have done everything possible to get the most we can out of every device.

The good news is the future of ZapBox on Android is looking bright. Given the potential for better control of latency in Android, and the generally higher screen resolutions available there, I can actually foresee a day when I’ll be reaching for an Android device when doing ZapBox demos!
