We’ve just pushed the button to release ZapBox version 2.1 to the Play Store. It should be available within the next couple of hours if it isn’t visible for you already.
Here are some notes to help you have the best experience possible with the latest Android app version.
You should follow these steps every time you want to start a ZapBox experience.
1. Set Focus Lock using the AF button
- You should point the camera towards something a couple of meters away and tap the “AF” button to trigger an auto-focus search and focus lock.
- Do this AF lock process every time the camera starts. The camera stops while the main menu is on screen, so you need to lock the focus again each time you enter a screen that involves a camera view (Calibrate Camera, Build Map, Scan Code, or when an experience is loaded from the Launch menu).
- If you’re entering Build Map mode, it’s best to start pointing away from any of your world pointcodes so the focus lock gets established first. This improves the accuracy of the map, as all the views of codes used for building it will have consistent focus.
2. Attempt to use “Short Exposure Mode” with the padlock button
- When the focus is locked correctly, and the lighting for the map area looks reasonable, you should tap the padlock toolbar button.
- Exactly what this will do depends on your device; more detail in the separate section on camera types below.
- If your device supports this mode you should do this every time the camera starts (after the AF process above).
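For the curious, the short exposure lock in step 2 corresponds to Android’s manual sensor controls. Here’s a rough sketch of the idea in plain Java; the Camera2 key names appear only in comments, and the 4 ms target and helper name are my own illustrative assumptions, not ZapBox’s actual code:

```java
// Sketch: clamping a target "short" exposure time into the range the device
// reports. On Android the supported range comes from
// CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE, and the chosen value
// would be applied via CaptureRequest.SENSOR_EXPOSURE_TIME with
// CONTROL_AE_MODE set to CONTROL_AE_MODE_OFF.
public class ShortExposure {
    // All times in nanoseconds. 4 ms is an assumed target, not ZapBox's value.
    static final long TARGET_NS = 4_000_000L;

    static long chooseExposure(long deviceMinNs, long deviceMaxNs) {
        return Math.max(deviceMinNs, Math.min(deviceMaxNs, TARGET_NS));
    }

    public static void main(String[] args) {
        // A sensor supporting 0.1 ms .. 100 ms exposures gets the target as-is:
        System.out.println(chooseExposure(100_000L, 100_000_000L));
        // A sensor that can't go shorter than 10 ms gets clamped up:
        System.out.println(chooseExposure(10_000_000L, 100_000_000L));
    }
}
```

The point of the clamp is that devices report wildly different supported ranges, which is part of why behaviour varies so much between handsets.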
Lens Adapter / Calibration
I’d recommend using ZapBox without the lens adapter at first to see how well your device is supported.
You should go through the Calibrate Camera process even if you want to use the app without the lens adapter. The UI suggests it’s optional in the “no lens” case, but many Android devices don’t report the camera field-of-view accurately, so it’s always best to calibrate manually. Remember to do the AF lock procedure above first, with the focus set a couple of meters away, so you calibrate the lens at a similar focus position to the one you’ll use in practice.
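To see why a mis-reported field-of-view matters: the horizontal FOV follows directly from the focal length and sensor width, so a small error in either throws the whole rendering overlay off. A toy calculation (the 6.17 mm / 4.25 mm figures are illustrative numbers, not any particular device):

```java
public class FovCheck {
    // Horizontal field of view of an ideal pinhole camera, in degrees.
    static double fovDegrees(double sensorWidthMm, double focalLengthMm) {
        return Math.toDegrees(2.0 * Math.atan(sensorWidthMm / (2.0 * focalLengthMm)));
    }

    public static void main(String[] args) {
        // Illustrative small phone sensor: 6.17 mm wide behind a 4.25 mm lens.
        System.out.printf("%.1f degrees%n", fovDegrees(6.17, 4.25));      // ~72 degrees
        // If the device mis-reports the focal length by 10%, the FOV is
        // off by roughly 5 degrees -- enough to visibly misalign content.
        System.out.printf("%.1f degrees%n", fovDegrees(6.17, 4.25 * 1.1));
    }
}
```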
Once ZapBox works well on your device without the lens adapter then you can try with the adapter attached to get the benefits of a wider field of view. Position it carefully in the centre of the camera, and make sure it sits flat on the device. You may need to add some cardboard padding to the clip to get the adapter to sit flat.
I often open up the main camera app on the device to check the image quality through the lens while getting it in place; sometimes a small movement gives a dramatic improvement in image quality. When you’re happy with the quality, fire up the ZapBox app again, set the lens slider to “on”, and go through the Calibrate Camera flow.
The calibration results are saved so you should only need to do the calibration once (well, once each for “lens” and “no lens”). However given the variation in placement of the adapter I do sometimes re-calibrate when I reattach the adapter.
For usable tracking this build still requires a device that supports some manual control of the camera. There are three categories of device, offering different levels of control (and correspondingly different levels of experience quality).
Samsung “vrmode” devices
Samsung devices that expose the “vrmode” through the legacy camera API will use that mode now. That gives a 60 fps camera feed and also supports manually setting the ISO. This build hardcodes ISO to 800 for those devices to give short exposure times in well-lit environments (and hence low blur and better tracking).
These devices should now work quite well with ZapBox. The camera latency is still a little higher than we’d like although there is some scope for us to make further improvements here.
We don’t know the full set of devices that expose this mode but certainly the Galaxy S6, S7 and S8 do so. I believe most recent Note devices expose this too. You can tell if this mode is being used by pressing the padlock toolbar button, which will display “short exposure lock not required on your device” if this mode is in use.
Unfortunately, as this mode was exposed through the legacy camera interface, it appears the newest Samsung devices (S9 and Note 9) do not offer it. They do, however, support manual settings through the “Camera2” API; see below.
Camera2 devices that support manual controls
Most modern high-end devices expose manual camera controls through the current Android camera API (known as Camera2).
On these devices tapping the padlock button will pop up a message saying “Locked camera in short exposure mode”. After that you should see significantly less blur which leads to much better tracking.
Unfortunately the majority of devices in this category still only expose a 30 fps camera feed through this API. For those devices tracking works well in the short exposure mode, but the overall quality of the experience is impacted somewhat by the lower frame-rate.
There are a few devices that expose 60 fps feeds through this mode, such as the Google Pixel and Pixel 2. The original Pixel has a bug where setting manual parameters (i.e. the “short exposure mode” in ZapBox) drops the feed back to 30 fps. The Pixel 2 supports both short exposure and 60 fps feeds, so that is a device that works well with ZapBox overall.
Devices without manual camera controls
Unfortunately, if your device does not support either of the two modes mentioned above, then you are unlikely to have a good experience with the current version of the ZapBox app.
Your device is in this category if you see the message “Unfortunately your device doesn’t support manual camera settings” when you tap the padlock button.
The best bet for now in this case is to try using the app in very bright environments (i.e. outside) where the default auto-exposure algorithm may select shorter exposure times, which will give less blur and better tracking.
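Why bright light helps when there are no manual controls: motion blur scales linearly with exposure time, so any lighting that pushes the auto-exposure shorter helps tracking directly. A rough model (the rotation speed and pixels-per-degree figures are illustrative assumptions):

```java
public class BlurEstimate {
    // Approximate blur streak length in pixels for a camera rotating at
    // angularSpeedDegPerSec, with pixelsPerDegree of image resolution,
    // exposed for exposureMs.
    static double blurPixels(double angularSpeedDegPerSec,
                             double pixelsPerDegree,
                             double exposureMs) {
        return angularSpeedDegPerSec * pixelsPerDegree * exposureMs / 1000.0;
    }

    public static void main(String[] args) {
        double speed = 90.0; // a brisk head turn, deg/s (assumed)
        double ppd = 20.0;   // ~1280 px across a ~64 degree FOV (assumed)
        // Dim room, auto-exposure at 33 ms: ~59 px of blur -- hopeless for tracking.
        System.out.println(blurPixels(speed, ppd, 33.0));
        // Bright sunlight, auto-exposure at 2 ms: under 4 px of blur.
        System.out.println(blurPixels(speed, ppd, 2.0));
    }
}
```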
I hope those of you with well-supported devices enjoy the updated app. We’re still working hard to improve device support, but I’ll post another thread shortly with more details on that as this one has already got too long.