Android beta device support

On iOS we support a total of 6 devices for ZapBox (iPhone 6, 6S and 7, plus the Plus variants). We’re actually able to test all of them in the office, which is handy. The app also works pretty well all the way back to the iPhone 5, but we don’t recommend those for ZapBox as the screens are a bit small for VR use.

On Android, our testing coverage is just a little bit smaller - the Play developer console claims the ZapBox beta app can be installed on 8,811 different devices, and our device cupboard (and bank balance) isn’t big enough to let us test all of them!

Android has got much better at offering consistent behaviour across devices in recent years, but when you get down to hardware components like the camera there is still significant variation.

In a future beta we’ll probably prepare an automatic report on the camera capabilities of your device, and ask for your help calibrating its camera lens.

For now I’d just be interested to hear first-hand accounts of what works and what doesn’t on your devices.

The focus button should do the right thing on most devices (trigger an auto-focus action, then lock the focus). The “lock exposure” button will only work on devices where we’re using the new camera2 API and where the camera driver supports manual sensor settings. Camera2 was introduced in Lollipop, but there was an Android bug with the colour data delivered to the CPU, so we only use it on Marshmallow (Android 6.x) and up.
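For anyone curious what that check involves, here’s a minimal sketch (illustrative only, not our exact code) of how an app can decide whether the exposure lock can work on a given device:

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.os.Build

// Illustrative only - not our exact code. Decide whether the
// "lock exposure" button can work on this device.
fun supportsExposureLock(context: Context, cameraId: String): Boolean {
    // We only use camera2 on Android 6.0+ because of the Lollipop colour bug.
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) return false

    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    val characteristics = manager.getCameraCharacteristics(cameraId)

    // The camera driver must advertise manual sensor control; devices whose
    // drivers run camera2 at the "legacy" level won't include this capability.
    val capabilities = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
    return capabilities?.contains(
        CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR
    ) == true
}
```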

Here are some of the devices we’ve tested on ourselves so far:

- Samsung Galaxy S6: focus lock works, short exposure lock works, high-sensitivity camera, 30 FPS.
- OnePlus 3: focus lock works, short exposure lock works, camera a little dark for indoor scenes (max ISO a bit limited), 30 FPS.
- Sony Xperia XZ: focus lock works, short exposure lock works, high-sensitivity camera with some of the lowest latency we’ve seen, 30 FPS.
- Google Pixel: focus lock works; 60 FPS possible in auto-exposure mode. Entering short exposure lock gives a nice short-exposure (good high-ISO quality) image but annoyingly drops the FPS back to 30.

I was hoping to give the Pixel a full recommendation when I saw it offering 60 FPS through the standard camera2 API, but annoyingly setting the sensor parameters manually drops it back to 30. The auto-exposure algorithm at 60 FPS generally uses a 1/100s exposure, so it’s still pretty quick and not too blurred; the additional smoothness from 60 FPS probably outweighs the advantage of manually setting an even shorter exposure to increase the speed of motion that can be tracked.
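For reference, this is roughly what “setting the sensor parameters manually” means in camera2 terms - a minimal sketch with made-up exposure and ISO values, not our exact code:

```kotlin
import android.hardware.camera2.CaptureRequest

// Illustrative only - the exposure and ISO values here are made up.
// Requests like this are what drop the Pixel from 60 FPS back to 30.
fun applyShortExposure(builder: CaptureRequest.Builder) {
    builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 4_000_000L) // 4 ms, in nanoseconds
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, 800)          // ISO 800
}
```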

I’m interested to hear reports from other devices. It’s probably hard to tell whether you’re seeing 30 FPS or 60 FPS without something to compare it to, but you should be able to report whether the buttons work and how well the high-sensitivity option performs in different lighting environments, which would be helpful for us to know.

Thanks all!


Tried on a Motorola G4 Plus with the ZapBox app from Google Play. Couldn’t get the lens calibration to do anything sensible. Tried building a map: could get red targets but never managed to get any blues. Focus lock doesn’t work and the other button appears to do nothing.

Son’s iPhone 6 worked OK, but Android may be a lost cause. Pity.

Would have posted pictures but can’t find a way to do that.

Pics on Motorola G4 Plus

One pic at a time…

Hi,

Thanks for your report, and sorry it wasn’t a great experience on your Moto G4.

For map building with that device you’ll need to move more slowly than in my walkthrough. You’ll need to slowly view the codes from different angles while keeping them tracking successfully. From the position in your photo, a slow walk forward whilst rotating the device should do it. You can also go in closer to the codes individually and slowly rotate around them (imagine a bike wheel centred on the code with its axle pointing upwards, and move the phone around the outside of the wheel while keeping it pointed at the code).

The focus lock button should work on all Android devices - you might see a little focus hunt and then the focus should be locked. The easiest way to check it’s working is to press it while looking at something close to the camera - the image should then remain blurred when you look at more distant objects.

For actual use you’ll want to lock the focus more like 1-2 metres away, but the above is a useful check that the button works.
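For the technically curious, the reason focus lock can work even without the manual camera2 API is that the old (deprecated) Camera API’s one-shot autofocus holds the lens position until it’s cancelled. A rough sketch assuming that API (not our exact code):

```kotlin
import android.hardware.Camera

// Rough sketch using the old (deprecated) Camera API - not our exact code.
// One-shot autofocus leaves the lens where it converged until
// cancelAutoFocus() is called, which is effectively a focus lock.
@Suppress("DEPRECATION")
fun triggerAndLockFocus(camera: Camera) {
    val params = camera.parameters
    if (Camera.Parameters.FOCUS_MODE_AUTO in params.supportedFocusModes) {
        params.focusMode = Camera.Parameters.FOCUS_MODE_AUTO
        camera.parameters = params
        // You'll see the little "focus hunt" here, then the lens stays put.
        camera.autoFocus { success, _ ->
            if (!success) {
                // The hunt failed (e.g. very low light); a retry could go here.
            }
        }
    }
}
```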

The exposure lock button is the more important one for good tracking quality right now, and unfortunately it sounds like the camera drivers on the Moto G4 don’t support the manual camera APIs. If that’s the case you should get a little pop-up at the bottom of the screen telling you so when you tap the lock button - can you confirm whether that’s what you see?

Glad it worked OK on the iPhone 6. I wouldn’t completely write off Android support yet - the experience is already pretty decent on devices that support the short exposure mode (albeit at 30 FPS for most of them).

We can definitely do better at tracking codes when they’re blurred in the image, and that will help devices that don’t offer manual controls. I agree that at the moment those devices don’t really give a good experience in anything but the brightest lighting (i.e. outdoors).

Got it to work on the G4. Moving extremely slowly and being outside did it.
I think the G4 may be a bit underpowered for ZapBox. The phone got a bit hot, so I guess it’s working hard.
Will persist with the lens calibration to increase the field of view.

Thanks for the quick response.

Great, glad I could help a bit.

Right now the G4 is a bit underpowered, but there are a number of big optimization opportunities still open in the code that should make it significantly faster. I’m hoping the G4 CPU will be OK once those are done.

I had similar problems with the Moto G3 - it doesn’t seem to support the manual camera API. The video on map building helped a little, but not as much as the post here in the forum (which I didn’t see until after spending nearly an hour getting set up).

I suspect the light level was too low in the room I was testing in as well, even though I had 3 lamps going. This probably also affected performance once I had the map built: moving the headset or controller was pretty laggy. I do realize the Moto G3 isn’t a super-powerful phone, and I’m not complaining about the product! I was still able to finish the mini-golf game, even if I did get rather dizzy in the process. :dizzy_face:

It will be good to have more guidance in the future about which kinds of phones will work well (chip specs, RAM, etc.). It would probably also be a good idea to have clearer directions for map building in the app, including sticking with flat maps for now and describing the motion needed to capture points. The lens calibration directions were quite good. Maybe it would also be good to read light levels through the camera and warn the user if they’re too low for a reasonable response?

It’s fun to be part of this venture! :slight_smile:

My Zapbox arrived today. Lots of issues/questions:

  1. I’m using a 3-year-old Samsung Galaxy Note 3 running Android 5.0.
  2. I can install the ZapBox app from the app store, and when I first open it, it appears to work normally. But every time I close the app and try to reopen it, it hangs on the intro screen. The only option is to uninstall and reinstall the app, but then it just hangs again the next time I try to reopen it.
  3. You really need steady hands to get through the lens calibration task!
  4. My phone doesn’t allow manual focus, but the auto-focus seems like it might be sufficient. The mapping function seems to need really bright lighting!
  5. I have been unable to get through the mapping successfully. The first time, I tried with all ten of the markers spread out on my floor. I got 7 of the 10 to turn orange (my controller color), but couldn’t get the last three before the app crashed (the screen filled with what looked like an extreme close-up of one of the markers, with the white bits red and the black bits black). So I tried again with just 7 markers (after having to uninstall and reinstall the app and go through the lens calibration again). I could get all seven to turn orange, but the listing along the edge of the display showed 8 markers, not just seven. Since nothing more happened after several minutes of panning around the area, I tried throwing an 8th marker out there. Suddenly, a ninth marker showed up in the list to be calibrated. I could never get the 8th marker to register, and after 10 minutes of panning across the area, I gave up. Of course, once I exited the app, it wouldn’t let me past the intro screen without reinstalling.

I am at a loss as to how to proceed, particularly with the issue of the app crashing whenever I reopen it.

Duane

Hi @dalton_zapworks,

Sorry for the slow reply.

I’ve been surprised by how few Android phones implement the manual API. My naive assumption was that, since it was added in Android 5, most modern devices would have switched to the newer camera driver interface by now, but in practice it seems many low- and mid-range devices are still using old drivers that expose the new API at the “legacy” level, without manual controls :frowning:

Anyway, there is a little light at the end of the tunnel - it seems many of these devices actually behave much better when using the old (now deprecated) camera API. Many devices I’ve tested offer a much lower-latency feed through that API. Additionally, they often expose some undocumented parameters that can be quite helpful - for example, many allow setting a high ISO manually, which shortens the exposure and reduces blur in indoor environments. Some of these devices have hidden manual focus parameters too.
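To give a flavour of what this looks like, here’s a rough sketch using the deprecated API - note that the “iso-values”/“iso” keys below are undocumented vendor conventions we’ve seen on some devices, not a guaranteed API:

```kotlin
import android.hardware.Camera

// Purely illustrative - the "iso-values" and "iso" keys are undocumented
// vendor conventions seen on some devices, not a guaranteed API.
@Suppress("DEPRECATION")
fun tryRaiseIso(camera: Camera, target: String = "800") {
    val params = camera.parameters
    // params.flatten() dumps every key the driver exposes, including
    // undocumented ones - useful for seeing what a given device offers.
    val isoValues = params.get("iso-values") // e.g. "auto,100,200,400,800"
    if (isoValues != null && target in isoValues.split(",")) {
        params.set("iso", target)
        camera.parameters = params
    }
}
```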

In my small test app I’ve got much better camera images out of the Moto G5 and G4 (we don’t have a G3, but I’d hope that would work better too), and can set both a high ISO and manual focus, which should give a much better ZapBox experience. They also report the currently used exposure time, so we can automatically dial the ISO back when users are in naturally bright environments.

The problem is that this is all undocumented, and therefore even less consistent across devices than the usual Android camera API. That means we’ll have to either attempt to detect what works, or produce a UI that lets users try the various settings for themselves and see the difference.

On minimum specs: there are still optimization opportunities to explore, so I don’t want to recommend minimum specs until we see what improvements we can make there. The Moto G range is certainly a target device for those optimizations.

I definitely agree we need more in-app guidance for setup. The short-term solution will probably be to film a more detailed video just on map-building and have a link to that in the UI.

Hi @duaneaa,

Sorry for the slow reply, and for the number of problems you’re experiencing.

I haven’t heard any other reports of the freeze on the intro screen. Are you quitting the app the first time with the back button (which you need to press twice to quit), or are you killing it some other way? I’ll see if our QA team can reproduce it on any of our older Android devices.

Android devices that don’t support the manual API don’t currently work very well, as they tend to use long exposures by default, which gives too much blur. As I mentioned above, it seems some older devices do expose ISO settings through the old camera API, which improves things significantly; hopefully that will also apply to your Note 3. We will also work on improving tracking when the image is blurred, but getting a less blurred image is always the first priority.

We’re reworking the calibration UI so it requires fewer still frames to accept each view, which should remove the need for super-steady hands.

What seems to be happening for you during map building is that spurious codes are detected for a frame or two, which increases the code count and leads to the problems you’re seeing, as the app won’t let you build a map until all the codes it has seen are fully defined. The full-screen code is a bug that occurs on the 11th “seen” code (it will be fixed in the next build), so that is just another symptom of the same thing.

We’re changing the map building UI to allow map building at any time with however many codes are fully defined at that point. That will mean spurious detections don’t matter as much.

You may rightly wonder why we get spurious detections at all. The reason is that we wanted to keep the number of bits in the pointcodes small, so the bits themselves can be quite large and the codes can be detected from further away. The tradeoff is that fewer bits are dedicated to a “checksum” confirming that a detection is in fact valid. We do have a whitelist of the 50 possible “world” codes we’ve assigned so far, but shadows or blur can still cause one of your world pointcodes to be read with an incorrect value in a particular frame. The other possibility is that some structure in your environment happens to look like a pointcode and gets detected as one of the valid world code values in a frame.
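One mitigation is to only trust a code once it has been decoded consistently over several frames. Here’s an illustrative sketch (not our actual tracker; the integer IDs and 3-frame threshold are made up for illustration):

```kotlin
// Illustrative sketch, not our actual tracker: only trust a code ID once
// it has been decoded in several consecutive frames. The integer IDs and
// 3-frame threshold are made up for illustration.
class CodeDebouncer(private val framesRequired: Int = 3) {
    private val whitelist = (0 until 50).toSet() // the 50 assigned world codes
    private val streaks = mutableMapOf<Int, Int>()
    private val accepted = mutableSetOf<Int>()

    // Call once per frame with the IDs decoded in that frame;
    // returns the set of codes considered real so far.
    fun update(detectedIds: Set<Int>): Set<Int> {
        // A code that vanishes before reaching the threshold was
        // probably a misread, so its streak resets.
        streaks.keys.retainAll(detectedIds)
        for (id in detectedIds) {
            if (id !in whitelist) continue // not an assigned code at all
            val count = (streaks[id] ?: 0) + 1
            streaks[id] = count
            if (count >= framesRequired) accepted += id
        }
        return accepted
    }
}
```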

Thanks so much for posting all of your issues in detail, and we’ll try and fix as many of them as we can for the next build.

Until then, and though I hate to say it, your best bet might be to get hold of or borrow an iOS device, or an Android device that supports the manual API for the short exposure mode. That will at least give you a taste of how the “true” ZapBox experience is supposed to work, while we continue the battle to deliver that experience across a wider range of Android devices.