Being able to instantly track something to the floor, or a desk, is ace. The surface needs some texture for the vision tech to work well, though, and it would be nice to be able to direct the user elsewhere if they’re pointing at something too bland.
@simon - within the instant tracking code, is there a simple “surface quality for tracking” metric we could have access to, please? That way, if a user points at something flat and featureless and tries to initiate the experience, we can put up a friendly message directing them toward something more textured. We could poll a function during the initial acquisition stage, perhaps.
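In the meantime, a rough client-side proxy might be workable: sample the greyscale camera frame during acquisition and score its texture from gradient energy (flat, featureless surfaces have near-zero gradients). This is just an illustrative sketch, not anything from the tracking SDK - the function names and the threshold are made up, and the threshold would need tuning per device/camera:

```python
def texture_score(img):
    """Mean squared gradient magnitude over a greyscale image
    (list of rows of pixel values). Higher means more texture."""
    h, w = len(img), len(img[0])
    total = 0.0
    count = 0
    for y in range(h - 1):
        for x in range(w - 1):
            dx = img[y][x + 1] - img[y][x]  # horizontal gradient
            dy = img[y + 1][x] - img[y][x]  # vertical gradient
            total += dx * dx + dy * dy
            count += 1
    return total / count

# Illustrative threshold - would need tuning against real frames.
SCORE_THRESHOLD = 50.0

def needs_texture_prompt(img):
    """True if the surface looks too flat/featureless to track reliably."""
    return texture_score(img) < SCORE_THRESHOLD

# Example: a featureless grey patch vs. a checkerboard-like textured patch.
flat = [[128] * 16 for _ in range(16)]
textured = [[255 if (x + y) % 2 else 0 for x in range(16)] for y in range(16)]
```

Polling something like this each frame during acquisition, and showing the “point at something more textured” message only while the score stays below the threshold, would roughly give the behaviour described above - though an SDK-side metric tied to the actual feature detector would obviously be far more reliable.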
The current workaround is just to put up instruction text saying “point at the floor or a desk - a textured surface”, but we’ve no way to tell whether the surface is actually going to be any good. Ideally I’d like to be able to differentiate between these good and bad examples so I can selectively put a message up if necessary: