There are a couple of different things at play here. Let me try to explain.
1. Zapworks Studio 3D import scale
By default, the 3D import wizard in ZapWorks Studio will fit the model into the 2x2x2 bounding box mentioned above. To maintain the units from the original file, uncheck the “Auto Recenter Model” option in the Scale panel of the import wizard, switch from “Scale to Fit” to “Zoom”, and set the zoom value to 100%. The import wizard should look something like this:
You’ll also need to make sure your 3D package isn’t applying any scaling on export, and that you know the native units of your model. Ideally use meters as the native units if possible.
You can always check that your units are as expected in Studio by measuring against a plane object. Planes go from -1 to +1, so they’re 2 units wide and tall. If you think your model should be 1.5 m wide and its native units are meters, drag a plane next to it in the Hierarchy, set its scale to 0.75, and check that it matches the width of your model.
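To make the arithmetic behind that plane-measuring trick explicit, here's a tiny sketch (the helper name is just for illustration, not a Studio API):

```typescript
// Planes span -1..+1, i.e. 2 units across, so the plane scale needed
// to match a target width is simply width / 2.
function planeScaleForWidth(targetWidthUnits: number): number {
  const PLANE_SIZE = 2; // planes go from -1 to +1 in both axes
  return targetWidthUnits / PLANE_SIZE;
}

// If the model should be 1.5 m wide and native units are meters:
console.log(planeScaleForWidth(1.5)); // 0.75
```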
2. World Tracking units
If you’re using World Tracking in the native apps, it’s backed by ARKit and ARCore, and the units reported by the World Tracker are all in meters. So, for example, calling `anchor.setTransformFromCamera(wt, [0, 0, -1]);` on a custom anchor would place it directly in front of the camera, 1 meter away.
Usually anchors are placed by raycasting against the ground plane or other detected planes in the world; those detected planes (and hence the raycast distance) are also all reported in meters.
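Conceptually, that placement is just "walk along the ray by the hit distance". A minimal sketch of the math (hypothetical helper names, not the real tracking API):

```typescript
type Vec3 = [number, number, number];

// A raycast from the camera returns a hit distance in meters; the
// anchor is placed at that point along the (normalized) ray direction.
function pointAlongRay(origin: Vec3, dir: Vec3, distanceMeters: number): Vec3 {
  return [
    origin[0] + dir[0] * distanceMeters,
    origin[1] + dir[1] * distanceMeters,
    origin[2] + dir[2] * distanceMeters,
  ];
}

// A hit 2.5 m straight down the camera's forward (-Z) axis:
console.log(pointAlongRay([0, 0, 0], [0, 0, -1], 2.5)); // [0, 0, -2.5]
```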
3. Instant Tracking units
If you’re using Instant Tracking (i.e. our current World Tracking in the web solutions), then absolute units are not actually available. From a single camera it’s not possible to calculate the absolute scale of things in the world. It’s only possible by combining the camera images with accelerometer data, which we’re working on for the web, but that has a lot of additional challenges over the native APIs.
So how is scale defined in Instant Tracking? It actually comes from the moment the anchor is initially placed or reset relative to the camera. You specify a position relative to the camera, and Instant Tracking then assumes that the surface visible in that direction in the camera image is at the depth you’ve specified. So if you pass [0, 0, -1] for the offset, the anchor is 1 “unit” away from the camera. If the surface in the environment happens to be 1 m away then units are meters, but if it’s 10 cm away then 1 unit = 10 cm.
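That relationship can be written down directly. A quick sketch (hypothetical helper, not a Zappar API) of how many real-world meters one tracking unit corresponds to, and how you'd scale content to a target physical size under that assumption:

```typescript
// If the anchor is placed N units from the camera but the surface it
// "lands on" is actually D meters away, one tracking unit = D / N meters.
function metersPerUnit(surfaceDepthMeters: number, placedDistanceUnits: number): number {
  return surfaceDepthMeters / placedDistanceUnits;
}

// Anchor placed at [0, 0, -1] (1 unit away), surface really 0.1 m away:
console.log(metersPerUnit(0.1, 1)); // 0.1 => 1 unit = 10 cm

// To make content appear 0.5 m across in the real world, scale it to:
console.log(0.5 / metersPerUnit(0.1, 1)); // 5 units
```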
You can think of Instant Tracking as vaguely similar to the Z.camera transform: if you put an object at [0, 0, -1] relative to Z.camera, it will be the same size on screen as the initial size of an Instant Tracked anchor placed at [0, 0, -1]. The difference is that the Instant Tracker will attempt to keep the content attached to the environment as the camera moves.
If you’re targeting the web, right now there’s no way to know the absolute scale of the camera motion and world features, and so no way to set models up at an exact size; everything ends up in relative units based on the distance the anchors are placed from the camera.
If you’re targeting world tracking in the Zappar apps or a Zappar embed component in another app, then you can make your models the correct absolute scale; you just need to be careful about your 3D package export settings and the ZapWorks Studio import auto-scale settings.
Hope that helps clear things up!
As soon as we have a way to determine absolute scale that works on the web, I’ll try to remember to update this thread too.