What is the real world size limit for 3D models?


#1

Is it possible to have an 8.5" x 11" tracking image display a large-scale 3D model? Like having a statue-sized 3D model in your room?


#2

Hi @vu.v.lam,

The scale of your 3D content relative to your tracking image depends on how you’ve set them up in Studio.

Whether the zapcode in the image below is 2cm in diameter or 20cm, the model will be scaled accordingly.
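To make the proportionality concrete, here is a minimal sketch of the arithmetic Seb describes (the function name and parameters are illustrative, not part of Studio): the model’s size is authored relative to the target image, so printing the target at a different size scales the model by the same factor.

```python
# Illustrative only: content size is authored relative to the tracking
# image, so the real-world model size scales with the printed image size.

def real_world_model_size(printed_image_cm: float,
                          authored_image_cm: float,
                          authored_model_cm: float) -> float:
    """Model's real-world size when the target is printed at a new size."""
    scale = printed_image_cm / authored_image_cm
    return authored_model_cm * scale

# A model authored as 10x the size of a 2 cm zapcode:
print(real_world_model_size(2.0, 2.0, 20.0))   # → 20.0 (cm, at a 2 cm print)
print(real_world_model_size(20.0, 2.0, 20.0))  # → 200.0 (same artwork, 20 cm print)
```

So the same artwork printed ten times larger displays the model ten times larger, with no change needed in Studio.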

One thing to remember with tracking images is that, for your tracked content to display correctly, the entirety of the tracking image needs to be in the camera’s view.

Hope this helps.

All the best,
Seb


#3

Thank you Seb, this helps. So I would need an 8-foot tracking image in order to display an 8-foot object and still keep the image within the camera view?

Is there another method, using geolocation with a tracker, to populate a room with 3D models? For instance, I would make a 20’x20’ room and add 3D models at various locations. As the viewer pans around the room, they would see the large 3D objects in AR.


#4

@vu.v.lam What you’re looking for is their ZapBox. But until it can handle more than 10 tracking codes it won’t work for this. There are many of us waiting on this as well.

Steve


#5

Thanks, good to know.


#6

@vu.v.lam Not quite. The scale of your AR content is relative to your tracking image.

The tracking image wouldn’t need to be 1m tall to display a 1m tall model, as the relation between their scales will be defined by you within Studio.

However, as @stevesanerd mentioned, for this type of room-scale implementation you’d need to use our MR ZapBox kit, which has support for larger scale experiences.

You can find more information on ZapBox here.

Hope this helps.

All the best,
Seb


#7

@sebastian - I agree that the scale is set relative to the tracking image.
The question may be: how do you break away from the tracking image and still get to experience the 3D in your space?

I find it tough to put it into words, but I believe I’m getting the concept.

You trigger the experience with a code to establish x0y0z0.
But then you break away from the code as a tracking image and travel through physical space while seeing virtual objects 3, 6, 10 feet around you in that space.

Everything I have made with 360’s uses me as x0y0z0 and follows me around.
Takes a bit of the fun out of it.

Just to be transparent: I’m pretty new at this and keep experimenting to get the most out of the product (Zappar). I feel there are many here that are new to this technology, and the more we can ask questions, the richer and more helpful the forum will become.

Best,
BK!


#8

Hey @mydarndest,

That’s correct. Currently, tracked Studio experiences treat the target image as the origin point that content is positioned relative to.

Real-world tracking, which would allow content to be placed in virtual 3D space, isn’t currently supported by the Zappar app, though our team are working on ARKit integration to allow for these types of experiences.
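A conceptual sketch of the difference (not Zappar’s API; poses are simplified to translations): image-anchored content is re-evaluated from the image’s current pose every frame, while world-anchored content would capture the image’s pose once at trigger time and stay fixed in the room.

```python
# Conceptual sketch only: image-anchored vs. world-anchored content,
# with "poses" reduced to (x, y, z) translations for simplicity.

def image_anchored(image_pose_world, offset):
    # Re-evaluated every frame from the image's current pose,
    # so content moves with the image and vanishes when it's lost.
    return tuple(p + o for p, o in zip(image_pose_world, offset))

def world_anchored(captured_image_pose, offset):
    # Placed once using the pose captured at trigger time, then
    # fixed in the room even if the image later leaves the view.
    return tuple(p + o for p, o in zip(captured_image_pose, offset))

# Trigger: the zapcode is first seen at (1, 0, 2) in the room.
captured = (1.0, 0.0, 2.0)
statue_offset = (3.0, 0.0, 0.0)  # statue 3 m to the right of the code

print(world_anchored(captured, statue_offset))  # → (4.0, 0.0, 2.0)
```

With world anchoring, walking away from the code would leave the statue standing where it was placed, which is what ARKit-style real-world tracking enables.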

I agree, the more discussion we create in the forum, the better for our community 🙂

All the best,
Seb


#9

Thanks @sebastian
It’s nice to know I’m on the right track and my FOMO is for naught.
Onward toward real-world tracking in Zappar AR!
BK!


#10

Thank you everyone, this was informative. Now to try and figure this out.