3D Environment Gyro Experience

Tracking Image

For our last ‘Augmented Portal’ competition, I put together an example project to help inspire your own competition entries.

I’m back with another one for this ‘360 Gyro’ competition, this time with the added bonus of being able to provide the .zpp file so you can explore the project yourselves.

The inspiration behind this example originally came from the fact that our documentation and tutorials cover placing photospheres and videos in an attitudeOrient node, but have never explicitly mentioned that 3D environments are also a great use case for 360 experiences.

This has been further reinforced by the great entries we’ve had from @jvouillon and @bradleycox255, so if you’ve been excited by their entries and want to see how an experience like that can be built, check out the project.

You can scan the code above to see the experience (we made a nice image around the code, but the experience isn’t tracked so don’t worry about printing it off) and you can download the .zpp project here - 360-3d-environment.zpp (17.8 MB)

You can also download all of the assets used in this experience here - 360 3D Assets.zip (25.2 MB)

See below for a description of the idea, the experience, and the project. If you have any questions, feel free to ask :slight_smile:

The Idea

The Zappar Bunny, one of Zappar’s older 3D mascots, once had an experience that let you see into his 2D house. We used this idea of his house as the basis for this 3D version. It also gave us an opportunity to create 3D models for the experience that would fit together and we could share with you.

The Experience

After scanning the code the screen will flash white and you’ll enter the house through the front door. You can then use your phone to look around the fully 3D environment.

There are a couple of points of interactivity: you can tap on the bunny to have him play a waving animation, and there is one more element you can tap on, but I’m going to let you discover that one for yourselves :wink: (hint: it’s glowing)

The Project

To enter the house in-scene you can play the ‘world > intro’ timeline and then use the gyro widget in the bottom left of the 3D view to rotate around.


The majority of the project is made up of 3D models (made by our very own @joe) placed within groups within an attitudeOrient node. The models that are interactive have their own group, in which an invisible plane is placed in front of them to receive pointerdown events and trigger their interaction.
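As a rough sketch of that wiring (the node and timeline names here are hypothetical, and the Studio runtime object is replaced with a tiny stand-in so the snippet is self-contained — inside Studio the plane node from `symbol.nodes` already provides `.on("pointerdown", ...)`):

```typescript
// Sketch of the invisible-hit-plane pattern: a tap on the plane triggers
// the model's interaction. In Studio the plane would come from
// `symbol.nodes`, and the handler would start a timeline.
type Handler = () => void;

class HitPlaneStandIn {
  private listeners: Handler[] = [];

  on(event: "pointerdown", handler: Handler): void {
    this.listeners.push(handler);
  }

  // Simulates a tap landing on the plane (Studio dispatches this for you).
  tap(): void {
    for (const handler of this.listeners) handler();
  }
}

let waveStarted = false;
const hitPlane = new HitPlaneStandIn();

// In Studio the body of this handler would be something like
// `symbol.controllers.bunny.elements.wave.start();` (hypothetical names).
hitPlane.on("pointerdown", () => {
  waveStarted = true;
});

hitPlane.tap();
```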

The ‘Forest’ that appears outside the window is made up of 2D images placed at different positions behind the house to give a sense of depth, and two of the ‘lighting’ images are set to fade in and out in the ‘forest > glow’ timeline.

There is a tracking image in the project but it is solely used for the “seen” event to trigger the intro animation and no content is tracked from it.

There is also a cameraTransform node that is used to alter the focalLength of the experience so that objects do not appear too close to the camera.


Ah! The focalLength, a great addition I was missing. Thanks Mark!

Careful, the cameraTransform does not currently play well with headset mode. When entering headset mode you will want to disable the cameraTransform by setting the relativeTo property of the attitudeOrient back to Z.camera.


Yes, I have seen a previous thread about this issue. Thanks Mark.

Good stuff!

I was planning on doing this for the compo, but not sure if I’ll find the time to finish it. :roll_eyes:

Perhaps the next step is to add on-screen controls for offsetting the attitudeOrient and allowing movement in the scene? I don’t know if this would be too confusing, as users might expect to be able to move with the gyro alone :thinking:


Having hotspots to move through the environment was a consideration at the beginning but given the size of the eventual room, we felt it was a little unnecessary. Certainly recommended for larger environments.

A question: can you use an image for 360º without using the gyroscope sensor? Not all mobiles have that technology. Thank you very much.

Hi there,

There’s no built-in method to do this without the gyroscope, but you could in theory recreate this by using swipe functionality to rotate the image around the camera.
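One way to sketch that swipe-to-rotate mapping: convert the horizontal drag distance (as a fraction of the screen width) into a yaw angle applied to the group holding the 360 image. The function names and the 180°-per-screen scale factor below are illustrative assumptions, not part of the project:

```typescript
// Convert a horizontal drag into a yaw rotation, as a gyro substitute.
// A full screen-width swipe rotates the view by `degreesPerScreen`
// (180 is an arbitrary choice; tune to taste).
function dragToYaw(
  deltaXPixels: number,
  screenWidthPixels: number,
  degreesPerScreen: number = 180
): number {
  return (deltaXPixels / screenWidthPixels) * degreesPerScreen;
}

// Accumulate yaw across pointermove events, wrapping to [0, 360).
let yawDegrees = 0;

function applyDrag(deltaXPixels: number, screenWidthPixels: number): number {
  yawDegrees =
    (yawDegrees + dragToYaw(deltaXPixels, screenWidthPixels) + 360) % 360;
  return yawDegrees;
}
```

The accumulated `yawDegrees` would then drive the rotation of the group that holds the image.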

Hope this helps.

All the best,


Is there some tutorial I can look at for now? Thank you very much.

We don’t have a tutorial or example to share at this point in time, but our documentation on pointermove events is a good place for you to start.

As mentioned by Mark in a different topic:

In general, you’re looking to track the x and y position of the primary pointer as you mentioned and then use a threshold value (minimum distance that needs to be covered) to determine if the user has swiped or not.

You can use a variable to store the last x and y position of the primary pointer and use that to compare to the current position to be able to determine the difference and compare it to the threshold value you set.
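The threshold logic described above can be sketched as a small tracker in pure TypeScript (the class name and the 30px threshold are illustrative; it is independent of any Studio APIs):

```typescript
// Tracks the primary pointer and reports a swipe once it has moved
// more than `threshold` pixels from where it went down.
class SwipeDetector {
  private lastX = 0;
  private lastY = 0;
  private tracking = false;

  constructor(private threshold: number = 30) {}

  // Store the position where the pointer went down.
  pointerDown(x: number, y: number): void {
    this.lastX = x;
    this.lastY = y;
    this.tracking = true;
  }

  // Returns true when the distance covered exceeds the threshold.
  pointerMove(x: number, y: number): boolean {
    if (!this.tracking) return false;
    const dx = x - this.lastX;
    const dy = y - this.lastY;
    return Math.hypot(dx, dy) >= this.threshold;
  }

  pointerUp(): void {
    this.tracking = false;
  }
}
```

You would feed this from the pointerdown/pointermove/pointerup events, and use the dx/dy direction to decide which way to rotate.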

Hope this helps.



Thank you very much, I’ll look at that documentation, and I apologize for my English. It’s a shame the gyroscope sensor doesn’t function on almost 70% of mobiles. Thank you very much, Seb, for your attention.