Hi, I am trying to create a 360 (gyro-oriented) experience in Studio where I can switch between several 360 photos by 'looking' at a specific point. The idea is to have an object placed somewhere in the 360 sphere, and when the user looks at it (for a predefined period of time), the photo scene switches (to another 360 photo with another set of 'pointers'). I am able to create scenes and switch between photos manually (or loop between scenes), but I'm not sure how to drive the script from the 'current' gyro position. This can be done (I saw it in one of the published experiences) but I really can't figure out how. Anyone?
That’s a cool experience, we’d love to see the final project when it’s done.
The AttitudeOrient node uses a device's gyroscope to continually rotate the node so that the content within appears stationary in the world around the user as they rotate the phone. It doesn't, however, give any information on position.
I think what you're looking for is a Raycaster set relativeTo Z.camera (so it follows where the user is pointing their device) and a plane inside the AttitudeOrient group, positioned where you want the interaction to occur.
In the example project attached I've done just that: I've crudely positioned a plane over a TV in the example gyroscope project and have it turn green while the raycaster is intersecting it, returning to white when it no longer does. This could be expanded upon using a timeline that plays an animation and triggers an action at the end, which also avoids the case where the user glances at the object for a split second without intending to transition.
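The dwell-then-trigger logic described above can be sketched in plain TypeScript, independent of the Studio node graph. This is a minimal sketch under assumptions: the class name, the 2-second hold time, and the per-frame update pattern are illustrative, not part of the attached project.

```typescript
// Minimal gaze-dwell trigger: accumulates time while the raycaster
// reports an intersection and fires exactly once after a hold threshold.
// Reset on any glance away, so a split-second look doesn't transition.
class GazeDwellTimer {
  private elapsed = 0;
  private fired = false;

  constructor(private readonly holdSeconds: number = 2) {}

  // Call once per frame with the current intersection state and the
  // frame delta in seconds. Returns true on the frame the threshold
  // is crossed, and false on every other frame.
  update(intersecting: boolean, dt: number): boolean {
    if (!intersecting) {
      this.elapsed = 0;   // user looked away: start over
      this.fired = false;
      return false;
    }
    this.elapsed += dt;
    if (!this.fired && this.elapsed >= this.holdSeconds) {
      this.fired = true;  // switch to the next photo sphere here
      return true;
    }
    return false;
  }
}
```

In Studio you would drive `update` from the raycaster's intersection events rather than a boolean flag, but the timing behaviour is the same.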
Example ZPP - raycaster-gyro.zpp (738.0 KB)
First, thank you very much for your efforts and the detailed reply; this is exactly what I was looking for. Yesterday, after several experiments, I ended up with a very similar setup (although I was triggering the event through the Raycaster's intersectionenter). The problem I had was how to position, scale and trigger the 'plane'. You obviously can't position it inside the 'view' (if I try to change its position using the green/red/blue coordinate arrows, it just relocates itself randomly and shows a 3-dimensional box instead of a plane, image attached). I can see in your example ZPP that your plane is perfectly positioned and oriented; how can this be done?
No worries, happy to help.
The positioning of objects within an AttitudeOrient group can be a little fiddly, but it's something you get used to.
It's useful to know that the photosphere has a default scale of [1,1,1], so setting the plane's position in the properties panel to around 0.8 or -0.8 in X or Z makes it visible, and you can then scale it down using the widget from there.
From there on it's best to use the individual widget arrows for fine-tuning placement, rather than clicking the middle cube, which repositions all three axes at once. The properties panel is also helpful for moving an object along one axis without affecting the others.
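If you'd rather place hotspots numerically than by dragging, the position can be computed from a viewing direction. This is a sketch under assumptions: the function name and the yaw/pitch convention (yaw around Y, pitch up from the horizon, Z forward) are mine; the 0.8 radius matches the tip above about sitting just inside the unit-scale photosphere.

```typescript
// Convert a viewing direction (degrees) to a point just inside the
// unit photosphere, at the suggested 0.8 radius. yaw = 0, pitch = 0
// faces straight down the +Z axis.
function hotspotPosition(
  yawDeg: number,
  pitchDeg: number,
  radius: number = 0.8
): [number, number, number] {
  const yaw = (yawDeg * Math.PI) / 180;
  const pitch = (pitchDeg * Math.PI) / 180;
  return [
    radius * Math.cos(pitch) * Math.sin(yaw), // X: right
    radius * Math.sin(pitch),                 // Y: up
    radius * Math.cos(pitch) * Math.cos(yaw), // Z: forward
  ];
}
```

You could then paste the resulting coordinates into the plane's position in the properties panel.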
The experience works well now, but if I try to view it through a headset, I get a strange result. When headsetManager is combined with cameraTransform, the left-eye view is rotated 90° to the left while the right is rotated 90° to the right (screenshot attached). I can't figure out where this could be adjusted (I tried searching through all the docs in more detail this time).
Hmm, it could be a device issue.
Can you attach your exported project as well as information on the phone you’re using and experiencing this issue with?
You're correct, the issue lies in the cameraTransform being used when activating headset mode.
It's hard to define what should happen in this case; probably the camera transform should be ignored, so what's happening at the moment should be considered a bug.
The workaround for now is to set the relativeTo property of the AttitudeOrient group to Z.camera when headset mode is entered, and back to the camera transform group when it is left.
You can see this in the following project - 360PanoWithHeadsetV2.zpp (2.0 MB)
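In script form, the workaround boils down to picking the right reference transform for the current mode. This is a sketch, not the exact code from the attached project: the function name is mine, and the string values stand in for the actual node references (Z.camera and your cameraTransform group) that you would pass to relativeTo in Studio.

```typescript
// Placeholder names for the two transforms the AttitudeOrient group
// can follow; in Studio these would be actual node references.
type RelativeTarget = "Z.camera" | "cameraTransform";

// While in headset mode the camera transform causes the per-eye views
// to be rotated incorrectly, so fall back to Z.camera; restore the
// camera transform group when headset mode is left.
function attitudeOrientTarget(inHeadsetMode: boolean): RelativeTarget {
  return inHeadsetMode ? "Z.camera" : "cameraTransform";
}
```

You would call this (or equivalent inline logic) from whatever events Studio raises when headset mode is entered and left, and assign the result to the group's relativeTo property.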
Hi Mark - This thread was the closest thing I could find for a question that I have.
Is there a way to create an AR compass in Zappar?
A client approached me asking if we can have an arrow appear that points to true north when the AR experience is activated. When the user moves their phone, the arrow would rotate so that it always points to true north.
Is this possible? Please let me know if you have any ideas.
Right now (as far as I know) Zappar doesn't work with GPS data. If you need this to be real, that may be a problem, but if it only needs to look right, faking it may be doable.
Ideally yes, it would have to be real… specifically the ability to point to true north. Do you have any suggestions for cheating or faking it? Would GPS data be accessible through Universal AR and Unity?
Now that's something I will have to look into.
I would like to say yes, simply because you can in Unity, but I don't know at what cost. A custom app with Zappar is a big-ticket item.
I'll have to look and see what GPS add-ons there are for Unity.
Hi, thanks for responding.
Is this something you'd be interested in looking into? Do you work for Zappar?
At this point I'm trying to learn whether it's even possible, and how much it would cost to program this feature, so I can get an idea of the budget and get back to my client. Another key requirement is that it would be developed for WebAR.
Please let me know what you think.