Streaming 360 Video - Too Soft



I have a project that’s streaming a video from Vimeo. The size of the video is 2048x1024.

For me, the video is a lot sharper when played on Vimeo than when it's played within Zappar. It becomes very soft.

Is there any way to correct this?

Thank you



I suspect you’ll find this isn’t a problem with Zap at all - it’s more an intrinsic issue with 360 video in general. If you want the relatively small “window” you’re pointing the phone/HMD at to look sharp, that adds up to a humongous resolution, because you have to provide the whole 360°x180° frame just in case the viewer decides to turn round and look behind them. It’s hugely wasteful, but that’s the cost of giving the user the freedom to look around :slight_smile:

If you’re watching a 360 video on Vimeo, the FOV - how much of that 360° sphere gets displayed in the frame - is probably much wider than the FOV of a typical phone camera lens, so Zap is showing you a smaller part of the frame, magnified more, hence it looks blurrier.

2048x1024 is very low resolution for a VR video (we tend to work at 6K and up for current-gen VR headsets), but given it has to stream to a phone you may not be able to push it much higher than that. (That said, I’ve been able to stream a 4K square MP4 from Dropbox, but I don’t know how well that will work on older phones.)
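To put rough numbers on why it looks soft, here’s a back-of-envelope sketch. The FOV and screen width are just illustrative assumptions (~60° horizontal phone FOV, ~1080 px screen), not measured Zappar values:

```python
# Rough sharpness estimate for equirectangular 360 video on a phone.
# Assumed numbers (illustrative only): ~60 deg horizontal FOV, ~1080 px screen.

def pixels_per_degree(frame_width_px: float, coverage_deg: float = 360.0) -> float:
    """Horizontal pixel density of an equirectangular frame."""
    return frame_width_px / coverage_deg

def upscale_factor(frame_width_px: float, fov_deg: float,
                   screen_width_px: float, coverage_deg: float = 360.0) -> float:
    """How much the visible slice of the video is stretched to fill the screen."""
    visible_source_px = pixels_per_degree(frame_width_px, coverage_deg) * fov_deg
    return screen_width_px / visible_source_px

print(pixels_per_degree(2048))         # ~5.7 px per degree in the source
print(upscale_factor(2048, 60, 1080))  # ~3.2x stretch - visibly soft
print(upscale_factor(8192, 60, 1080))  # under 1x - looks sharp
```

So with a 2048-wide frame, only ~340 source pixels span the visible 60° window, stretched across the whole screen - which is exactly the softness you’re seeing.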

I know you can alter the FOV of rendered stuff in Zap with…, but I’m not sure this would work with 360 views.

An alternative, which may or may not work depending on your context, is to use less than a sphere. If your subject matter is concentrated in front of the viewer, or is only “above ground level”, use a hemisphere instead. Then at least you aren’t having to stream a ton of video data that never gets seen.

Edit: This has got me thinking. I don’t think there’s a way to adjust the FOV when you’re using the off-the-shelf PhotoSphere (someone please correct me if this is wrong), but you could fake it with a bit of cunning scripting. I don’t have time to look at it right now, but in principle: 360 videos are displayed by mapping them onto a sphere around the viewer. The intended effect is that of an infinitely large sphere - and in practice, as long as the sphere is kept centred on the camera, it will always look infinitely large: if you move the camera forward, the sphere moves by the same amount, so you can never get “nearer” to it.

So that’s the way it’s done: it’s not an infinitely large sphere - it may only be 10 or so “screen heights” in size - but it looks infinite. So if you were to offset the sphere forward, away from you - position still anchored to the camera, but offset forward - the camera would capture more of the sphere. The effect would be a wider FOV, as if you’d zoomed out a bit, which would have the side benefit that more of the video’s pixels are now within the frame, so it’d be slightly sharper. *

I’d explore increasing the resolution and/or cutting the sphere/video down first, but this could be an avenue worth investigating. I’ll have a play when I have a free moment but that … may not be too soon (!)

(*) The reason I don’t think you can do this with a simple relativeTo setting is that you still need the sphere’s rotation to seem locked to the world - the inverse of however the gyro is reporting the phone’s current orientation.


Thanks for the great reply.

I did guess that it’s probably the resolution. I usually render 360 video (for another program) at 8192 wide. The project I’m currently working on is a 3D animation rendered at 4096; Vimeo then downscales it to 2048.

I won’t be able to try a different route this time around due to the project having to be complete by this Friday.

Anyway, thanks once again for your feedback.

All the best