The stereo-360 videos aren't really stereo
I'm working on an Unreal Engine project for VR that relies on some 360 stereo footage previously captured with Open Brush. However, to my dismay, I discovered that the videos produced by Open Brush have identical top and bottom frames! This is easy to verify in Photoshop by overlaying the two halves of any frame. The video doesn't provide a 3D view, just a 360 photosphere, and the result in Unreal Engine is extremely disappointing. Looking more closely at the generation process, I noticed that the .usda file used by the batch process contains this line: uniform token stereoRole = "mono". I ran the batch for two modified .usda files (with "left" and then "right" instead of "mono"), but the frames again came out identical (see the result in stereo-left-right.png).
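The overlay check doesn't require Photoshop; here is a minimal sketch in Python using NumPy (the function name is mine, and it assumes a top/bottom-layout frame):

```python
# Minimal check that the top and bottom halves of a top/bottom
# stereo frame are pixel-identical (the function name is mine).
import numpy as np

def stereo_halves_identical(frame: np.ndarray) -> bool:
    """True if the top half of the frame equals the bottom half."""
    h = frame.shape[0] // 2
    return bool(np.array_equal(frame[:h], frame[h:2 * h]))
```

For a real frame you would load it first, e.g. with Pillow: `np.asarray(Image.open("frame.png"))`.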
In my next test, however, I numerically modified the camera coordinates to shift it 64 mm to the right. The top and bottom frames were again identical to each other, but noticeably different from those in the previous file. I manually combined the frames (taking the top from the initial file and the bottom from the right-shifted one) and obtained a view that, at least when looking forward, is what I need (see the result in stereo-manual.png). Unfortunately, this can't be a real solution: when looking backward, the eyes receive views that are simply swapped, as if you were looking cross-eyed. Instead, multiple captures should be taken, gradually rotating the left/right camera pair around a fixed point and combining the central strips of the resulting images. At least, this is the approach described by the creators of the stereo-panoramic plugin for Unreal, where the results are genuinely 3D.
https://www.unrealengine.com/en-US/tech-blog/capturing-stereoscopic-360-screenshots-videos-movies-unreal-engine-4
Unreal Engine
Capturing Stereoscopic 360 Screenshots and Movies from Unreal Engine 4
Here I'll walk you through Ninja Theory's particular settings and workflow for capturing 360 stereoscopic movies like the one we just launched today for Hellblade: Senua's Sacrifice. Read on to learn more about using the "Stereo Panoramic Movie Capture" feature which works out of the box in UE4.
29 Replies
OK. There's a lot to unpick here. I'll start with the simplest part. Does Open Brush render stereoscopic 360? I just did a quick test on the only render I had handy and there was definitely a difference between top and bottom, as you would expect. However, there wasn't much parallax in the scene so it wasn't very obvious. I'm doing a render now that will hopefully show the left/right shift more clearly.
As far as I know the .usda file is only used for storing the camera path. I'm not sure it's used for anything else.
@andreeaiosif I wanted to check how you're rendering your stereo 360? Are you doing an offline render using the .bat file? If you're doing something else - can you spell out the steps so I can investigate more fully?
OK. So here's a frame from a render I just did processed to show top/bottom differences:
If the top and bottom were identical that would be 50% gray all over.
(Out of interest here's how I did that:
1. Duplicated the layer
2. Shifted it up by 2048 px so the top is exactly over the bottom
3. Inverted it
4. Set opacity to 50%
I guess I could have just used Difference mode but this is an old habit!)
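The steps above can be sketched in code as well; a minimal NumPy version (the function name is mine):

```python
import numpy as np

def top_bottom_difference(frame: np.ndarray) -> np.ndarray:
    """Replicate the Photoshop overlay: invert the bottom half and
    blend it over the top half at 50% opacity. Identical pixels come
    out as mid-gray (~128); differences deviate from gray."""
    h = frame.shape[0] // 2
    top = frame[:h].astype(np.float32)
    bottom = frame[h:2 * h].astype(np.float32)
    inverted = 255.0 - bottom              # step 3: invert
    blended = 0.5 * top + 0.5 * inverted   # step 4: 50% opacity
    return np.round(blended).astype(np.uint8)
```

Anywhere the two eye views match, 0.5·v + 0.5·(255 − v) = 127.5, so a flat gray image means no stereo at all.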
To create the rendered image itself - I did an offline render using option 5:
Internally Open Brush does omnidirectional stereo rendering, which sounds the same as what you describe here:
>"Multiple captures should be taken, gradually rotating the left-right cameras around a fixed point and combining the central strips of the resulting images."
See https://developers.google.com/vr/jump/rendering-ods-content.pdf
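The core idea from that paper, as a minimal 2D top-down sketch (the sign convention for which eye sits on which side of the viewing circle is my assumption):

```python
import math

def ods_ray(theta: float, ipd: float = 0.064, right_eye: bool = True):
    """Omnidirectional-stereo (ODS) ray for azimuth theta (radians):
    each eye's ray origin sits on a circle of diameter ipd, offset
    perpendicular (tangentially) to the view direction. Which eye
    gets which side of the circle is an assumed convention here."""
    r = ipd / 2.0
    s = 1.0 if right_eye else -1.0
    dx, dy = math.cos(theta), math.sin(theta)   # view direction
    ox, oy = s * r * dy, -s * r * dx            # tangential offset
    return (ox, oy), (dx, dy)
```

Sweeping theta over 0..2π and rendering one image column per ray is equivalent to the "combine the central strips" approach, just with a continuum of infinitely thin strips.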
In Open Brush, I recorded a video with a static camera, and then I edited the resulting .usda file to obtain a 6-second video with the camera oriented perfectly parallel to the axes:

matrix4d xformOp:transform.timeSamples = {
    6.0: ( (-1.0, 0.0, 0.0, 0),
           (0.0, 1.0, 0.0, 0),
           (0.0, 0.0, -1.0, 0),
           (1.447783350944519, 2.4609344005584717, 12.606593132019043, 1) ),
}

(please see the clean_21.usda file above). After that, I rendered it offline using HQ_Render.bat with option 5.
Can you try the normal workflow - record a camera path and then run the .bat file?
Let me try a static camera. I didn't even know that generated a .bat file.
OK, you have a significant difference here. Mine is almost gray:
"Almost" proves there's stereo
It would literally be uniformly gray if they were the exact same view
If you're not getting much parallax, I wonder if it's a scale issue, i.e. the scene is too big and nothing gets that close to the camera.
Just checking to see what information is read from the usd file
Yes, you are right. I'm also thinking it's a scale problem. I don't fully understand how OB handles scale, but if it does its calculations on a huge scene relative to the distance between the eyes, that could explain what is happening in my scene.
Yeah. Literally just camera pos, rotation and fov
for each frame
So - it might take the scene scale at save time into account
I'm not 100% sure from the code
Or it might be based on the regular "human" zoom level.
this was my case, that's how I did the render.
How can I find/modify this scale?
I might be wrong. It might not make a difference. What's your scene like at "normal" zoom level?
At regular human scale it's a big scene, but not huge. For example, the tunnel is 4 meters high and the front statue 12 m
Can you send me a frame when the camera is close to either of those?
A snapshot? Or a 360 stereo?
A 360 stereo rendered frame. I'm curious how much parallax is visible
Is your tilt file small enough to DM me?
Excuse me for taking so long - suddenly OB no longer wants to write any .bat or .usda files to Videos, even though I have "SaveCameraPath": true
I think that I have found the mistake: I blindly followed the recommendation from here: https://docs.openbrush.app/user-guide/exporting-videos#changing-eye-scale-on-ods-360-videos
and I changed ScaleEye to 0.1 to avoid clipping.
it's not ok to do that for a big scene, is it?
ah ha!
oh I'm so so sorry!
i didn't know about that
and i didn't spot that was even read from the usd file
so - that's the same as scaling the whole scene except for one thing - clipping
really we should scale the clipping values so they stay in sync with the scene scale - but this workaround looks ok
yeah - I need to add some info to that page mentioning that it will probably also flatten out the stereoscopy
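To put numbers on the flattening - assuming the eye-scale value simply multiplies the effective interpupillary distance, which is my reading of that docs page - a quick sketch:

```python
import math

def disparity_deg(depth_m: float, ipd_m: float = 0.064) -> float:
    """Angular disparity (in degrees) between the two eyes' views
    of a point at depth_m metres, for a given interpupillary distance."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / depth_m))

full = disparity_deg(12.0)                 # 12 m statue, normal 64 mm IPD
scaled = disparity_deg(12.0, 0.064 * 0.1)  # with an eye scale of 0.1
```

With the scale at 0.1, the effective IPD drops to 6.4 mm and the stereo cue shrinks roughly tenfold, effectively flattening anything that isn't very close to the camera.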
Forgive me for wasting your time, please! And thank you very much for help!
No problem. I learned a lot from this!
And we've improved the docs
Later edit: save the scene at the desired scale before doing the offline render, and run some tests with several EyeScale values.
Definitely not a waste of time. Useful information
Nearly got a resolution for this that allows you to create camera paths on the Quest (or other standalone headsets) and open them up on Mac/PC for rendering. Just fixing a few issues related to the Mac desktop build.