Monoscopic 360 rendering
Please add a "monoscopic 360 rendering" option alongside the existing options for offline camera path rendering. ODS rendering is not always the most suitable choice for certain tasks, and it takes too long to render stereo just to keep half of the resulting top-bottom clip.
I'd like to add that, with the assistance of ChatGPT, I have built my own monoscopic 360 Python rendering tool. It is functional, and I can make it available to the community if anyone messages me about it. The tool does the following:
1/ It takes the input.usda file generated by OpenBrush's VideoCameraPath and reconstructs it with FOV=90 and the Y axis stabilized vertically (with X and Z minimally rotated to maintain orthogonality), producing an output file named front.usda;
2/ Starting from front.usda, it generates five other .usda files corresponding to the left, right, up, down, and back views (see the USD sketch after this list);
3/ It renders all six .usda files with OpenBrush's renderCameraPath tool at a resolution of 3641x2048 px;
4/ It crops the six resulting clips to a square format with a side of 2048 px (this step is necessary because of the constrained 16:9 aspect ratio);
5/ It stitches them into a standard 3x2 cubemap;
6/ It converts the cubemap clip into an equirectangular one.
(Steps 4-6 use ffmpeg; see the sketch after this list.)
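For anyone curious how step 2 might look in code, here is a minimal sketch using the USD Python API. This is not the author's actual tool: it assumes the camera prim sits at /VideoCameraPath/Camera (a made-up path; adjust it to your export), that the animated transform is authored as xformOps on that prim so a constant per-face offset can simply be appended, that "FOV=90" means a 90-degree vertical field of view (so the 2048 px image height spans 90 degrees and the square crop in step 4 yields an exact cube face), and that renderCameraPath honours the camera's aperture and focal-length attributes:

```python
from pxr import Usd, UsdGeom

CAMERA_PATH = "/VideoCameraPath/Camera"  # assumption: adjust to the prim in your export

# Constant offsets (degrees) applied on top of the stabilized "front" orientation.
FACE_OFFSETS = {
    "left":  ("Y",  90.0),
    "right": ("Y", -90.0),
    "back":  ("Y", 180.0),
    "up":    ("X",  90.0),
    "down":  ("X", -90.0),
}

def set_vertical_fov_90(cam: UsdGeom.Camera) -> None:
    # In USD, FOV = 2*atan(aperture / (2*focalLength)), so a 90-degree vertical
    # FOV means focalLength == verticalAperture / 2. The full image height then
    # spans 90 degrees, and the square crop in step 4 gives a 90x90 cube face.
    aperture = cam.GetVerticalApertureAttr().Get()
    cam.GetFocalLengthAttr().Set(aperture / 2.0)

def write_face(face: str, axis: str, degrees: float) -> None:
    stage = Usd.Stage.Open("front.usda")
    prim = stage.GetPrimAtPath(CAMERA_PATH)
    cam = UsdGeom.Camera(prim)
    set_vertical_fov_90(cam)

    # Appending a rotate op places it last in xformOpOrder, i.e. it is applied
    # in the camera's local frame -- the camera turns to look left/right/etc.
    xformable = UsdGeom.Xformable(prim)
    if axis == "Y":
        xformable.AddRotateYOp(opSuffix=face).Set(degrees)
    else:
        xformable.AddRotateXOp(opSuffix=face).Set(degrees)

    stage.GetRootLayer().Export(f"{face}.usda")

if __name__ == "__main__":
    for name, (axis, deg) in FACE_OFFSETS.items():
        write_face(name, axis, deg)
```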
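And here is a rough sketch of steps 4-6 as ffmpeg invocations driven from Python. The filenames are illustrative, and the stacking order assumes ffmpeg's default face order for a 3x2 cubemap in the v360 filter (rludfb: right, left, up, down, front, back); if faces land in the wrong place, the filter's in_forder/in_frot options can be adjusted:

```python
import subprocess

FACES = ["right", "left", "up", "down", "front", "back"]  # default v360 c3x2 order
SIZE = 2048

def run(args):
    print(" ".join(args))
    subprocess.run(args, check=True)

# Step 4: crop each 3641x2048 render to a centred 2048x2048 square.
for face in FACES:
    run([
        "ffmpeg", "-y", "-i", f"{face}.mp4",
        "-vf", f"crop={SIZE}:{SIZE}:(in_w-{SIZE})/2:0",
        f"{face}_sq.mp4",
    ])

# Step 5: stitch the six squares into a standard 3x2 cubemap
# (top row: right, left, up; bottom row: down, front, back).
run([
    "ffmpeg", "-y",
    *[arg for face in FACES for arg in ("-i", f"{face}_sq.mp4")],
    "-filter_complex",
    "[0:v][1:v][2:v]hstack=inputs=3[top];"
    "[3:v][4:v][5:v]hstack=inputs=3[bot];"
    "[top][bot]vstack[out]",
    "-map", "[out]", "cubemap.mp4",
])

# Step 6: convert the 3x2 cubemap into an equirectangular projection.
run([
    "ffmpeg", "-y", "-i", "cubemap.mp4",
    "-vf", "v360=c3x2:equirect",
    "equirect.mp4",
])
```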
I would definitely suggest making it available! Even if we end up re-implementing or incorporating it, it might still be superior in some ways to whatever we do.
Thank you for the kind words! I'm really glad to hear that my tool could be of use to the community. It was a fun project to work on with the help of ChatGPT, and I would be immensely happy to know that I have contributed something to this wonderful collective effort that is OpenBrush. The archive is attached; feel free to use it however you see fit. If it's not too much trouble, I would love to hear whether my contribution has been helpful in any way.