Dear visitor, welcome!

This week I visited the Blender Institute and decided to wrap up the multiview project. But since I had an Oculus DK2 with me I decided to patch multiview to support Virtual Reality gadgets. Cool, right?

Oculus DK2 and Gooseberry

Gooseberry Benchmark viewed with an Oculus DK2

There is something tricky about those devices. You can’t just render a pair of panoramas and expect them to work: the image would look great for the virtual objects in front of you, but the stereo eyes would be swapped when you look behind you.

How to solve that? Do you remember the 3D Fulldome Teaser? Well, the technique is exactly the same. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there the software (Cycles) rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance.
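A minimal sketch of that per-pixel logic, restricted to the panorama’s horizontal plane (illustrative Python, not Cycles’ actual code; the function name and the default distances are made up for the example):

```python
import math

def stereo_eye_pair(longitude, interocular=0.065, convergence=1.95):
    """For the pixel at the given longitude (radians), return the two eye
    origins and the toe-in angle of the 'virtual' camera pair.
    2D sketch in the horizontal plane; defaults are illustrative."""
    # View direction for this pixel of the panorama.
    view = (math.sin(longitude), math.cos(longitude))
    # Offset each eye along the axis perpendicular to that direction.
    perp = (view[1], -view[0])
    half = interocular / 2.0
    left_eye = (-perp[0] * half, -perp[1] * half)
    right_eye = (perp[0] * half, perp[1] * half)
    # Rotate (toe in) each camera so both rays meet at the convergence distance.
    toe_in = math.atan2(half, convergence)
    return left_eye, right_eye, toe_in
```

Because the pair is re-derived for every pixel, the eyes end up on the correct side for every viewing direction: looking straight back, the world-space offsets are simply mirrored, which is exactly what a single fixed camera pair fails to do.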


Oculus barrel correction screen shader applied to a view inside the panorama

This may sound complicated, but it’s all done under the hood. If you want to read more about this technique, I recommend this paper from Paul Bourke on Synthetic stereoscopic panoramic images. The paper is from 2006, so there is nothing new under the Sun.
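As for the barrel-corrected view in the screenshot above, that correction is in essence a radial warp of the rendered frame. A sketch of the idea (the polynomial form matches what DK2-era SDKs used, but the coefficients below are placeholders, not any headset’s calibration):

```python
def barrel_warp(x, y, k=(1.0, 0.22, 0.24, 0.0)):
    """Scale a point (x, y, relative to the lens center) by the radial
    polynomial k0 + k1*r^2 + k2*r^4 + k3*r^6. Placeholder coefficients."""
    r2 = x * x + y * y
    scale = k[0] + r2 * (k[1] + r2 * (k[2] + r2 * k[3]))
    return x * scale, y * scale
```

Points far from the lens center get pushed outward, pre-distorting the image so the headset’s lenses undo it.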

If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.

Gooseberry Benchmark Panorama

Top-Bottom Spherical Stereo Equirectangular Panorama – click to save the original image

This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method, but in the meantime that name is growing on me.

I would also like to remark on the relevance of Open projects such as Gooseberry. The always warm and welcoming Gooseberry team just released their benchmark file, which I ended up using for these tests. Being able to take a production-quality shot and run whatever multi-vr-pano-full-thing you may think of on it is priceless.


If you want to try to render your own Spherical Stereo Panoramas, I built the patch for the three main platforms.

* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. So if that’s the case, get a new Blender.

How to render in three steps

  1. Enable ‘Views’ in the Render Layers panel
  2. Change camera to panorama
  3. Panorama type to Equirectangular

And leave ‘Spherical Stereo’ checked (it’s on by default at the moment). Remember to post in the comments the work you did with it!
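For reference, the same three steps can be done from a script (a sketch using Blender’s bpy API; run it inside Blender, and note that the property names, in particular `use_spherical_stereo`, follow the later official releases and may differ in the experimental builds):

```python
# Sketch: the three setup steps as a script using Blender's bpy API.
# Property names follow the 2.76-2.78 era API and may differ in the
# experimental spherical-stereo builds.

def setup_spherical_stereo(scene):
    """Apply the three steps to a Blender scene (run inside Blender)."""
    scene.render.use_multiview = True                 # 1. Enable 'Views'
    cam = scene.camera.data
    cam.type = 'PANO'                                 # 2. Panoramic camera
    cam.cycles.panorama_type = 'EQUIRECTANGULAR'      # 3. Equirectangular
    cam.stereo.use_spherical_stereo = True            # on by default in the branch

try:
    import bpy  # only available inside Blender
    setup_spherical_stereo(bpy.context.scene)
except ImportError:
    pass
```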


Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video, but I can guarantee you that the overall impression from the Gooseberry team was super positive.

Also, this particular feature was the exact reason I moved towards implementing multiview in Blender. All I wanted was to be able to render stereo content for fulldomes with Blender. In order to do that, I had to design a proper stereoscopic 3D pipeline for it.

What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a two-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to Multiview’s completion, we finally get the icing on the cake.

No, wait … the cake is a lie!


  • Multiview Spherical Stereo branch [link] *
  • Multiview: Cycles Spherical Stereo Support Official Patch [link] *
  • Gooseberry Production Benchmark File [link]
  • Support the Gooseberry project by signing up in the Blender Cloud [link]
  • Support further Blender Development by joining the Development Fund [link]

* Time traveller from the future, hi! If the branch doesn’t exist anymore, it means that the work was merged into master.

Nice Oculus

Thanks! This is not mine though 🙂 Oculus is one of the platforms supported by the Blender-VR project, to be presented at IEEE VR 2015 next week.

If you are interested in interactive virtual reality and need an open-source solution for your CAVE, multiple Oculus setup, or video wall, give Blender-VR a visit. I’m participating in the development of a framework built on top of the Blender Game Engine.

Also, if Oculus feels like sending me my own Oculus, I wouldn’t mind. If you do, though, consider sending one to the Blender Foundation as well. I will feel bad when I take the device away from them next week.

Have a good one,


Due to the long review process the patch is not yet in Blender. That said, since there were enough people interested in this feature, I just updated the links above with a more recent build (on top of the current Blender 2.76 RC3).


The build now also supports regular perspective cameras. This is required for cube map VR renders. For those I also recommend an add-on that I was commissioned to build, which renders or simply sets up cube map renders [link].
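At its core, any cube map conversion is a mapping between cube-face coordinates and equirectangular angles. A sketch of that mapping (face names and axis conventions here are illustrative, not necessarily the add-on’s):

```python
import math

def cube_face_to_equirect(face, u, v):
    """Map a cube-face coordinate (u, v in [-1, 1]) to equirectangular
    longitude/latitude. Face naming and axes are illustrative conventions."""
    dirs = {
        "front":  ( u,   v,   1.0),
        "back":   (-u,   v,  -1.0),
        "right":  ( 1.0, v,  -u),
        "left":   (-1.0, v,   u),
        "top":    ( u,   1.0, -v),
        "bottom": ( u,  -1.0,  v),
    }
    x, y, z = dirs[face]
    lon = math.atan2(x, z)                    # longitude around the vertical axis
    lat = math.atan2(y, math.hypot(x, z))     # latitude from the horizon
    return lon, lat
```

Sampling the equirectangular render at these angles for every texel of every face yields the six cube map images.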

Note: remember to change your camera pivot to center.

* Last build update: October 2nd 2015

57 Thoughts on “Gooseberry + Multiview + Oculus = Mind Blown”

  1. Hello. Thanx for this method, but I cannot render a stereo pair like you show. It always renders only one equirectangular image and I don’t see ‘Spherical Stereo’ anywhere.

  2. I finally found how to render it! My first Blender render so far ) But here’s another question: right now this is only available for the Cycles renderer; what about Blender Internal? I’m asking because I want to render some volumetric stuff and particles, but it looks like Cycles doesn’t support this yet.

    • dfelinto on March 22, 2015 at 3:43 pm said:

      This method (not multiview, but this spherical stereo method) is Cycles-only since it requires a per-pixel algorithm.
      The only way of having the equivalent (so you could perhaps mix it with Blender Internal) is to render vertical strips and combine them separately. You can read about that in this thread [link]. Maybe you can ask there for people to share their script.
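      For illustration, the strip approach boils down to one narrow camera pair per vertical slice, each pair rotated around the panorama center (a hypothetical sketch, not a script from that thread):

```python
import math

def strip_cameras(num_strips, interocular=0.065):
    """Bourke-style slice rendering: one narrow camera pair per vertical
    strip, each pair rotated around the panorama center (illustrative)."""
    half = interocular / 2.0
    pairs = []
    for i in range(num_strips):
        yaw = 2.0 * math.pi * i / num_strips
        # Eyes offset perpendicular to each strip's view direction.
        offset = (math.cos(yaw) * half, -math.sin(yaw) * half)
        pairs.append({"yaw": yaw,
                      "left": (-offset[0], -offset[1]),
                      "right": offset})
    return pairs
```

      Rendering each pair with a narrow horizontal FOV and concatenating the strips approximates the per-pixel result, at the cost of rendering the scene many times.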

      • Thanx a lot for the info. Do you have any links where I can read about this per-pixel algorithm? I wish to try to implement this in Unity3D. Maybe point me to where in the Blender sources to explore? Or something else.

        • dfelinto on March 27, 2015 at 10:58 pm said:

          For Unity you can just render the current point of view in real time. Why bother with producing a whole panorama at once?

          That said you could do that with vertical strips (see the link above). But that means rendering the image multiple times.

          As for the per-pixel algorithm, it’s in the kernel_projection.h changes present in the patch [link].

    • Hi Dennis

      I seem to be having the same problem you had, I am only getting one image. Could you give me some tips on how you did it please?

      Thanks a lot.


  3. Pingback: Viewing the Gooseberry Benchmark scene on an Oculus DK2

  4. You write “Enable ‘Views’ in the scene panel”
    but it is in the Render Layers panel

  5. Kevin on March 24, 2015 at 3:33 am said:

    Good gravy I hadn’t heard of this or the Blender VR project until now, but I’ve been dreaming of VR headset integration with Blender ever since I heard of the Oculus Rift. This is so exciting!

  6. If you have Janus VR, someone converted your example into a 3D skybox you can view with Janus. Here’s the link: and the discussion on Reddit here:

  7. Really excited about this new feature.
    Here is a quick test scene I came up with

    A strange thing is that I exported the image as LR, but when I view it with LiveViewRift, I feel more comfortable viewing it as RL.
    I think it could be that the convergence distance is too close?
    I will need to do more testing and get familiar with the setting.

    Definitely looking forward to rendering an animation with this.

    • dfelinto on March 27, 2015 at 11:03 pm said:

      I ran into this issue as well (LR -> RL); I didn’t run further tests though. It should be easy to fix, but it may as well just be the expected result.

  8. “Panorama type to Equirectangular”
    Where is this option under? I can’t find it anywhere.

    • dfelinto on March 27, 2015 at 10:46 pm said:

      After you set your camera to ‘Panoramic’, the next option is the panorama type (the default value being ‘Fisheye Equisolid’).

  9. This is great. I see a few things missing in the stereo side of things, but so much is there, it’s great!
    A near / far clipping plane that shows up to aid you in visualizing when things get too far or too near (and get farther than a specified max disparity, like 1/30th or 1/100th of picture size)
    The ability to turn OFF stereo.
    Rendering both stereo cameras in cycles preview
    Turning a 2 frame motion tracked stereo pair -> stereo camera feature. (probably a lot of work, but would save trillions of dollars every year for matching stereo cameras)

    Anyways, thanks so much for working on this! It’s super fun.

    • dfelinto on March 27, 2015 at 10:53 pm said:

      > The ability to turn OFF stereo.
      Well, you can disable the ‘Views’ option in your scene, but I guess you want to stop previewing everything in stereo. In the Viewport properties you can choose to see only one of the views.

      > A near / far clipping plane that shows up to aid you in visualizing when things get too far or too near (and get farther than a specified max disparity, like 1/30th or 1/100th of picture size)

      The ‘Stereoscopy’ panel in the Viewport properties has the ‘Plane’ and ‘Volume’ options. They should help you (though neither works on a pixel-separation basis).
      My idea is that people can extend the UI to drive the values based on pixel separation, and maybe in the future we integrate that into the official interface.

      > Rendering both stereo cameras in cycles preview

      This is in the far-fetched todo list, but not as simple as I would like it to be 😉

      > Turning a 2 frame motion tracked stereo pair -> stereo camera feature. (probably a lot of work, but would save trillions of dollars every year for matching stereo cameras)

      Mind clarifying that?

      > Anyways, thanks so much for working on this! It’s super fun.

      You’re welcome, thanks for testing it 🙂

      • > Turning a 2 frame motion tracked stereo pair -> stereo camera feature.

        You’d take your stereo pair and motion track (match move) the camera. Even if it’s two separate lenses, it’ll work. In a scene with enough tracking data, you would be able to get an exact interaxial, and what model the cameras are using (toe-in / parallel). It would aid in matching a scene with a real-world camera.

        This is the Blender 3D plugin I used to use, and it had full near / far plane support. You might be able to re-use some of its code.

        > The ability to turn OFF stereo. <

        It would be really nice to have a stereo-off option in the D menu. (Also, how do I activate the grease pencil now?)

  10. Maciej on April 27, 2015 at 6:16 pm said:

    Hi guys – can I download a Multiview branch build for Windows somewhere? I’ve checked Graphicall and couldn’t find any. I’m not that keen on building it myself but would love to test stereoscopic spherical panoramas.


  11. Hello Dalai Felinto,
    I work for a company using 3D Vision stereo for aerial images, using them to measure 3D data with photogrammetric software.
    I like your effort in Blender, trying to integrate different ways of doing stereo.
    We run everything on quad-buffered stereo, but I like your Oculus ideas :-).
    I’ll try to follow you and see what is coming up.

    All the best to you,

    • dfelinto on July 1, 2015 at 5:03 pm said:

      Hi Tom,

      I love your company’s work, by the way. It’s flattering to know that you’re considering using part of my work in your pipeline.

      > We run everything on quad-buffered stereo

      Have you tried your system with Blender 2.75 yet? Quad-buffer is fully supported at the moment.

      Also did you see my experimental VR Viewport addon?

      It was developed for fun initially, but nothing stops it from being used for real work.

      And by the way, I’m currently using Blender Game Engine for visualizing aerial captured geometry, and it looks beautiful.

      Take a look at my latest post (on Planovision) to see the device. This is Blender Game Engine related, more than Blender itself, but I think you may like it.

  12. hey dalai,
    first of all: thank you for working on the implementation of such a great feature… most people still have no idea how absolutely amazing a stereo 360 image / animation looks.

    you wrote that “Any recent build of Blender already has multiview” – this is correct, but not for the “Multiview Spherical Stereo” branch, right? I downloaded the latest builds (gooseberry branch and experimental) but there is no ‘Spherical Stereo’ option. Do you know when this feature will be in the official builds?

    with best regards

    • dfelinto on July 1, 2015 at 4:53 pm said:

      Hi Simeon,
      You are correct.

      Multiview is in the upcoming 2.75, but I still have to finish the “Spherical Stereo” implementation.

      I should do it shortly, but as far as official builds go, you will have to wait until 2.76. That said, once the feature is committed, it should be easy for you to get a semi-official build with this.

      What I can do is to provide an updated build for all platforms as soon as 2.75 is officially out.

      And thanks for your kind words. It makes a world of difference to me knowing that people are benefiting from my code 🙂

      • Hey Dalai,

        thank you for the information. Now I know I wasn’t wrong and the feature is on the Blender roadmap. Currently I work with the experimental version from this post (win64). An updated build on 2.75 would be great as well, but only if it’s not too time-consuming.

  13. Frederick Desimpel on July 22, 2015 at 10:43 pm said:

    Hi Dalai,

    I have a Gear VR; would it be possible to get a ‘stereo cubemap’ from this stereo equirectangular using the compositor? I found a Blender setup that did this for mono panos…

    or even better a camera type that does this directly…

    more info here :

    so I’d have to create two cubemaps from the equirectangular and then put them together in the Carmack format…

  14. Frederick Desimpel on July 23, 2015 at 10:43 am said:

    I almost have the noodle set up to create a stereo cubemap from the stereo equirectangular render for the ORBX viewer.

    I want to test it, but the current stereo build freezes… any update to 2.75, Dalai?

    I’ll post a link here to the noodle when it is ready…

    thank you, Frederick

  15. Frederick Desimpel on July 25, 2015 at 2:52 am said:

    I made a blend file to generate stereo cubemaps for Gear VR from the stereo equirectangular panoramas.

    Maybe it is of use to someone on this thread…

    If something’s not working, leave a message and I’ll look into it.

    greetings, FrederickD

  16. dfelinto on July 29, 2015 at 5:27 pm said:

    Hello all, once again thanks for being interested. I just updated the above links with fresh builds (on top of Blender 2.75a-ish). Enjoy!

    @Frederick, may I suggest something like a pre-rendered EXR to be used with the UV Node?

    Something like the method I used for fisheye rendering from Blender Internal:

  17. Paul on July 29, 2015 at 8:29 pm said:

    Hey! I grabbed the sweet new build, and the panorama camera options have vanished!

    I’m really excited to use this for stereo 3D 360s on youtube!

  18. dfelinto on August 19, 2015 at 1:45 pm said:

    Builds updated again with a fix for bump maps. Be aware that I also fixed an issue where top and bottom were flipped, so you will experience a regression.

  19. Justin on August 19, 2015 at 10:59 pm said:

    Thank you so much for this feature! I can’t wait to try it. I downloaded today’s build (Aug 19), but when I try to render I get an error: “Camera ‘Camera’ is not a multi-view camera.” This happens only when I have RenderLayer/Views enabled and set to Multi-View. It doesn’t matter if I tick left (eye) or right (eye) or both. What am I doing wrong?

  20. Justin on August 19, 2015 at 11:32 pm said:

    Never mind, I found the solution. I needed to create two cameras, position them correctly to account for the interpupillary distance (IPD), then give them names like “MyCamera_L” and “MyCamera_R” (with suffixes matching the “Camera Suffix” that’s configured in RenderLayer/Views).

  21. Hey! Been trying out this feature for the last 24 hours, and I’m very happy with it! I just have two small issues that I thought I might get some insight on:

    1. We have one scene that constantly renders from the same POV, regardless of camera placement. Haven’t been able to figure out why.

    2. The panoramas seem to have an odd vertical “border” on the left and right edges of the final images, which creates an odd vertical line from zenith to nadir when viewed in an HMD. Has anyone else seen this effect, and do they know how to correct it?

    • dfelinto on October 20, 2015 at 1:24 am said:

      Hi Darren,

      1. No idea, can you share the file? Are you using the equirectangular panorama render or the cube map addon method?
      2. Are you using any compositing effect? Anything similar to a blur or vignetting effect will stand out badly in the panorama.

  22. Thank you so much for the fantastic work; I’ve had a great experience with Multiview Spherical Stereo.
    I’ve been having some difficulty with the cube map rendering add-on, unfortunately. It seems like the zenith frame isn’t lining up with the others. Do you have any suggestions for what I might be doing wrong?

    If the add-on temporarily creates Left/Right cameras in every direction and directly renders from those cameras, there will inevitably be stitching errors (unlike the Bourke slice method you linked to above). I checked out the source, but my Python is rusty, so I couldn’t tell if this is the case.

    • dfelinto on October 20, 2015 at 1:27 am said:

      Hi James,
      Thanks for your interest in my work.

      > It seems like the zenith frame isn’t lining up with the others. Do you have any suggestions for what I might be doing wrong?

      Can you reproduce this in a simple file and send it my way?

      > If the add-on temporarily creates Left/Right cameras in every direction and directly renders from those cameras, there will inevitably be stitching errors (…)

      The add-on is meant to be used on top of the spherical-stereo branch with the “Use Spherical Stereo” option on, a perspective lens, Cycles, and the pivot set to center. It shouldn’t cause any stitching errors (that’s what the spherical-stereo branch is for).

      • Justin on March 18, 2016 at 1:39 am said:

        Frederick Desimpel, your Cube Map blend file is awesome! I came across the same issue that James mentioned above. Fortunately the fix is simple: in the “Render Stereo Cube” node editor (top center pane), the top-right-most “Render Layers” input is incorrectly set to “L_right”. It just needs to be changed to “R_right” (by using the Browse Scene button; it’s already in the list). After that change, upon F12, the scene renders correctly.

        Thank you for a clever solution!

      • Justin on March 31, 2016 at 3:55 am said:

        Quick update: When running the CubeMap blend file in Blender 2.77, the final render appears nearly completely white (as if it’s been blown out). Any thoughts? (For now I’ll continue to use 2.76b.) Thanks!

  23. I installed Whirligig and downloaded your Top-Bottom Spherical Stereo Equirectangular render, but it doesn’t work.

    The image loads but it’s distorted. What format should I choose in Whirligig to view the render correctly? None of the standard ones display this properly.

    If you modified Whirligig to display your format, could we get a copy of your modified version?

  24. keithm on February 4, 2016 at 7:34 pm said:

    Hey! Outstanding work! Really well done (and super problem solving). Looking forward to seeing it integrated for real and also whatever great new tricks you have up your sleeve.

  25. Justin on March 18, 2016 at 3:58 am said:

    Hi dfelinto,
    I’ve been using your code in 2.76b to output stereo 3D in equirectangular format. When viewed through Gear VR and DK2, I’ve noticed that my eyes have a hard time focusing on details at certain angles. For example, in a render of a car interior, when attempting to focus on the stick shift down to my right, it becomes uncomfortable. I actually have to cock (i.e. tilt) my head to the right a bit as well and find the sweet spot… otherwise the left image is not parallel to the right image and my eyes cannot converge.

    I don’t know if I’m making any sense, but I wondered if anyone else has noticed similar?

    If I look at stereoscopic renders from OTOY, etc., I don’t see the same problem; I don’t need to tilt my head side to side. I wonder if it’s particular to Blender?
    Is there any chance the algorithms are rotating the cameras on other axes when they shouldn’t (or aren’t when they should), or somehow over/under-compensating the effect at particular angles? (I’m not even sure I’m asking the right question, hah!)

    Just thought I’d pop the question. Thanks!

    • Matthew D. Carson on May 10, 2016 at 6:05 am said:

      I just spent all evening researching why my latest stereo render from Blender has broken perspective, when the previous render of nearly the same scene looked OK!
      Now I see Justin describing my exact issue. It doesn’t even seem consistent: looking to my left and down, I have to tilt my head sideways 90 degrees to get the viewpoints to merge well. Looking to my right and down, however, requires the same tilt but only half as much angle! That is probably due to a non-symmetric scene; these panorama images aren’t easy to mentally unwrap.
      I need to create a ‘ruler’ scene to help quantify what is going on here.
      (Seen on my Gear VR.)

      • dfelinto on June 9, 2016 at 3:13 pm said:

        Have you tried rendering with parallel convergence (either set the convergence distance really high, or use the parallel mode; parallel will only work with a recent build or the upcoming Blender 2.78)?

        I’m not sure about Gear VR, but I’ve heard that YouTube requires this.

    • Justin on May 19, 2016 at 9:27 pm said:

      Sooo, if anyone stumbles across this blog, dfelinto just tweeted that Sergey Sharybin (sergey) has developed a Pole Merge function. That’s great news; I’m assuming it will fix this issue.

  26. keithm on March 31, 2016 at 10:19 pm said:

    So this didn’t get rolled into 2.77? How come?

  27. Justin, yes, that alignment issue is very ‘distracting’ IRL as well, and it’s generally due to a torquing of eye-views upon rotation. Your real eyes have compensatory muscles to resolve this (a conflict of verticality out of the vestibular plane) with matching neural alignment compensation, as do most animals. (Vision science anatomy is my specialty.)

    Check to ensure that cameras parented to center can rotate together in 6D. They need to have a vertical “spring” (like the old gimbaling problem) to avoid conflict, and probably do invisor. There may be a conflict that the eyes are trying to solve that the cameras try to anticipate. In effect, convergence plane may not be rotating: though we know that sounds silly, getting one’s head around this is difficult.
    I’m away from my computer, typing on a phone in an airport, but desperate to understand this problem.

    Dalai, you are amazing. Thanks for all your work.

  28. Jake on July 7, 2016 at 1:57 pm said:

    Hey Dalai, I have a few questions and I hope you have time to answer them. As stated in your article: “The image would work great for the virtual objects in front of you, but it would have the stereo eyes swapped when you look at behind you”. Now the problem: I render the panoramic image in Cycles, then use a panoramic photo viewer downloaded from the Android market to view the picture in VR. I’m not sure what is going on, but the problem where the stereo eyes swap is still happening whenever I look 180 degrees behind the front.

    Any help is greatly appreciated, thanks for your hard work!

    • dfelinto on July 8, 2016 at 12:31 pm said:

      Hi Jake, in order to benefit from the technique presented here you need a copy of Blender 2.78 (yet to be released), as well as to turn Spherical Stereo on in the camera panel. You can get a snapshot of it while waiting for the upcoming release.
