Dear visitor, welcome!

This week I visited the Blender Institute and decided to wrap up the multiview project. But since I had an Oculus DK2 with me I decided to patch multiview to support Virtual Reality gadgets. Cool, right?

Oculus DK2 and Gooseberry

Gooseberry Benchmark viewed with an Oculus DK2

There is something tricky about stereoscopic panoramas, though. You can’t just render a pair of panoramas and expect them to work. The image would look great for the virtual objects in front of you, but the stereo eyes would be swapped when you look behind you.

How to solve that? Do you remember the 3D Fulldome Teaser? Well, the technique is exactly the same. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there the software (Cycles) rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance.
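To make the idea concrete, here is a minimal Python sketch of that per-pixel trick. This is not the actual Cycles implementation; the function name and axis conventions are mine, for illustration only:

import math

def spherical_stereo_ray(u, v, interocular, convergence, eye):
    """Ray origin and direction for one pixel of an equirectangular
    stereo panorama (one eye).

    u, v        -- normalized pixel coordinates in [0, 1]
    interocular -- eye separation, in scene units
    convergence -- distance at which the left/right rays converge
    eye         -- -1.0 for the left eye, +1.0 for the right eye
    """
    # Longitude/latitude for this pixel (Z up)
    phi = (u - 0.5) * 2.0 * math.pi      # -pi .. pi around the vertical axis
    theta = (v - 0.5) * math.pi          # -pi/2 .. pi/2 from the horizon

    # Central viewing direction for this pixel
    direction = (math.cos(theta) * math.cos(phi),
                 math.cos(theta) * math.sin(phi),
                 math.sin(theta))

    # Offset the eye sideways, perpendicular to the view in the horizontal plane
    side = (-math.sin(phi), math.cos(phi), 0.0)
    origin = tuple(eye * 0.5 * interocular * s for s in side)

    # Toe the eye in so its ray passes through the convergence point
    target = tuple(convergence * d for d in direction)
    ray = tuple(t - o for t, o in zip(target, origin))
    length = math.sqrt(sum(c * c for c in ray))
    return origin, tuple(c / length for c in ray)

Because the offset and rotation are recomputed for every pixel, the stereo stays consistent all the way around the panorama, which is exactly what avoids the swapped eyes behind you.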


Oculus barrel correction screen shader applied to a view inside the panorama

This may sound complicated, but it’s all done under the hood. If you want to read more about this technique I recommend this paper from Paul Bourke on Synthetic stereoscopic panoramic images. The paper is from 2006 so there is nothing new under the Sun.

If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.

Gooseberry Benchmark Panorama

Top-Bottom Spherical Stereo Equirectangular Panorama – click to save the original image

This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method, but in the meantime that name is growing on me.

I would also like to remark on the relevance of open projects such as Gooseberry. The always warm and welcoming Gooseberry team just released their benchmark file, which I ended up using for these tests. Being able to take a production-quality shot and run whatever multi-vr-pano-full-thing you may think of on it is priceless.

Builds

If you want to try to render your own Spherical Stereo Panoramas, I built the patch for the three main platforms.

* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. So if that’s the case, get a new Blender.

How to render in three steps

  1. Enable ‘Views’ in the Render Layer panel
  2. Change camera to panorama
  3. Panorama type to Equirectangular

And leave ‘Spherical Stereo’ marked (it’s on by default at the moment). Remember to post in the comments the work you did with it!
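If you prefer to script it, the same three steps can be set from Python. This is a minimal bpy sketch assuming the property names found in recent Blender builds; the patched build may expose them slightly differently:

import bpy

scene = bpy.context.scene
cam = scene.camera.data

# 1. Enable 'Views' (multiview / stereo 3D)
scene.render.use_multiview = True

# 2. Change the camera to Panorama
cam.type = 'PANO'

# 3. Set the panorama type to Equirectangular (Cycles)
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# 'Spherical Stereo' is on by default in the patched build; in recent
# Blender versions it lives in the camera's stereo settings:
cam.stereo.use_spherical_stereo = True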

 

Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video, but I can guarantee you that the overall impression from the Gooseberry team was super positive.

Also, this particular feature was the exact reason I was drawn to implementing multiview in Blender in the first place. All I wanted was to be able to render stereo content for fulldomes with Blender. In order to do that, I had to design a proper 3D stereoscopic pipeline for it.

What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a 2-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to the Multiview completion, we finally get the icing on the cake.

No, wait … the cake is a lie!

Links

  • Multiview Spherical Stereo branch [link] *
  • Multiview: Cycles Spherical Stereo Support Official Patch [link] *
  • Gooseberry Production Benchmark File [link]
  • Support the Gooseberry project by signing up in the Blender Cloud [link]
  • Support further Blender Development by joining the Development Fund [link]

* Time traveller from the future, hi! If the branch doesn’t exist anymore, it means that the work was merged into master.

Nice Oculus

Thanks! This is not mine though 🙂 The Oculus is one of the supported platforms of the Blender-VR project, to be presented at IEEE VR 2015 next week.

If you are interested in interactive virtual reality and need an open source solution for your CAVE, multiple Oculus or video wall, give Blender-VR a visit. I’m participating in the development of a framework built on top of the Blender Game Engine.

Also if Oculus feels like sending me my own Oculus, I wouldn’t mind. If you do, though, consider sending one to the Blender Foundation as well. I will feel bad when I take the device away from them next week.

Have a good one,
Dalai

Update:

Due to the long review process the patch is not yet in Blender. That said, since there were enough people interested in this feature, I just updated the links above with a more recent build (on top of the current Blender 2.76 RC3).

Update:

The build now also supports regular perspective cameras. This is required for cube map VR renders. For this I also recommend an addon that I was commissioned to build, to render or simply set up cubemap renders [link].

Note: remember to change your camera pivot to center.

* Last build update: October 2nd 2015

Hello all,
I’m pleased to announce that the latest version of Blender is out. This is the collaborative effort of a team of developers, which I’m proudly a part of. The 2.65 release is particularly relevant for the dome community due to its complete support for equidistant fisheye lens rendering.

Equidistant fisheye 180°, used for fulldomes. Image by Adriano Oliveira


That includes a series of fixes since the last release, the most noticeable being the equidistant fisheye lens fix I mentioned in an earlier post. This release benefits not only fulldome artists, but also anyone willing to experiment with the equisolid fisheye lens. The image below is what you see from within the working viewport.

And how simple is it to use? For those familiar with the built-in Cycles render engine it is as easy as it gets. For the equidistant fisheye lens all you need to do is set the render dimensions to a square (e.g., 1024 x 1024) and set the field of view angle (usually 180°). For the equisolid fisheye lens all you need to do is set the lens size and one of the sensor dimensions; the other sensor dimension is taken from the render aspect ratio.
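In Python terms, the two setups might look roughly like the sketch below. It assumes current Cycles property names (the 2.65-era API may differ slightly), so treat it as illustrative:

import math
import bpy

scene = bpy.context.scene
cam = scene.camera.data
cam.type = 'PANO'

# Equidistant fisheye: square render dimensions plus a field of view angle
scene.render.resolution_x = 1024
scene.render.resolution_y = 1024
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = math.radians(180.0)

# Equisolid fisheye: lens size plus one sensor dimension
# (the other dimension follows from the render aspect ratio)
#cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
#cam.cycles.fisheye_lens = 10.5    # focal length in mm (example value)
#cam.sensor_width = 23.7           # sensor dimension in mm (example value)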


Equisolid fisheye, 3d viewport preview. Image by Pataz Studio – www.patazstudio.com

For the complete release information, please visit the official Blender 2.65 Release Log.

For the Fisheye Lens specific info, check:

Blender is under 60MB to download, free, and more than capable of handling small to medium-sized productions. Go get it! I hope it helps more fulldome productions in the future.

Enjoy it,
Dalai

I’m just back from Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and, why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, one particular meeting was long overdue: I finally had the chance to meet Paul Bourke in person.

We collaborated (meaning: he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then there isn’t a fisheye-related question that I don’t bounce off him first. So, in between talks, he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders from a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3D projectors at hand). Thankfully, on my flight back I had the peace of mind to wrap it up.

3D model The White Room courtesy of Jay-Artist, shared on blendswap.com

As a teaser, this is all you get for now. More on that later 😉

Today was a meditative day to once again celebrate the passage of our Sun across the same exactly-ish point in the sky. Yup, I’m getting old, but hey, don’t we all? As my self-birthday gift I decided to face the hills from home to work with my laptop in the backpack (a big thing for me; I would always take the bus when I need the laptop, which means almost every day).

Equirectangular Color Map Twisted

To make the self-celebration even more complete, I decided not only to work on my physical health, but also to give my mind some food for thought. In the past week, professor Adriano Oliveira was troubleshooting the Cycles fisheye camera for his fulldome audiovisual work. He noticed that the equidistant lens was far off from the expected result (for example, compare it with the BGE fisheye lens), and even the equisolid lens (which is great for simulating real fisheye lenses, like Nikon, Canon, …) wasn’t producing a perfect fisheye.

Equirectangular Color Map

We have been debating that, and in the end he found some nice online references with different formulas per lens type. So today I thought it was a good time to get down to the code. What you see next is the comparison between the wrong and the corrected equidistant fisheyes, the equirectangular test map I used (also known as the Blender UV Color Grid ;)), and something I’m passionate about now: using drawing software to do math. You can delete things, move them around, color them … it’s perfect 😉

 

 

So have fun with the fixed “fulldome” mode. Remember, the mode can go beyond 360 degrees if you want to make some really cool images 😉
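For the curious, the corrected equidistant mapping boils down to something like this. It is a Python transcription for illustration, not the actual kernel code:

import math

def fisheye_equidistant_to_direction(u, v, fov):
    """Map normalized image coordinates (u, v in 0..1) of a square render
    to a viewing direction for an equidistant fisheye of angle 'fov'."""
    u = (u - 0.5) * 2.0
    v = (v - 0.5) * 2.0

    r = math.sqrt(u * u + v * v)
    if r > 1.0:
        return None                     # outside the fisheye circle

    # Equidistant lens: the angle from the view axis grows linearly with r
    theta = r * fov * 0.5
    phi = math.atan2(v, u)

    # x points along the view axis, y/z span the image plane
    return (math.cos(theta),
            -math.cos(phi) * math.sin(theta),
            math.sin(phi) * math.sin(theta))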

Saving trees and abusing my tablet 😉

Now, something nice … I met prof. Adriano last year at the BlenderPRO (the Brazilian Blender Conference). He may have exaggerated, but he told me that the main reason he went to the BlenderPRO was to meet me. It seems that it definitely paid off. I’m not saying I wouldn’t fix this bug if someone else had reported it, but it’s much more interesting to work with someone you have met.

And why am I saying that? Well, next week we have a new edition of the BlenderPRO. So if you can make it to Brasília, don’t think twice. It’s going to be an amazing event, and I hope to see you there.

And happy birthday to me 🙂

Some time ago Paul Bourke sent me some images he captured with the Red Scarlet and a 4.5mm lens. The result is really impressive: he can get a crystal-clear 4K recording at 30fps. Below you can see one of his images:

Red Scarlet sample photo – credits Paul Bourke + synthetic elements by yours truly

Wait, what is Suzanne doing there?

Ok, that’s not really his original capture. I wanted to explore how it would be to insert virtual elements into a fisheye image. It shouldn’t be much different from integrating synthetic elements into a panorama (the topic of some previous blog entries and a paper waiting for approval 😉 – more on that later this year). And as it turned out, it’s ‘straightforward’-ish enough.

First take a look at the original image:

Red Scarlet sample photo – credits Paul Bourke

This is a cropped image, expanded vertically to fill the 180° FOV (field of view). This arrangement of camera + lens + 4K gives you neither a full-frame fisheye nor a circular fisheye. As a curiosity, the Red Scarlet can capture a complete 180° fisheye circle if the photo is taken in 5K. However, you can’t get a 30fps movie capture at that resolution.

In order to use the IBL Toolkit for the scene reconstruction I first generated a full panorama (360×180) out of the original fisheye photo. I used the open source tool Hugin for that.

Be aware that Hugin has a bug in the calculation of the focal length multiplier for equisolid fisheye lenses (basically it’s using the full-frame fisheye calculation for all its fisheye modes). If you know someone involved in the Hugin/Panotools project, please point her/him to this patch. As far as I can tell the fix is along these lines. I couldn’t manage to compile Hugin though, so I don’t feel like sending an untested patch to their tracker.
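To see why the projection model matters, compare the image radius each model predicts for a ray at the same angle. This is a small illustrative check with my own example numbers, not anything taken from the Hugin code:

import math

def radius_equisolid(f, theta):
    # image radius (mm) of a ray at angle theta for an equisolid fisheye
    return 2.0 * f * math.sin(theta / 2.0)

def radius_equidistant(f, theta):
    # the same, for an equidistant fisheye
    return f * theta

f = 4.5                    # mm, the lens used on the Red Scarlet
theta = math.radians(90)   # a ray 90 degrees off-axis (edge of a 180 degree view)

print(radius_equisolid(f, theta))    # ~6.36 mm
print(radius_equidistant(f, theta))  # ~7.07 mm

Fit a scale factor with the wrong projection and the multiplier comes out slightly off, which is why the value below had to be eyeballed.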

Back on topic … this is the image I got from Hugin (using 4.5 as the lens and 2.35 as the scale factor for equisolid – 2.35 was eyeballed because I couldn’t find the sensor size of the Red Scarlet’s 4K capture anywhere on the internet; and remember, once they fix the software the input would have to be different):

360×180 fullpanorama

 

Once I got the full panorama, the rest was a piece of cake. This scene is perfect for the IBL Toolkit (the square in the front plane is practically screaming “Calibrate with me !!11!!”).

Blender IBL Toolkit in Action

And a render from a different angle (where glitches are expected). I used the Project UV option of the IBL Toolkit to project the corresponding UVs of the panorama onto the subdivided meshes.

Extra ‘render’ – more a behind the scenes shot instead

 

Final considerations:

  • I really wish Blender had a shadow-only shader to help integrate support meshes, synthetic elements and a background plate.
  • I’m pretty sure Blender Institute crew worked nicely around that for the Tears of Steel project. I’m still waiting for them to finish the movie and release the files though.
  • The lighting is indeed bad here because the original plate was an LDR, not an HDR, so I didn’t have the lighting of the scene (and didn’t want to bother recreating it – thus you see no shadows on the original scene’s support elements).
  • If I had the HDR I would use Luxrender (AR Luxrender actually) for the render 🙂
  • IBL Toolkit should be called Pano something instead, anyways 😉
  • I forgot to say that the final render was only possible due to the Fisheye Lens in Cycles, a patch that I wrote on top of Brecht’s original full panorama code and that is already in trunk (it will be available in Blender 2.64).
  • In fact I’m sure I could have fisheye implemented as an input option for the IBL Toolkit (discarding the need for Hugin). That would help to output the content in the exact same position as the original camera (if you put them side by side you can see they have a slightly different orientation).

I’m planning to present a complete framework for working with panoramas and virtual elements at the Blender Conference this year. Even though this is based on my work with Aldo Zang (using Luxrender, not Blender), I think it can help to inspire possible solutions for Blender. So fingers crossed for the presentation to be accepted, and I hope we can make it interesting. The original paper (submitted to CLEI 2012) goes by the name:

Production framework for full panoramic scenes with photo-realistic augmented reality

 

So stay tuned (and enjoy the summer, Vancouver is finally sunny o/),
Dalai

(thanks to Paul Bourke for authorizing the re-use of his image; consider giving his website a visit, it’s one of those corners of the internet that will keep you busy for a long time)

* updated on 23.05.2012 *

I’m writing an addon to help with scene reconstruction and lighting with IBL files. It also works as a handy workflow to expand panoramas with rendered elements.

It’s still in its beta version and I’m only now evaluating it in production, so things will likely change.

quick render test

However if you want to take a first glance at it you will need:

Some screens:

Blender: top right, Cycles rendering the background; top left, the addon adding the background with a GLSL shader

ARLuxrender at work

Note: my goal is to use ARLuxrender as the final renderer, but all the modelling and editing is to be done inside Blender.

The original teaser with old screenshots can be found here: http://www.dalaifelinto.com/?p=377

For further discussions you can visit the Blender Artists forum thread as well.

 

If you do some real testing with it, please let me know your thoughts.

Dalai

Project developed with Aldo Zang at the Visgraf lab

Not by domes alone does an artist live. The fisheye mode shown in the previous post is sufficient for planetarium productions, but may not satisfy artists looking for ‘real’ fisheye lenses. The most common lenses on today’s market are ‘equisolid’ lenses (ref: HDRI for CGI).

After a productive holiday we now have something new to play with 😉

model by Jay-Artist


Models/scene kindly provided by Jay-Artist
http://www.blendswap.com/blends/author/jay-artist/

 

The proof is in the pudding

It’s “easy” to put a nice artistic effect together and simply assume everything is working as it should. However, I wanted to make a system that could match the effect produced by real lenses. For that I had to build a fail-proof experiment. People not into coding may not know it, but building a reliable testing and debugging setup is one of the keys to efficient coding. In my opinion it’s always worth putting time into this. (Note: this is also why Blender users can help a lot with bug fixing, simply by building proper test files for the carefully and methodically reported bugs.)

1 – Photoshooting

Take some pictures with a tripod, rotating the camera around its centre (the focal centre, actually). We had been doing this for the past two weeks, so it went smoothly. These pictures were taken by Aldo Zang in the Visgraf Lab at IMPA.

2 – Stitching

I don’t get tired of recommending Hugin for stitching and panorama making – hugin.sourceforge.net. This open source project sometimes works even better than Autopano Pro (a pretty good commercial alternative).

 

3 – Rendering

I put the panorama up as a background plate, calibrated the alignment, and added a simple floor + spheres. This was done with the (yet to be released) IBL Toolkit. Apart from that, my Blender camera needs to match the settings of the real camera + lens I’m aiming at.

In this case all the pictures for the stitching were taken with a Nikon D2Xs and a 10.5mm fisheye lens. I created new presets for the sensor dimensions and the render output.

4 – Results

I was quite pleased when I compared the rendered output with the original image. The only aspects we should be looking at are the field of view and the line distortion across the lens:

 

Also note the bottom left corner of the photo. This subtle darkening is due to the vignetting of the lens. It is not present in the Cycles render because I’m not implementing a real camera model (as shown here and here).

Algorithm

The complete patch can be found here. The core is the following function (simplified here). I elaborated it from the ‘classic’ equisolid fisheye formula: radius = 2 * lens * sin(angle / 2):

__device float3 fisheye_equisolid_to_direction(
float u, float v, float lens, float width, float height)
{
    /* center the coordinates and convert to sensor dimensions */
    u = (u - 0.5) * width;
    v = (v - 0.5) * height;

    /* distance from the image center */
    float r = sqrt(u*u + v*v);

    /* invert the equisolid formula: r = 2 * lens * sin(theta / 2) */
    float phi = acos(u/r);
    float theta = 2 * asin(r/(2 * lens));

    if (v < 0) phi = -phi;

    return make_float3(
        cos(theta),
        -cos(phi)*sin(theta),
        sin(phi)*sin(theta));
}

I hope you like it. If you can share any work you did with this feature I would love to see it.
Dalai

What if we could render fisheye images directly from Blender? Yesterday I found out about the equirectangular mode in Cycles. It got me quite thrilled (I had been waiting for something like that for a while).

This is only possible because Cycles is a full ray-tracing render engine: every pixel in the image is generated from a ray going from the camera to somewhere in the scene. Enough talking. A quick hack in the code and ta-da:

 

IBL background plate + ibl toolkit (alignment addon) + cycles 'use panorama' + fisheye patch

And the nice thing is, it previews in 3D just as well:

What comes next? I will talk with Brecht to see if there is any pending design for having this implemented as another camera option. I would like to have an option to set the angle (so we are not restricted to 180-degree fisheyes), and to toggle between hemispherical and angular fisheye modes.

If you compile your own Blender and want to try the patch, get it here or:

Index: intern/cycles/kernel/kernel_montecarlo.h
===================================================================
--- intern/cycles/kernel/kernel_montecarlo.h    (revision 45899)
+++ intern/cycles/kernel/kernel_montecarlo.h    (working copy)
@@ -215,13 +215,29 @@
 
 __device float3 equirectangular_to_direction(float u, float v)
 {
+   u = (u - 0.5f) * 2.f;
+   v = (v - 0.5f) * 2.f;
+
+   float r = sqrt(u*u + v*v);
+   float theta = acosf(u/r);
+
+   if (v < 0.f) theta = -theta;
+
+   return make_float3(
+       sqrtf((1.f - r*r)),
+       -cosf(theta)*r,
+       sinf(theta)*r
+   );
+
+
+/*
    float phi = M_PI_F*(1.0f - 2.0f*u);
    float theta = M_PI_F*(1.0f - v);
-
    return make_float3(
        sin(theta)*cos(phi),
        sin(theta)*sin(phi),
        cos(theta));
+*/
 }
 
 /* Mirror Ball <-> Cartesion direction */

 


Dalai
* IBL from HDR Labs
* IBL Toolkit explained here.

What do you do when you have two idle projectors next to your computer? The answer is obviously a high definition projection area to be filled with lo.v.e. (lots of valuable experiments).

Two short throw projectors in one seamless desktop

I’ve been following the work of the Vision3D lab since 2009. This lab in Montreal is specialized in computer vision (fundamental and applied research on the three-dimensional aspects of computer vision). Led by Sébastien Roy, they have been producing (and sharing!) work on the calibration of projection surfaces (e.g. domes o/), multiple projector systems, and content toolsets.

lt-align manual calibration process

The Vision3D lab’s main tool in that area is Light Twist. This tool was presented at LGM 2009 with a live showcase of the system on a cylinder. Last week I tried to get Light Twist going with a multi-projector system (aiming to use this for a dome later on), but so far I’m stuck on the playback of content (and I suspect the calibration stage is wrong). Anyway, Light Twist will be the topic of another post, once I get it up and running.

Plugin enabled – video in the middle of the screens, desktop working normally

Since 2009 the Light Twist project has shifted its focus from labs to end users. In 2011 they presented two new projects called lt-align and lt-compiz-plugin. lt-align is a tool to quickly calibrate the screen alignment, and it is very easy to use.


The Compiz plugin requires some fooling around with Ubuntu settings, but once things are in place it works like a charm. I’ve yet to make it work with Unity, so I can have real fullscreen across the desktops.

Recording of the alignment process and video playback

Elephants Dream – Stitched Edition 😉

Note: there is an extra package you need to compile the lt-compiz-plugin: `sudo apt-get install compiz-plugins-main-dev`. I didn’t have to restart Compiz with ccp to make it work. Also, I changed the shortcuts to start the plugin because Alt+F* were taken by other OS commands.

Time to make it real and project onto a large wall

In this picture you can see Djalma Lucio, the sysadmin who oversees all the computer installations at Visgraf/IMPA. A great professional and a very funny guy to work with. Think of someone who actually enjoys opening an xorg.conf file. On the right you can also see Aldo Zang. Check out his ARLuxRender project: a plugin system for LuxRender “which allows to render scenes with mixtures of real and virtual objects directly, without post-processing”.

I hope to post more in the coming months on domes, projections, and a special video project … 😉 I went on a 3-month leave from my work at UBC to join the research lab at Visgraf/IMPA, under the coordination of prof. Luiz Velho. This is only the second week, but it has already been a great experience. And above all, it’s nice to be back home (Rio de Janeiro, Brazil).

Happy Twisting,
Dalai