I’m just back from Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and, why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, one particular meeting was long overdue: I finally had the chance to meet Paul Bourke in person.

We collaborated (meaning he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then, there isn’t a fisheye-related question that I don’t bounce off him first. So, in between talks, he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders in a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3D projectors at hand). Thankfully, on my flight back I had the peace of mind to wrap that up.

3D model “The White Room” courtesy of Jay-Artist, shared on blendswap.com

As a teaser, this is all you get for now. More on that later 😉

Today was a meditative day to once again celebrate the passage of our Sun across the exactly-ish same point in the sky. Yup, I’m getting old, but hey, don’t we all? As my birthday gift to myself I decided to face the hills from home to work with my laptop in the backpack (a big thing for me; I would always take the bus when I needed the laptop, which means almost every day).

Equirectangular Color Map Twisted

To make the self-celebration even more complete, I decided not only to work on my physical health, but also to give my mind some food for thought. In the past week, professor Adriano Oliveira was troubleshooting the Cycles fisheye camera for his fulldome audiovisual work. He noticed that the equidistant lens was far off from what was expected (compare it, for example, with the BGE fisheye lens), and that even the equisolid lens (which is great for simulating real fisheye lenses, like Nikon, Canon, …) wasn’t producing a perfect fisheye.

Equirectangular Color Map

We had been debating that, and in the end he found some nice online references with different formulas per lens type. So today I thought it was a good time to get down to the code. What you see next is the comparison between the wrong and the corrected equidistant fisheyes, the equirectangular test map I used (also known as the Blender UV Color Grid ;)), and something I’m passionate about now: using drawing software to do math. You can delete things, move them around, color them … it’s perfect 😉

 

 

So have fun with the fixed “fulldome” mode. Remember, the mode can go beyond 360 degrees if you want to make some really cool images 😉

Saving trees and abusing my tablet 😉

Now, something nice … I met prof. Adriano last year at the BlenderPRO (the Brazilian Blender Conference). He may have exaggerated, but he told me that the main reason he went to the BlenderPRO was to meet me. It seems it definitely paid off. I’m not saying I wouldn’t have fixed this bug if someone else had reported it, but it’s much more interesting to work with someone you’ve met.

And why am I saying that? Well, next week we have a new edition of the BlenderPRO. So if you can make it to Brasília, don’t think twice. It’s going to be an amazing event, and I hope to see you there.

And happy birthday to me 🙂

Not by domes alone does an artist live. The fisheye mode shown in the previous post is sufficient for planetarium productions, but may not satisfy artists looking for a ‘real’ fisheye lens. The most common fisheye lenses on today’s market are ‘equisolid’ lenses (ref: HDRI for CGI).

After a productive holiday we now have something new to play with 😉

model by Jay-Artist


Models/scene kindly provided by Jay-Artist
http://www.blendswap.com/blends/author/jay-artist/

 

The proof is in the pudding

It’s “easy” to put a nice artistic effect together and simply assume everything is working as it should. However, I wanted to make a system that could match the effect produced by real lenses. For that I had to build a fail-proof experiment. People not into coding may not know this, but building a reliable testing and debugging setup is one of the keys to efficient coding. In my opinion it’s always worth putting time into this. (Note: this is also why Blender users can help a lot with bug fixing, simply by building proper test files for their carefully and methodically reported bugs.)

1 – Photo shoot

Take some pictures with a tripod, rotating the camera around its centre (the focal centre, actually). We had been doing this for the past two weeks, so it went smoothly. These pictures were taken by Aldo Zang at the Visgraf Lab at IMPA.

2 – Stitching

I never get tired of recommending Hugin for stitching and panorama making – hugin.sourceforge.net. This open source project sometimes works even better than Autopano Pro (a pretty good commercial alternative).

 

3 – Rendering

I put the panorama in as a background plate, calibrated the alignment, and added a simple floor + spheres. This was done with the (yet to be released) IBL Toolkit. Apart from that, my Blender camera needed to match the settings of the real camera + lens I was aiming at.

In this case all the pictures for the stitching were taken with a Nikon D2Xs and a 10.5mm fisheye lens. I created new presets for the sensor dimensions and the render output.

4 – Results

I was quite pleased when I compared the rendered output with the original image. The only aspects we should be looking at are the field of view and the line distortion across the lens:

 

Also note the bottom-left corner of the photo. This subtle shadowing is due to the vignetting of the lens. It is not present in the Cycles render because I’m not implementing a real camera model (as shown here and here).

Algorithm

The complete patch can be found here. The core is the following function (simplified here), which I derived from the ‘classic’ fisheye equisolid formula: radius = 2 * focallength * sin(angle / 2):

__device float3 fisheye_equisolid_to_direction(
float u, float v, float lens, float width, float height)
{
    /* remap from [0, 1] to the sensor plane, centred at the origin */
    u = (u - 0.5f) * width;
    v = (v - 0.5f) * height;

    float r = sqrtf(u*u + v*v);

    /* the image centre maps straight ahead (avoids dividing by zero) */
    if (r == 0.0f)
        return make_float3(1.0f, 0.0f, 0.0f);

    float phi = acosf(u/r);
    float theta = 2.0f * asinf(r/(2.0f * lens));

    if (v < 0.0f)
        phi = -phi;

    return make_float3(
        cosf(theta),
        -cosf(phi)*sinf(theta),
        sinf(phi)*sinf(theta));
}

I hope you like it. If you can share any work you did with this feature I would love to see it.
Dalai