Match made in e-heaven

Story originally published on blender.org on July 28th, 2016.

Meet e-interiores. This Brazilian interior design e-commerce startup completely reinvented its creation process. This tale will show you how Blender made this possible, and how far we got.

We developed a new platform based on a semi-vanilla Blender, Fluid Designer, and our own pipelines. Thanks to the results we accomplished, e-interiores was able to consolidate a partnership with the giant Tok&Stok, providing a complete design of a room in 72 hours.


An artist cannot live on domes alone. The fisheye mode shown in the previous post is sufficient for planetarium productions, but may not satisfy artists looking for a ‘real’ fisheye lens. The most common lenses found on today’s market are ‘equisolid’ lenses (ref: HDRI for CGI).

After a productive holiday we now have something new to play with 😉

Models/scene kindly provided by Jay-Artist:
http://www.blendswap.com/blends/author/jay-artist/


The proof is in the pudding

It’s “easy” to put together a nice artistic effect and simply assume everything is working as it should. However, I wanted to make a system that could match the effect produced by real lenses. For that I had to build a fail-proof experiment. People not into coding may not know it, but building a reliable testing and debugging setup is one of the keys to efficient coding. In my opinion it’s always worth putting time into this. Note: this is also why Blender users can help a lot with bug fixing, simply by building proper test files for the (also carefully and methodically) reported bugs.

1 – Photoshooting

Take some pictures with a tripod, rotating the camera around its center (the focal center, actually). We had been doing this for the past two weeks, so it went smoothly. These pictures were taken by Aldo Zang at the Visgraf Lab at IMPA.

2 – Stitching

I never get tired of recommending Hugin for stitching and panorama making – hugin.sourceforge.net. This open source project sometimes works even better than Autopano Pro (a pretty good commercial alternative).


3 – Rendering

I put the panorama up as a background plate, calibrated the alignment, and added a simple floor + spheres. This was done with the (yet to be released) IBL Toolkit. Apart from that, my Blender camera needs to match the settings of the real camera + lens I’m aiming at.

In this case all the pictures for the stitching were taken with a Nikon D2Xs and a 10.5mm fisheye lens. I created new presets for the sensor dimensions and the render output.
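
As an aside, this kind of setup can also be scripted. Here is a minimal Python sketch of what the matching camera could look like (the cam.cycles properties are where Cycles exposes the panorama settings; newer Blender releases moved them onto the camera datablock itself, and the sensor dimensions and resolution below are the DX-format values I assume for this camera, so double-check them against your own):

import bpy

cam = bpy.context.scene.camera.data

# panoramic camera using the equisolid fisheye model
cam.type = 'PANO'
cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
cam.cycles.fisheye_lens = 10.5  # focal length in mm, matching the real lens

# sensor preset (assumed DX-format dimensions, in mm)
cam.sensor_width = 23.7
cam.sensor_height = 15.7

# render output with the same aspect ratio as the sensor (assumed values)
scene = bpy.context.scene
scene.render.resolution_x = 4288
scene.render.resolution_y = 2848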

4 – Results

I was quite pleased when I compared the rendered output with the original image. The only aspects we should be looking at are the field of view and the line distortion across the lens:

[comparison of the original photo and the Cycles render]

Also note the bottom left corner of the photo. This subtle shadowing is due to the vignetting of the lens. This is not present in the Cycles render because I’m not implementing a real camera model (as shown here and here).

Algorithm

The complete patch can be found here. The core is the following function (simplified here). I elaborated it from the ‘classic’ fisheye equisolid formula: radius = 2 * focallength * sin(angle / 2):

__device float3 fisheye_equisolid_to_direction(
float u, float v, float lens, float width, float height)
{
    /* map normalized coordinates to offsets from the sensor center, in mm */
    u = (u - 0.5) * width;
    v = (v - 0.5) * height;

    /* radial distance from the image center */
    float r = sqrt(u*u + v*v);
    /* angle around the optical axis (guarded against r == 0 at the center) */
    float phi = (r > 0.0f)? acos(u/r): 0.0f;
    /* invert r = 2 * lens * sin(theta / 2) */
    float theta = 2 * asin(r/(2 * lens));

    if (v < 0) phi = -phi;

    return make_float3(
        cos(theta),
        -cos(phi)*sin(theta),
        sin(phi)*sin(theta));
}

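If you want to play with the math outside the kernel, here is the same mapping as a small self-contained Python sketch (my own mirror of the function above, not part of the patch):

import math

def fisheye_equisolid_to_direction(u, v, lens, width, height):
    # u, v are normalized image coordinates in [0, 1];
    # lens, width and height are in millimeters
    u = (u - 0.5) * width
    v = (v - 0.5) * height

    # radial distance from the image center
    r = math.sqrt(u * u + v * v)
    phi = math.acos(u / r) if r != 0.0 else 0.0
    if v < 0:
        phi = -phi

    # invert r = 2 * lens * sin(theta / 2); pixels outside the image
    # circle (r > 2 * lens) would need clamping in a full implementation
    theta = 2.0 * math.asin(r / (2.0 * lens))

    return (math.cos(theta),
            -math.cos(phi) * math.sin(theta),
            math.sin(phi) * math.sin(theta))

# the image center maps to the camera axis, approximately (1, 0, 0)
print(fisheye_equisolid_to_direction(0.5, 0.5, 10.5, 23.7, 15.7))
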
I hope you like it. If you share any work you do with this feature, I would love to see it.
Dalai

My love for photomatching goes back a long way.
Back in 2007 I did this project using the fantastic SketchUp Photo Match:

I used 20 photographs, a blueprint of a cross section, and a blueprint of the original floor design. I was then hired to draw the façade in AutoCAD, to be used in a study on the preservation and historical register of this building.

Since then I have realized that Blender is very far from catching up with tools designed with architects in mind.
Today I ran into an add-on for Blender that may help to reduce this gap.

BLAM is a camera calibration tool for Blender that you can find here:
http://code.google.com/p/blam/

My original plan for tonight (to code support for green-magenta anaglyph glasses in the Blender Game Engine :)) clearly would have to wait. It’s time to test the tool!

I was following the steps of the video tutorial – BLAM Video Tutorial
If you want to try it yourself, this is the picture I used:
University of Seattle
It’s a picture from the University of Seattle. I traveled to Seattle last year and really enjoyed the university campus (and the Battlestar Galactica exhibition at the Space Needle alone made the trip worthwhile).

UV Editor

  1. adding axes is nice and intuitive, but it would be nice to tweak the curves for fine tuning while seeing the 3D view change (as a live ‘estimate camera focal length and orientation’ mode)
  2. an option to automatically add the image as the camera’s background image would be nice.