A quick proof-of-concept test of the following technique:

  • environment re-mapped into the couch (sofa) – using IBL Toolkit
  • shape-key deformed and animated geometry

The final goal is to have the mapping handled internally by the render engine (Luxrender, although I used Cycles for this test). That way the light support geometry (e.g. the ceiling) can be transformed as well and the shadows should dance around.

There are two key elements here that help produce this effect.

Sofa modeled using the background plate as reference

First of all we need to model the sofa geometry. For that we need to project the background into the 3D View just as it will look in the final render. This is accomplished with a GLSL filter running on top of the 3D View.
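
Conceptually, what the filter does for each pixel is turn the view direction into equirectangular (lat-long) coordinates and sample the panorama there. Here is a minimal Python sketch of that direction-to-UV step (my own illustration, assuming an equirectangular panorama, a Z-up world and the seam behind the camera at -X; it is not the IBL Toolkit code):

# Sketch only: maps a normalized view direction to equirectangular (lat-long)
# UV coordinates, the same lookup a GLSL background filter would do per pixel.
# Conventions (Z-up, seam at -X) are assumptions; the addon may differ.
from math import atan2, asin, pi

def dir_to_equirect_uv(x, y, z):
    """Normalized direction (x, y, z) -> (u, v) in [0, 1]."""
    u = 0.5 + atan2(y, x) / (2.0 * pi)   # longitude
    v = 0.5 + asin(z) / pi               # latitude
    return u, v

# Example: a direction looking straight along +X lands in the middle of the image.
print(dir_to_equirect_uv(1.0, 0.0, 0.0))  # -> (0.5, 0.5)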

Sofa UV mapped to the background panorama

The second part is to project the UVs to match the original image. This would be really tricky if the object were at the edge of the image, but in this case it is more manageable. Both of these steps are handled by IBL Toolkit.
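
A rough sketch of this projection (again my own illustration, not the addon's code): each vertex is turned into a direction from the panorama centre, assumed here to be the world origin, and that direction is mapped to the same equirectangular UV space as above. The vertex list is just a placeholder:

# Sketch: project mesh vertices onto an equirectangular panorama to build UVs,
# using the same lat-long mapping as the viewport sketch above.
# Assumes the panorama centre sits at the world origin; the vertex list is a
# placeholder, not the actual sofa geometry.
from math import atan2, asin, sqrt, pi

def vertex_to_panorama_uv(co):
    x, y, z = co
    r = sqrt(x*x + y*y + z*z)
    if r == 0.0:
        return 0.5, 0.5                       # degenerate: vertex at the centre
    u = 0.5 + atan2(y / r, x / r) / (2 * pi)  # longitude
    v = 0.5 + asin(z / r) / pi                # latitude
    return u, v

# In Blender the result would go into the mesh's UV layer; here we just print it.
for co in [(2.0, 1.0, 0.2), (2.1, 1.0, 0.2), (2.0, 1.2, 0.2)]:
    print(vertex_to_panorama_uv(co))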

(and yes, I know the animation could be better and I could map only one of the pillows. This is a quick tech test though ;))

Rendered with ARLuxrender (a branch of Luxrender). A small teaser from a paper I’m writing with Aldo Zang (the ARLux developer). Try to guess what is real and what is fake here 😉

Render of the week - Blender + luxrender (arlux)

 

Background Plate + Lighting - captured environment

 

Against all odds the sofas are real (so far it seems that most people tend to think they are the 3D elements). The carpet is pure 3D though (well, based on a 2D image actually). The spheres are also 3D, but we have a lot of the scene elements modeled as “support meshes”. That’s how the spheres get the right reflections and lighting.

Also, can you see what the carpet is covering?
Ah, and this image has no post-processing or compositing on it. It comes blended with the real elements straight from the render.

Cheers,
Dalai

* updated on 23.05.2012 *

I’m writing an addon to help with scene reconstruction and lighting with IBL files. It also works as a handy workflow to expand panoramas with rendered elements.

It’s still in beta and I’m only now evaluating it in production, so things will likely change.

quick render test

However, if you want to take a first glance at it you will need:

Some screens:

Blender: top right, Cycles rendering the background; top left, the addon adding the background with a GLSL shader

ARLuxrender at work

Note: my goal is to use ARLuxrender as the final renderer, but all the modelling and editing is to be done inside Blender.

The original teaser with old screenshots can be found here: http://www.dalaifelinto.com/?p=377

For further discussions you can visit the Blender Artists forum thread as well.

 

If you do some real testing with it, please let me know your thoughts.

Dalai

Project developed with Aldo Zang at the Visgraf lab

An artist cannot live on domes alone. The fisheye mode shown in the previous post is sufficient for planetarium productions, but may not satisfy artists looking for a ‘real’ fisheye lens. The most common lenses found on today’s market are ‘equisolid’ lenses (ref: HDRI for CGI).

After a productive holiday we now have something new to play with 😉

model by Jay-Artist

model by Jay-Artist

Models/scene kindly provided by Jay-Artist
http://www.blendswap.com/blends/author/jay-artist/

 

The proof is in the pudding

It’s “easy” to put a nice artistic effect together and simply assume everything is working as it should. However, I wanted to make a system that could match the effect produced by real lenses. For that I had to build a fail-proof experiment. People not into coding may not know it, but building a reliable testing and debugging setup is one of the keys to efficient coding. In my opinion it’s always worth putting time into this. (Note: this is also why Blender users can help a lot with bug fixing, simply by building proper test files for their carefully and methodically reported bugs.)

1 – Photoshooting

Take some pictures with a tripod, rotating the camera around its centre (the focal centre, actually). We have been doing this for the past two weeks, so it went smoothly. The pictures were taken by Aldo Zang in the Visgraf lab at IMPA.

2 – Stitching

I never get tired of recommending Hugin for stitching and panorama making – hugin.sourceforge.net. This open source project sometimes works even better than Autopano Pro (a pretty good commercial alternative).

 

3 – Rendering

I set the panorama as a background plate, calibrated the alignment, and added a simple floor + spheres. This was done with the (yet to be released) IBL Toolkit. Apart from that, my Blender camera needs to match the settings of the real camera + lens I’m aiming at.

In this case all the pictures for the stitching were taken with a Nikon DX2S and a 10.5mm fisheye lens. I created new presets for the sensor dimensions and the render output.
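
For reference, here is roughly how such presets could be set up from Python. This is a sketch assuming the 2012-era Cycles API, where the panorama settings live under camera.data.cycles; the sensor dimensions and resolution are approximate values for a DX body, not taken from the addon:

# Sketch: configure the scene camera as a Cycles equisolid fisheye matching a
# 10.5mm fisheye on a DX-sized sensor. Assumes Cycles is the active engine and
# the 2012-era property layout (camera.data.cycles.*); the numbers are
# approximations from the lens/sensor specs, not values from the addon.
import bpy

scene = bpy.context.scene
cam = scene.camera.data

cam.type = 'PANO'                               # panoramic camera
cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
cam.cycles.fisheye_lens = 10.5                  # mm, the real lens focal length
cam.cycles.fisheye_fov = 3.14159                # ~180 degrees of coverage
cam.sensor_width = 23.7                         # mm, approximate DX sensor
cam.sensor_height = 15.7

# Render output: placeholder resolution, use the photo's actual pixel dimensions.
scene.render.resolution_x = 4288
scene.render.resolution_y = 2848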

4 – Results

I was quite pleased when I compared the rendered output with the original image. The only aspects we should be looking at are the field of view and the line distortion across the lens:

 

Also note the bottom left corner of the photo. This subtle shadowing is due to the vignetting of the lens. It is not present in the Cycles render because I’m not implementing a real camera model (as shown here and here).

Algorithm

The complete patch can be found here. The core is the following function (simplified here). I elaborated it from the ‘classic’ fisheye equisolid formula: radius = 2 * focal_length * sin(angle / 2):

__device float3 fisheye_equisolid_to_direction(
float u, float v, float lens, float width, float height)
{
    /* center the coordinates on the sensor and scale to millimeters */
    u = (u - 0.5) * width;
    v = (v - 0.5) * height;

    /* radial distance from the center of the image */
    float r = sqrt(u*u + v*v);

    /* guard against a division by zero at the exact center */
    float phi = (r != 0.0) ? acos(u/r) : 0.0;

    /* invert the equisolid formula: r = 2 * lens * sin(theta / 2) */
    float theta = 2 * asin(r/(2 * lens));

    if (v < 0) phi = -phi;

    /* spherical to cartesian, camera looking down +X */
    return make_float3(
        cos(theta),
        -cos(phi)*sin(theta),
        sin(phi)*sin(theta)
    );
}

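As a quick sanity check (my own snippet, not part of the patch), the formula and its inverse used above can be verified numerically in Python:

# Sanity check for the equisolid mapping: going from pixel radius to angle and
# back should reproduce the original radius. The values are arbitrary.
from math import sin, asin

lens = 10.5          # focal length in mm
r = 7.0              # radial distance on the sensor, in mm

theta = 2.0 * asin(r / (2.0 * lens))    # angle from the optical axis
r_back = 2.0 * lens * sin(theta / 2.0)  # classic equisolid formula

print(theta, r_back)  # r_back == r up to floating point error
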
I hope you like it. If you can share any work you did with this feature I would love to see it.
Dalai