I’m just back from Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and, why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, one particular meeting was long overdue: I finally had the chance to meet Paul Bourke in person.

We collaborated (meaning: he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then, there hasn’t been a fisheye-related question that I don’t bounce off him first. So, in between talks, he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders in a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3D projectors at hand). Thankfully, on my flight back I had the peace of mind to wrap that up.
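Since the details are for a later post, here is only a rough sketch of the general idea behind omni-directional stereo as Paul Bourke describes it: instead of tracing all rays from one fixed eye, each ray origin is shifted sideways by half the interocular distance, perpendicular to the ray’s horizontal direction. The helper below is my own illustration, not the patched Cycles code:

#include <math.h>

typedef struct { float x, y, z; } float3;

/* Offset the origin of a panoramic camera ray for one eye
 * (my own illustration of the omni-directional stereo idea,
 * not the patched Cycles code). z is up, as in Blender. */
static float3 stereo_ray_origin(float3 dir, float interocular, int is_left)
{
	/* unit vector perpendicular to the ray, in the horizontal plane;
	 * rays pointing straight up (len == 0) need special handling */
	float len = sqrtf(dir.x * dir.x + dir.y * dir.y);
	float3 side = { -dir.y / len, dir.x / len, 0.f };

	/* left and right eyes sit on opposite sides of every ray */
	float s = (is_left ? -0.5f : 0.5f) * interocular;
	float3 origin = { side.x * s, side.y * s, 0.f };
	return origin;
}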

3D model: The White Room, courtesy of Jay-Artist, shared on blendswap.com

As a teaser, this is all you get for now. More on that later 😉

What if we could render fisheye images directly from Blender? Yesterday I found out about the Equirectangular mode in Cycles. It got me quite thrilled (I had been waiting for something like that for a while).

This is only possible because Cycles is a full ray-tracing render engine. Every pixel in the image is generated from a ray going from the camera to somewhere in the scene. Enough talking. A quick hack in the code and ta-da:

 

IBL background plate + IBL Toolkit (alignment addon) + Cycles ‘Use Panorama’ + fisheye patch

And the nice thing is, it previews in 3D just as well:

What comes next? I will talk with Brecht to see if there is any pending design for having this implemented as another camera option. I would like an option to set the angle (so we are not limited to 180-degree fisheyes), and a toggle between hemispherical and angular fisheye modes.

If you compile your own Blender and want to try the patch, get it here or:

Index: intern/cycles/kernel/kernel_montecarlo.h
===================================================================
--- intern/cycles/kernel/kernel_montecarlo.h    (revision 45899)
+++ intern/cycles/kernel/kernel_montecarlo.h    (working copy)
@@ -215,13 +215,32 @@
 
 __device float3 equirectangular_to_direction(float u, float v)
 {
+   /* remap uv from [0,1] to [-1,1], with (0,0) at the image center */
+   u = (u - 0.5f) * 2.f;
+   v = (v - 0.5f) * 2.f;
+
+   /* polar coordinates of the pixel in the image plane */
+   float r = sqrtf(u*u + v*v);
+   float theta = (r == 0.f)? 0.f: acosf(u/r);
+
+   if (v < 0.f) theta = -theta;
+
+   /* hemispherical fisheye: x points into the scene;
+    * r > 1.f (outside the image circle) is not handled here */
+   return make_float3(
+       sqrtf(1.f - r*r),
+       -cosf(theta)*r,
+       sinf(theta)*r
+   );
+
+/*
    float phi = M_PI_F*(1.0f - 2.0f*u);
    float theta = M_PI_F*(1.0f - v);
-
    return make_float3(
        sin(theta)*cos(phi),
        sin(theta)*sin(phi),
        cos(theta));
+*/
 }
 
 /* Mirror Ball <-> Cartesion direction */
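As an aside, here is a rough sketch, outside of Cycles, of how the same mapping could be extended to the angular fisheye mode and the adjustable angle mentioned above. The function name and structure are my own illustration of what such an option could look like, not part of the patch:

#include <math.h>

typedef struct { float x, y, z; } float3;

/* Hypothetical angular fisheye mapping with an arbitrary field of
 * view (fov, in radians). My own sketch of the option discussed
 * above, not part of the actual patch. Same convention as the
 * patch: x points into the scene. */
static float3 angular_fisheye_to_direction(float u, float v, float fov)
{
	/* remap uv from [0,1] to [-1,1] */
	u = (u - 0.5f) * 2.f;
	v = (v - 0.5f) * 2.f;

	float r = sqrtf(u*u + v*v);
	float theta = atan2f(v, u);

	/* angular fisheye: the distance from the image center maps
	 * linearly to the angle away from the view axis, so fov/2 is
	 * reached at the edge of the image circle (r == 1) */
	float phi = r * fov * 0.5f;

	float3 dir = {
		cosf(phi),
		-cosf(theta) * sinf(phi),
		sinf(theta) * sinf(phi),
	};
	return dir; /* pixels with r > 1 fall outside the image circle */
}

A hemispherical mode would instead keep sin(phi) = r, which is exactly what the patch above does.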

 


Dalai
* IBL from HDR Labs
* IBL Toolkit explained here.

There is a new addon landing. If you work with IBL in Blender for modeling, come by soon. In the meantime, enjoy the teaser (or poke me to provide some feedback and perhaps even join the alpha testing period).

IBL autosetup for Cycles and LuxRender

IBL Re-Alignment

Home made panoramas also work 😉

Point projection for calibration tweaking

Ideas, questions, comments, feel free to drop a line 😉

What do you do when you have two idle projectors by your computer? The answer is obviously a high-definition projection area to be filled with lo.v.e. (lots of valuable experiments).

Two short-throw projectors in one seamless desktop

I’ve been following the work of the Vision3D lab since 2009. This lab in Montreal specializes in computer vision (fundamental and applied research on the three-dimensional aspects of computer vision). Led by Sébastien Roy, they have been producing (and sharing!) work on the calibration of projection surfaces (e.g. domes o/), multiple-projector systems, and content toolsets.

lt-align manual calibration process

The Vision3D lab’s main tool in that area is Light Twist. The tool was presented at LGM 2009 with a live showcase of the system in a cylinder. In the last week I tried to get Light Twist going with a multi-projector system (aiming to use it for a dome later on), but so far I’m stuck on content playback (and I suspect the calibration stage is wrong). Anyway, Light Twist will be the topic of another post, once I get it up and running.

Plugin enabled – video in the middle of the screens, desktop working normally

Since 2009 the Light Twist project has shifted its focus from labs to end users. In 2011 they finally presented two new projects: lt-align and lt-compiz-plugin. lt-align is a tool to quickly calibrate the screen alignment, and it is very easy to use.


The Compiz plugin requires some fooling around with Ubuntu settings, but once things are in place it works like a charm. I have yet to make it work with Unity, so that I can have real fullscreen across the desktops.

Recording of the alignment process and video playback

Elephants Dream – Stitched Edition 😉

Note: there is an extra package you need in order to compile the lt-compiz-plugin: `sudo apt-get install compiz-plugins-main-dev`. I didn’t have to restart Compiz with ccp to make it work. Also, I changed the shortcuts that start the plugin, because Alt+F* were taken by other OS commands.

Time to make it real and project on a large wall

In this picture you can see Djalma Lucio, the sysadmin who oversees all the computer installations at Visgraf at IMPA. A great professional and a very funny guy to work with; think of someone who actually enjoys opening an xorg.conf file. On the right you can also see Aldo Zang. Check out his ARLuxRender project – a plugin system for LuxRender “which allows to render scenes with mixtures of real and virtual objects directly, without post-processing”.

I hope to post more in the coming months on domes, projections, a special video project … 😉 I went on a 3-month leave from my work at UBC to join the research lab at Visgraf/IMPA, under the coordination of Prof. Luiz Velho. It is only the second week, but it has already been a great experience. And above all, it’s nice to be back home (Rio de Janeiro, Brazil).

Happy Twisting,
Dalai

My love for photomatching goes way back.
In 2007 I did this project using the fantastic SketchUp Photo Match:

I used 20 photographs, a blueprint of a cross section, and a blueprint of the original floor design. I was then hired to produce the drawing of the façade in AutoCAD, to be used for a study on the preservation and historical registration of this building.

Since then I have realized that Blender is very far from catching up with tools designed with architects in mind.
Today I ran into a Blender add-on that may help reduce this gap.

BLAM is a Blender Calibration Tool that you can find here:
http://code.google.com/p/blam/

My original plan for tonight (to code support for green-magenta anaglyph glasses in the Blender Game Engine :)) clearly would have to wait. It’s time to test the tool!

I followed the steps of the video tutorial – BLAM Video Tutorial.
If you want to try it yourself, this is the picture I used:
University of Seattle
It’s a picture from the University of Seattle. I traveled to Seattle last year and really enjoyed the university campus (and the Battlestar Galactica exhibition at the Space Needle alone made the trip worthwhile).

UV EDITOR

  1. adding axes is nice and intuitive, but it would be nice to tweak the curves for fine-tuning while seeing the 3D change (as a live ‘estimate camera focal length and orientation’ mode) – see the sketch after this list
  2. an option to automatically add the image as a background image would be nice.
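For context on point 1: the heart of this kind of calibration is the classic result that the vanishing points of two orthogonal directions determine the focal length. The function below is my own sketch of the general technique, assuming the principal point at the image center; it is not BLAM’s actual code:

#include <math.h>

/* Estimate the focal length (in pixels) from the vanishing points
 * (u1,v1) and (u2,v2) of two orthogonal directions, with the
 * principal point at (cx,cy). My own sketch of the classic
 * single-image calibration result, not BLAM's actual code. */
static double focal_from_vanishing_points(double u1, double v1,
                                          double u2, double v2,
                                          double cx, double cy)
{
	/* express the vanishing points relative to the principal point */
	double x1 = u1 - cx, y1 = v1 - cy;
	double x2 = u2 - cx, y2 = v2 - cy;

	/* the back-projected rays (x1, y1, f) and (x2, y2, f) must be
	 * orthogonal: x1*x2 + y1*y2 + f*f = 0 */
	double d = -(x1 * x2 + y1 * y2);
	return d > 0.0 ? sqrt(d) : 0.0; /* d <= 0 means degenerate input */
}

With the focal length known, the camera orientation follows from the directions of the two back-projected rays, which is essentially what the ‘estimate camera focal length and orientation’ operator automates.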