Yummy! This was trending on Google Plus so I decided to give it a go. It’s very simple to do, it looks great and it tastes accordingly. I couldn’t find pre-cooked bacon so I had to pre-cook it myself before adding the eggs. It’s a bit tricky because I pre-cooked the strips in the muffin molds, so they started to lose their ‘roundness’. Use chopsticks to get them out of the molds 😉

Some time ago Paul Bourke sent me some images he captured with the Red Scarlet and a 4.5mm lens. The result is really impressive. He can get a recording in crystal clear 4K at 30fps. Below you can see one of his images:

Red Scarlet sample photo – credits Paul Bourke + synthetic elements by yours truly

Wait, what is Suzanne doing there?

Ok, that’s not really his original capture. I wanted to explore how it would be to insert virtual elements into a fisheye image. It shouldn’t be much different from integrating synthetic elements into a panorama (the topic of some previous blog entries and a paper waiting for approval 😉 – more on that later this year). And as it turned out, it’s ‘straightforward’-ish enough.

First take a look at the original image:

Red Scarlet sample photo – credits Paul Bourke

This is a cropped image, expanded vertically to fill the 180° FOV (field of view). This arrangement of camera + lens + 4K doesn’t give you a full frame fisheye nor a circular fisheye. As a curiosity, the Red Scarlet can capture a complete 180° fisheye circle if the photo is taken in 5K. However, you can’t get a 30fps movie capture at that resolution.

In order to use the IBL Toolkit for the scene reconstruction I first generated a full panorama (360×180) out of the original fisheye photo. I used the open source tool Hugin for that.

Be aware that Hugin has a bug in the calculation of the focal length multiplier for equisolid fisheye lenses (basically it uses the full frame fisheye calculation for all its fisheye modes). If you know someone involved in the Hugin/Panotools project, please send them this patch – as far as I can tell the fix is along these lines. I couldn’t manage to compile Hugin though, so I don’t feel like submitting an untested patch to their tracker.
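To make the difference concrete, here is a minimal Python sketch of the two radial mappings – assuming the ‘full frame’ mode boils down to the equidistant formula r = f·θ; the numbers are made up for illustration, this is not Hugin’s actual code:

import math

# two common fisheye mappings: image radius r as a function of the angle
# theta from the optical axis, for a focal length f
#   equidistant ('full frame' assumption): r = f * theta
#   equisolid:                             r = 2 * f * sin(theta / 2)

def focal_from_equidistant(r, theta):
    return r / theta

def focal_from_equisolid(r, theta):
    return r / (2.0 * math.sin(theta / 2.0))

# hypothetical lens: a 90 degree half-angle lands at r = 8.0 mm on the sensor
r_mm = 8.0
theta = math.radians(90.0)
print(focal_from_equidistant(r_mm, theta))  # ~5.09 mm
print(focal_from_equisolid(r_mm, theta))    # ~5.66 mm
# two different focal lengths for the same lens means two different
# focal length multipliers - hence the skewed estimate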

Back on topic … this is the image I got from Hugin (using 4.5 as the lens and 2.35 as the scale factor for equisolid – 2.35 was eyeballed because I couldn’t find the sensor size of the Red Scarlet’s 4K capture anywhere on the internet; and remember, once they fix the software this input will have to be different):

360×180 full panorama

 

Once I got the full panorama, the rest was a piece of cake. This scene is perfect for the IBL Toolkit (the square in the front plane is practically screaming “Calibrate with me!!11!!”).

Blender IBL Toolkit in Action

And a render from a different angle (where glitches are expected). I used the Project UV option of the IBL Toolkit to project the corresponding UVs from the panorama onto the subdivided meshes.

Extra ‘render’ – more of a behind-the-scenes shot, really

 

Final considerations:

  • I really wish Blender had a shadow-only shader to help integrate support meshes, synthetic elements and a background plate.
  • I’m pretty sure Blender Institute crew worked nicely around that for the Tears of Steel project. I’m still waiting for them to finish the movie and release the files though.
  • The lighting is indeed bad here because the original plate was an LDR, not an HDR, so I didn’t have the lighting of the scene (and didn’t want to bother recreating it – thus you see no shadows on the original scene support elements).
  • If I had the HDR I would use Luxrender (AR Luxrender actually) for the render 🙂
  • IBL Toolkit should be called Pano something instead, anyways 😉
  • I forgot to say that the final render was only possible due to the Fisheye Lens in Cycles, a patch I wrote on top of Brecht’s original full panorama code; it’s already in trunk and will be available in Blender 2.64 (a minimal setup sketch follows this list).
  • In fact I’m sure fisheye could be implemented as an input option for the IBL Toolkit (removing the need for Hugin). That would help output the content in exactly the same position as the original camera (if you put them side by side you can see they have a slightly different orientation).
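As promised above, here is a minimal Python sketch of how that camera can be set up from a script – assuming the property names of the 2.64 patch, with placeholder sensor values (adjust to your own rig):

import math
import bpy

# minimal sketch: a Cycles panoramic camera in equisolid fisheye mode
# (assumes the property names from the Blender 2.64 patch)
scene = bpy.context.scene
scene.render.engine = 'CYCLES'

cam = bpy.data.cameras['Camera']
cam.type = 'PANO'
cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
cam.cycles.fisheye_lens = 4.5                 # mm, the lens used on the Scarlet
cam.cycles.fisheye_fov = math.radians(180.0)  # how much of the sphere to cover
cam.sensor_width = 25.0                       # mm, placeholder - use your real sensor size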

I’m planning to present a complete framework for working with panoramas and virtual elements at the Blender Conference this year. Even though it is based on my work with Aldo Zang (using Luxrender, not Blender), I think it can help inspire possible solutions for Blender. So fingers crossed for the presentation to be accepted, and I hope we can make it interesting. The original paper (submitted to CLEI 2012) goes by the name:

Production framework for full panoramic scenes with photo-realistic augmented reality

 

So stay tuned (and enjoy the summer, Vancouver is finally sunny o/)
Dalai

(thanks to Paul Bourke for authorizing the re-use of his image; consider giving his website a visit, it’s one of those corners of the internet that will keep you busy for a long time)

Do you know Momo? He is a cute little monkey traumatized by being a secondary character in the Yo Frankie game project. Now what if you could carry Momo with you wherever you go? Your dream is close to coming true!

(wth are you writing about? — if I had an editor for my blog she would probably write that down)

Alex Ku is working in Google Summer of Code 2012 on porting the Blender Game Engine to the Android platform. His work is progressing smoothly and there are already some visible, shareable results. Today he announced the first Blenderplayer.apk release, so I couldn’t help but test it.

So it works, is that all? Well, not really. There are still bugs and unsupported features. But armature skinning, mouse clicks, GLSL shaders (partially) and physics all work.

This sample file is part of the examples that go with the book I’m finishing up with Mike Pan. We are already at the author review stage, so I should be able to talk more about it soon. Since I’m ‘giving the file away’ anyway, I may as well explain it 🙂 This file showcases the use of dynamic parenting and bone parenting in the BGE. It’s a good technique for character customization (as you can see with the hats).
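If you are curious what the dynamic parenting part looks like in code, here is a stripped-down BGE sketch (the object name is made up and the real file is a bit more involved):

import bge

def attach_hat(cont):
    # runs from a Python module controller on the hat object
    own = cont.owner
    scene = bge.logic.getCurrentScene()
    head = scene.objects['MomoHead']  # hypothetical object name

    # dynamic parenting: compound=False keeps separate physics shapes,
    # ghost=True disables collisions between the hat and its new parent
    own.setParent(head, False, True)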

Kudos to Alex Ku and all the other developers who are helping this project,
Dalai

From time to time I re-visit some projects I keep track of. This week I resumed working on a BGE (Blender Game Engine) project and decided to test whether the file would work online. How so? Burster is a web plugin for the BGE that allows you to embed (and even secure) your .blend files in a website.

Burster got some really good upgrades lately, and now it works as a plugin is expected to (it tells the user a new version is online, suggests an update, …). So what you see next is a screen capture of the Nereus Program website (the project I’m working on, aka my day job).

BGE embedded in a webpage, cool

 

Where you see the Baltic Visualization box, there is a BGE application running. Cool, right? Before someone asks if this is all realtime, let me explain. This is a fancy video player made in the BGE to play videos (also made with Blender, but not necessarily). It’s all about point-and-click, animations, synced videos, … Next you can see the same file running in the BGE with Physics debug on.

Physics Visualization on

In order to have this going I had to:

  • pack all the textures in the file
  • open all the external scripts (originally in //scripts/) in the Blender Text Editor
  • remove all the ‘from . import’ statements from my scripts (the modules were calling each other – see the small example after the snippet below)
  • fix all the python module controllers:
import bpy

# strip the package prefix from every Python module controller, so the
# modules resolve as top-level text datablocks inside the .blend
for obj in bpy.data.objects:
    for cont in obj.game.controllers:
        if cont.type == 'PYTHON' and cont.mode == 'MODULE':
            cont.module = cont.module.replace('script.', '')
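Step three can be automated the same way. A small sketch, assuming the scripts used plain ‘from . import foo’ style imports – once they live as text datablocks inside the .blend they are no longer a package, so the imports need to become top-level:

import bpy

# rewrite package-relative imports inside every text datablock
for text in bpy.data.texts:
    for line in text.lines:
        if line.body.lstrip().startswith('from . import '):
            line.body = line.body.replace('from . import ', 'import ')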

 

Note: not all modules/Python functions are supported. Read the Security page on the Burster plugin site.

And I did all my tests locally, mainly because I had to hardcode the paths of the videos on my hard drive. I believe it may be (or will at some point be) possible to load videos from the server; I’ve yet to find the right solution for this.
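For reference, the playback side is basically the bge.texture module at work. A stripped-down sketch, not the actual Nereus code – the image name and the video path are placeholders:

import bge
from bge import logic, texture

def start_video(cont):
    obj = cont.owner
    # find the material slot whose image we will override ('IM' + image name)
    mat_id = texture.materialID(obj, 'IMscreen.png')
    # keep a module-level reference, otherwise the texture is garbage collected
    logic.video = texture.Texture(obj, mat_id)
    logic.video.source = texture.VideoFFmpeg(logic.expandPath('//videos/baltic.mp4'))
    logic.video.source.play()

def update_video(cont):
    # run every frame (Always sensor with true level triggering)
    if hasattr(logic, 'video'):
        logic.video.refresh(True)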

For questions on Burster please refer to their website 😉

Cheers,
Dalai

A quick proof of concept test for the following technique:

  • environment re-mapped into the couch (sofa) – using IBL Toolkit
  • shape-key deformed and animated geometry

The final goal is to have the mapping handled internally by the render engine (Luxrender, although I used Cycles for this test). That way the light support geometry (e.g. the ceiling) can be transformed as well and the shadows should dance around.

There are two key elements here that help produce this effect.

Sofa modeled using the background plate as reference

First of all we need to model the sofa geometry. For that we need to project the background into the 3D view as it will look in the final render. This is accomplished with a GLSL filter running on top of the 3D view.

Sofa UV mapped to the background panorama

The second part is to project the UVs to match the original image. This would be really tricky if the object were at the edge of the image, but in this case it’s more doable. Both of those problems are handled by the IBL Toolkit.
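In a nutshell, the projection takes the direction from the camera (the panorama centre) to each vertex and converts it into equirectangular UV coordinates. A rough Python sketch of that mapping – just the math, not the IBL Toolkit code, and the axis convention is only one possible choice:

import math

def direction_to_equirect_uv(direction):
    # map a direction (mathutils.Vector) from the panorama centre to equirectangular UV
    d = direction.normalized()
    u = 0.5 + math.atan2(d.y, d.x) / (2.0 * math.pi)  # longitude -> U
    v = 0.5 + math.asin(d.z) / math.pi                # latitude  -> V
    return u, v

# e.g. for every vertex of the sofa (2.6x API, '*' for matrix multiplication):
# cam_pos = camera.matrix_world.translation
# u, v = direction_to_equirect_uv(obj.matrix_world * vert.co - cam_pos)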

(and yes, I know the animation could be better and I could map only one of the pillows. This is a quick tech test though ;))

Render with ARLuxrender (a branch of Luxrender). A small teaser from a paper I’m writing with Aldo Zang (the ARLux developer). Try to guess what is real and what is fake here 😉

Render of the week - Blender + luxrender (arlux)

 

Background Plate + Lighting - captured environment

 

Against all odds, the sofas are real (so far it seems most people think they are the 3D elements). The carpet is pure 3D though (well, based on a 2D image, actually). The spheres are also 3D, but we have a lot of the scene elements modeled as “support meshes”. That’s how the spheres get the right reflections and lighting.

Also, can you see what the carpet is covering?
Ah, and this image has no post-processing or compositing on it. It comes blended with the real elements straight from the render.

Cheers,
Dalai

* updated on 23.05.2012 *

I’m writing an addon to help with scene reconstruction and lighting with IBL files. It also works as a handy workflow to expand panoramas with rendered elements.

It’s still in its Beta version and I’m evaluating it in production only now, so things will likely change.

quick render test

However if you want to take a first glance at it you will need:

Some screens:

Blender – top right: Cycles rendering the background; top left: the addon adding the background with a GLSL shader

ARLuxrender at work

Note: my goal is to use arluxrender as the final renderer, but all the modelling and editing is to be done inside Blender.

The original teaser with old screenshots can be found here: http://www.dalaifelinto.com/?p=377

For further discussions you can visit the Blender Artists forum thread as well.

 

If you do some real testing with it, please let me know your thoughts.

Dalai

Project developed with Aldo Zang at the Visgraf lab

Not by domes alone does an artist live. The fisheye mode shown in the previous post is sufficient for planetarium productions, but may not satisfy artists looking for a ‘real’ fisheye lens. The most common lenses on the market today are ‘equisolid’ lenses (ref: HDRI for CGI).

After a productive holiday we now have something new to play with 😉

model by Jay-Artist

model by Jay-Artist

Models/scene kindly provided by Jay-Artist
http://www.blendswap.com/blends/author/jay-artist/

 

The proof is in the pudding

It’s “easy” to throw a nice artistic effect together and simply assume everything is working as it should. However, I wanted to make a system that could match the effect produced by a real lens. For that I had to build a foolproof experiment. People not into coding may not know it, but building a reliable testing and debugging setup is one of the keys to efficient coding. In my opinion it’s always worth putting time into this. Note: this is also why Blender users can help a lot with bug fixing, simply by building proper test files for the (also carefully/methodically) reported bugs.

1 – Photoshooting

Take some pictures with a tripod, rotating the camera around its centre (the focal centre, actually). We have been doing this for the past two weeks, so it went smoothly. Those pictures were taken by Aldo Zang at the Visgraf Lab at IMPA.

2 – Stitching

I never get tired of recommending Hugin for stitching and panorama making – hugin.sourceforge.net. This open source project sometimes works even better than Autopano Pro (a pretty good commercial alternative).

 

3 – Rendering

I put the panorama in as a background plate, calibrated the alignment, and added a simple floor + spheres. This was done with the (yet to be released) IBL Toolkit. Apart from that, my Blender camera needs to match the settings of the real camera + lens I’m aiming at.

In this case all the pictures for the stitching were taken with a Nikon D2Xs and a 10.5mm fisheye lens. I created new presets for the sensor dimensions and the render output.
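As a sanity check, the equisolid formula also tells you how much of the scene this combination should cover. A quick sketch – the sensor size here is an assumption for a DX-format body, double check yours:

import math

# equisolid mapping: r = 2 * f * sin(theta / 2)  =>  theta = 2 * asin(r / (2 * f))
def equisolid_half_angle(r_mm, focal_mm):
    return 2.0 * math.asin(min(1.0, r_mm / (2.0 * focal_mm)))

focal = 10.5                      # mm, the fisheye lens used for the shots
sensor_w, sensor_h = 23.7, 15.7   # mm, assumed DX sensor size

half_diag = 0.5 * math.hypot(sensor_w, sensor_h)
diag_fov = math.degrees(2.0 * equisolid_half_angle(half_diag, focal))
print(round(diag_fov, 1), 'degrees across the diagonal')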

4 – Results

I was quite pleased when I compared the rendered output with the original image. The only aspects to look at are the field of view and the line distortion across the lens:

 

Also note the bottom left corner of the photo. That subtle shadowing is due to the vignetting of the lens. It is not present in the Cycles render because I’m not implementing a real camera model (as shown here and here).

Algorithm

The complete patch can be found here. The core is the following function (simplified here). I derived it from the ‘classic’ fisheye equisolid formula: radius = 2 * focal_length * sin(angle / 2):

__device float3 fisheye_equisolid_to_direction(
    float u, float v, float lens, float width, float height)
{
    /* map (u,v) from [0,1] to millimetres, centred on the image */
    u = (u - 0.5f) * width;
    v = (v - 0.5f) * height;

    float r = sqrt(u*u + v*v);
    float phi = acos(u/r);
    /* invert the equisolid formula: r = 2 * lens * sin(theta/2) */
    float theta = 2.0f * asin(r/(2.0f * lens));

    if (v < 0) phi = -phi;

    return make_float3(
        cos(theta),
        -cos(phi)*sin(theta),
        sin(phi)*sin(theta));
}

I hope you like it. If you can share any work you did with this feature I would love to see it.
Dalai

What if we could render fisheye images directly from Blender? Yesterday I found out about the Equirectangular mode in Cycles. It got me quite thrilled (I had been waiting for something like that for a while).

This is only possible because Cycles is a full ray tracing render engine. Every pixel in the image is generated from a ray going from the camera to somewhere in the scene. Enough talking. A quick hack in the code and ta-da:

 

IBL background plate + ibl toolkit (alignment addon) + cycles 'use panorama' + fisheye patch

And the nice thing is, it previews in 3D just as well:

What comes next? I will talk with Brecht to see if there is any pending design for implementing this as another camera option. I would like an option to set the angle (so we don’t have to do only 180-degree fisheyes), and to toggle between hemispherical and angular fisheye modes.

If you compile your own Blender and want to try the patch, get it here or:

Index: intern/cycles/kernel/kernel_montecarlo.h
===================================================================
--- intern/cycles/kernel/kernel_montecarlo.h    (revision 45899)
+++ intern/cycles/kernel/kernel_montecarlo.h    (working copy)
@@ -215,13 +215,29 @@
 
 __device float3 equirectangular_to_direction(float u, float v)
 {
+   u = (u - 0.5f) * 2.f;
+   v = (v - 0.5f) * 2.f;
+
+   float r = sqrt(u*u + v*v);
+   float theta = acosf(u/r);
+
+   if (v < 0.f) theta = -theta;
+
+   return make_float3(
+       sqrtf((1.f - r*r)),
+       -cosf(theta)*r,
+       sinf(theta)*r
+   );
+
+
+/*
    float phi = M_PI_F*(1.0f - 2.0f*u);
    float theta = M_PI_F*(1.0f - v);
-
    return make_float3(
        sin(theta)*cos(phi),
        sin(theta)*sin(phi),
        cos(theta));
+*/
 }
 
 /* Mirror Ball <-> Cartesion direction */

 


Dalai
* IBL from HDR Labs
* IBL Toolkit explained here.

There is a new addon landing. If you work with IBL in Blender for modeling, come by soon. In the meantime enjoy the teaser (or poke me to give some feedback and perhaps even join the alpha testing period).

IBL autosetup for Cycles and Luxrender

IBL Re-Alignment

Home made panoramas also work 😉

Point projection for calibration tweaking

Ideas, questions, comments, feel free to drop a line 😉