Not only beautiful, but one of the few places in the world where you can bike around, enjoy the view and do some coding without any safety worries. I will certainly miss that.

Now, what interesting timing. In Brazil, protests are taking place for reasons I not only vouch for, but would love to take part in. If your local news is not covering them, do check the links below for more information.

As a curious contrast, the last riots in Vancouver were thanks to … the Stanley Cup (a hockey competition). Were they protesting the abusive spending on stadium renovations, the privatization of public space, the lack of any return for the population? No my friends, the “protests” were due to the poor performance of the Canucks in the cup. What?!? Yup, go figure.

Meanwhile in Brazil:

For a “tinkering” developer there is no satisfaction like pushing your code into production and having it work out of the box. I’ve been coding the 3-D stereo support for the multiview branch with no stereoscopic display, so today was the first time I could see it in action… and it works 😉

I tested top-bottom, side-by-side and interlaced (windowed and fullscreen). For the interlaced windowed mode, the “swap left-right images” option is particularly important.
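Why that option matters: a row-interlaced display assigns each eye by row parity, so when a window moves vertically by an odd number of pixels the eyes end up swapped. A toy sketch of the idea in plain Python (an illustration only, not Blender’s actual drawing code):

```python
def interlace(left, right, swap=False):
    """Row-interlace two images (lists of rows): even rows from one eye,
    odd rows from the other. `swap` flips which eye gets the even rows,
    which is what the "swap left-right images" toggle compensates for."""
    a, b = (right, left) if swap else (left, right)
    return [a[i] if i % 2 == 0 else b[i] for i in range(len(a))]

rows_l = [["L"] * 4 for _ in range(4)]
rows_r = [["R"] * 4 for _ in range(4)]
print([row[0] for row in interlace(rows_l, rows_r)])        # ['L', 'R', 'L', 'R']
print([row[0] for row in interlace(rows_l, rows_r, True)])  # ['R', 'L', 'R', 'L']
```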

Caminandes - 3-D still courtesy of


The one thing I didn’t test is the pageflip functionality, since I came to the realization that my laptop doesn’t support 120Hz displays. I heard it’s working though, so I’m at ease.

I would like to express my gratitude to Dr. Maria Lantin, director of the Stereographic 3D Centre at Emily Carr University of Art + Design, for so kindly opening the doors and letting me play with their “toys”. The same goes for her lab research team, in particular Alan Goldman, Denise Quesnel and Sean Arden, for taking the time to show their latest (extremely cool) projects and to help set me up with the 3-D display and projector. And last but not least, thanks to my friend Dr. Barry Po for connecting me with them (thanks Barry!).

For my own records: with the BenQ projector they are using a 3D-XL 3D Projector Adapter from Optoma to convert side-by-side or top-bottom inputs to time-sequential format.

And if you like the Llama, make sure to check the short movie Caminandes.

Related Links:

I got a bit tired of the back end coding for the Multi-View branch and decided to tackle the frontend for a change.


Anaglyph Mode for 3-D (Multi-View) – Los Padres OpenEXR [link]

Stereo Display Options


Blender has a very modern drawing system (nicknamed Triple Buffer) which takes control over the buffer-swapping routines (instead of relying on the graphics card’s Front/Back Buffer handling). That allows Blender to redraw the UI really efficiently. It also made the front-end implementation a breeze.

Now in User Preferences you can set the 3-D display you will be using with Blender. At some point I may make it a per-window option, but for now it will affect all the opened windows.

The next thing you need is a 3-D (Multi-View) image. You can simply render your own images (make sure the Render Views are named “left” and “right”) or download a few OpenEXR samples.

3-D View


With no 3-D display set, when you open a 3-D image you should see the views in the Image Editor drop-down. When a 3-D display is set, however, you will see a new “3-D” option. Once this is set, you can take full advantage of your 3-D gear.



The following are samples from the other current display options. Be aware that the image I’m using doesn’t converge into a nice stereo 3-D photo. It was in fact intentionally produced to show very different images, to make sure the code is working (programmers, go figure).

And not that you asked, but this was a great weekend for my 3-D philia. Iron Man 3 was a nice movie, and yesterday I attended two seminars at SID Display Week 2013, which turned out to be quite inspiring talks, with the addition of seeing some jaw-dropping 3-D displays. I’m actually going there again tomorrow for the exhibit booths, to see if I can clarify some pending questions that I have. I guess I should thank Queen Victoria for the long weekend 😉

Related Links:

I’ve been writing too much about coding, programming and other cool boring topics 😉

For a change I decided to share my current “desktop image”. This was part of the visualization I built for the AAAS 2013, for the Nereus Program sessions.


The curves, spheres and the conformance of the visuals to an invisible translated sinusoid were all done with a Python addon I wrote for Blender. I will write more about this project one day.

The slightly yellow background (as opposed to pure white) is thanks to the great book “Visual Strategies: A Practical Guide to Graphics for Scientists and Engineers” by Felice C. Frankel and Angela H. DePace.

This video showcases the current snapshot of the multiview branch I’ve been working on.

Source Code:

Original Proposal:

For follow-ups on the development I will keep posting to the bf-committers mailing list. But I will try to keep my blog up to date as well.

If you like my desktop background, this is the cover of my upcoming “Game Development with Blender” book with Mike Pan. The book is in its final revision stage (we are checking the final PDFs, about to be printed) and should ship soon. The pre-sale campaign on Amazon is still ongoing.

Have a good day!


Related Links:

What if you could work on a stereoscopic 3-D animation and preview the work in the viewport? Hopefully that will soon be possible in Blender (in my version of Blender at least :p ).


Click to see the complete interface – model courtesy of

After finishing the dome-stereo support in Cycles (I got it all working now! 😉), I decided to investigate what it would take to support stereo in perspective (non-fisheye) mode. The full disclosure will come once things are in better shape, but I gotta say, it’s fun coding.

One thing I’m trying to figure out is what should be built into the Blender C code, and what should be implemented at the addon level. People have been doing stereo renders one way or another, so I’m confident that as long as we provide the bare bones of stereoscopic support, they will be happy. In the links below there is actually a really nice addon for Blender 2.6.


Cycles Stereo 3D Render – Mirrored Display – BMW model courtesy of

I have mixed feelings about stereo 3-D movies. But at the last Siggraph Asia I attended the most spectacular workshop, “Constructing a Stereo Pipeline from Scratch”, by Disney Stereographer Vladimir Sierra. It changed the way I see 3-D movies, and reinforced for me the importance of attending those conferences whenever possible. Even nicer when you go to present a project 🙂

That said, I have never worked on a stereo 3-D production and I don’t want to limit the possibilities here by my own experience. To help with that I’m counting on 3-D artist Francesco Siddi to help design a nice workflow proposal.

Coming up next: a crowd-funding to get me a real 3D Display :p
(kidding, though I wouldn’t mind)

My own reference links:

  • NVidia presentation on Siggraph 2011 on Stereoscopy (pdf)
  • 3D Movie Making by Bernard Mendiburu (book)
  • Cinema Stereoscopico by Francesco Siddi (book – Italian)
  • Blender 2.6 Stereoscopic Rendering Addon by Sebastian Schneider (link)
  • 3D Fulldome – Teaser (link)

Disclaimer: This tutorial is tailored to the cluster system I have access to. The output from the commands is unique to each system, and the same goes for the cluster commands. Also, since OpenCL is still not fully supported in Blender, this guide focuses on CUDA rendering only. Anyway, use it wisely, and refer to your own cluster documentation for its equivalents of qsub, interactive testing, nvidia-smi, … Feel free to drop a line if this helps you or if you have any complementary information.


Blender has no command-line way to set the render device Cycles should use. That means you may not be able to take full advantage of your GPU renderfarm out of the box.

This is not a problem if you use the same computer to save the file and to render it. However, if you need to render your file on a remote computer (in this case a renderfarm), you may not have access to the same hardware available on the target computer. So you can’t rely on the ‘render device’ settings that are saved with your .blend.

Luckily we can use Python to set the proper rendering settings before producing your image.

Getting the list of GPUs

The ssh terminal I use for login doesn’t have access to any GPU, so it doesn’t work for testing. Taking advantage of the qsub system available in the cluster, the first step is to ask for an interactive session to get a list of the available GPUs. In an ssh session on your cluster, do:

$ qsub -q interactive -I -V -l walltime=01:00:00,nodes=1:gpus=3
qsub: waiting for job 125229.parallel-admin to start
qsub: job 125229.parallel-admin ready

At this point the shell terminal is back and we can ask Blender what devices it can see.
First copy the code below and paste it into a new file:

import bpy, _cycles
print(_cycles.available_devices())
bpy.context.user_preferences.system.compute_device = 'BLABLABLA'

To use this script in Blender, run:
$ blender -b -P

Here this will return the following list: (‘Intel Xeon CPU’, ‘Tesla M2070’, ‘Tesla M2070’, ‘Tesla M2070’, ‘Tesla M2070 (3x)’)

And also the following intentional error:
TypeError: bpy_struct: item.attr = val: enum “BLABLABLA” not found in (‘CUDA_0’, ‘CUDA_1’, ‘CUDA_2’, ‘CUDA_MULTI_2’)

This “error” is just to let us know the internal name of the CUDA device I want to use. In my case I want to render with the ‘Tesla M2070 (3x)’, so I should set system.compute_device to ‘CUDA_MULTI_2’ (the error list and the output list have the same order, so it’s a one-to-one correlation).
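That index correlation can be sketched in plain Python (a hypothetical helper using the sample values from above; the CPU entry is left out since it has no CUDA identifier):

```python
def device_enum(names, enums, wanted):
    """Return the enum identifier for the device called `wanted`.
    Both tuples share the same order, so the match is by index."""
    return enums[names.index(wanted)]

# GPU names as printed by _cycles.available_devices() (CPU entry dropped):
gpu_names = ('Tesla M2070', 'Tesla M2070', 'Tesla M2070', 'Tesla M2070 (3x)')
# Enum identifiers as listed in the TypeError message:
cuda_enums = ('CUDA_0', 'CUDA_1', 'CUDA_2', 'CUDA_MULTI_2')

print(device_enum(gpu_names, cuda_enums, 'Tesla M2070 (3x)'))  # CUDA_MULTI_2
```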


Now to create the actual setup script. Paste the following code into a new file (replace CUDA_MULTI_2 with the device you want to use on your system):

import bpy
bpy.context.user_preferences.system.compute_device_type = 'CUDA'
bpy.context.user_preferences.system.compute_device = 'CUDA_MULTI_2'

And to use this script in Blender and test render your file:

$ blender -b Nave.blend  -o //nave_###.png -P ~/ -f 1

Remember, the order of the arguments matters. Any argument added after the -f will be parsed only after the render is over. For the complete list of available arguments, visit the Blender Wiki.

For the final render you need to use the above command as part of a qsub job dispatching file (the -P part stays the same). Since Cycles doesn’t recognize all the cluster nodes as rendering devices, you need to split your job into batches, so that an instance of Blender runs on every node. This is outside the scope of this guide though.
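The batch splitting itself can be sketched in plain Python; each resulting range would become one qsub job rendering with blender -b file.blend -s START -e END -a (a hypothetical helper, adjust to your own scheduler):

```python
def split_frames(start, end, nodes):
    """Split an inclusive frame range into per-node (start, end) batches,
    spreading any remainder over the first batches."""
    total = end - start + 1
    size, extra = divmod(total, nodes)
    batches, cur = [], start
    for i in range(nodes):
        n = size + (1 if i < extra else 0)
        batches.append((cur, cur + n - 1))
        cur += n
    return batches

# 250 frames over 4 nodes:
print(split_frames(1, 250, 4))  # [(1, 63), (64, 126), (127, 188), (189, 250)]
```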

Just to be Sure

In case you think the GPUs may not be in use, you can do the following. First, ssh into the interactive node you were using for the test. Then use the NVidia SMI program:

$ nvidia-smi


As you can see, there is no GPU power being spared here. Sorry, but you can’t mine bitcoins here 😉

Wrapping Up

This tutorial is intended for my own records, but also to help someone stuck in the same situation – though you will likely have to adjust something for your own setup. I tested this with Blender 2.65a running on CentOS, on a cluster based on the HP ProLiant SL390 server architecture.

Happy 2013!


Hello all,
I’m pleased to announce that the latest version of Blender is out. This is the collaborative effort of a team of developers, of which I’m proudly a part. The 2.65 release is particularly relevant for the dome community due to its complete support for equidistant fisheye lens rendering.

Equidistant fisheye 180°, used for fulldomes. Image by Adriano Oliveira


That includes a series of fixes since the last release, the most noticeable being the Equidistant Fisheye Lens fix, as I mentioned in an earlier post. This release not only benefits fulldome artists, but also anyone willing to experiment with the Equisolid Fisheye lens. The image below is what you see from within the working viewport.

And how simple is it to use? For those familiar with the builtin Cycles render engine, this is as easy as it gets. For the Equidistant fisheye lens, all you need to do is set the render dimensions to a square (e.g., 1024 x 1024) and set the field of view angle (usually 180°). For the Equisolid fisheye lens, all you need to do is set the lens size and one of the sensor dimensions. The other sensor dimension is taken from the render aspect ratio.
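Under the hood these lens models are just mapping functions from the angle θ off the optical axis to a radius on the image plane. A plain-Python sketch of the standard formulas (for illustration only, not the actual Cycles code):

```python
import math

def equidistant_r(f, theta):
    """Equidistant fisheye: the image radius grows linearly with the
    angle off the optical axis, r = f * theta."""
    return f * theta

def equisolid_r(f, theta):
    """Equisolid fisheye: r = 2 * f * sin(theta / 2), the model used by
    most real fisheye photo lenses."""
    return 2.0 * f * math.sin(theta / 2.0)

# With a 10.5 mm lens, a ray 90 degrees off-axis lands at:
print(round(equisolid_r(10.5, math.pi / 2), 2))  # 14.85
```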

Equisolid fisheye, 3d viewport preview. Image by Pataz Studio -


For the complete release information, please visit the official Blender 2.65 Release Log.

For the Fisheye Lens specific info, check:

Blender is under 60MB to download, free, and more than capable of handling small to medium size productions. Go get it! I hope it can help more fulldome productions in the future.

Enjoy it,

Err, not really: backup often, kids ;). To add to my recent tragic “Windows 8 destroyed my OSX partitions” story, I had actually ordered an external hard drive, but it arrived one day after my HD got messed up.

Now time for some good news: I just fully recovered my HD. I would like to thank everyone for the help I got over here, on G+, and by email. Really, thanks big time.

So here is what I did (in case you are here googling about the same problem). I’m assuming that if you got this far on the internet looking for help, you are already as doomed as I was. So I won’t even bother warning and re-warning you that you are better off seeking professional help.

1) 2012 is the year of the penguin. Well, not quite, but Linux still rocks. I had no Mac-to-Mac cable at hand, so I had to resort to a rescue live CD. The party wouldn’t be complete without an OS threesome after all.

So I first tried Ubuntu Rescue Remix, but it didn’t come with the application I was advised to try (namely gdisk), so I used it only for diagnostics and the dump backup. In the end I downloaded GParted Live, which comes with a nice X11, btw (though you will inevitably use the console, so don’t get too pumped up ;))

2) dd if=/dev/sda of=/media/usb7/sda_backup
First things first: jokes aside, it’s never too late for a backup. I started with a byte-for-byte backup of my 500GB drive. It takes time, as in, really. 30 hours later I knew there was nothing I could do to make things worse. If anything, I could reverse the dd if/of order and restore my HD to the broken state I was getting familiar with. And there is nothing like Mario U to endure 30 more hours if it comes to that 😉

*) As per everyone’s suggestions I tried gdisk, but my system was too corrupted. It had no internal backup (what you get from using ‘c’ in gdisk), and the error message I got (something about overlapping partition sectors) was stopping me from making any change to the partition table and writing it back. I thought it could be a conflict between the MBR Windows tried to write and the GPT created by Mac, so on to a more radical next step:

3) dd if=/dev/zero of=/dev/sda bs=512 count=1
WARNING: This will wipe the first 512 bytes of your disk, namely the boot sector. Note, this may not have been necessary, but I was already on the verge of resetting my system. I went back to gdisk to see if that worked, and in fact the MBR was gone. I quit gdisk because it wasn’t its turn anymore. Welcome the star of the night:

4) It’s TestDisk magic time 🙂 I don’t know what to say. A few [enter]s and I had my HD back up and running. This console-based program (which comes with GParted Live and pretty much any Linux rescue disc as far as I know) is great. It printed the partition types correctly, listed all the files in the Windows partition, and even allowed me to copy them out if needed. Since my Windows partition had nothing special I turned to the OSX partition and, although file listing isn’t supported there, just asking it to fix the partition solved the problem. I’m not kidding, it took me less than 5 minutes.

Final considerations: I just wish the personnel at the AppleCare phone line could have walked me through that. They pretty much left me hopeless, to the point that I didn’t even bother dropping my computer at an Apple Store. If I had the time I would restore my broken backup (from [2]) to the laptop to see if the Apple Store crew could have fixed it :p

And that’s all. I hope this turns out useful to someone. And again, thanks everyone for the help. Very much appreciated. And yes, this post has tons of smileys. That’s what I looked like through the entire day 😀

I’m just back from the Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, a particular meeting was long overdue. I finally had a chance to meet Paul Bourke personally.

We collaborated (meaning he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then, there is not a fisheye-related question that I don’t bounce off him first. So, in between talks, he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders in a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3-D projectors at hand). Thankfully, on my flight back I had the peace of mind to wrap it up.
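I’ll leave the patch details for later, but the core idea of omni-directional stereo for domes (the approach Paul Bourke describes) is to offset each eye’s ray origin along the tangent of a small viewing circle, based on the ray’s azimuth. A plain-Python sketch, with sign conventions chosen purely for illustration:

```python
import math

def eye_origin(azimuth, interocular, eye):
    """Offset one eye's ray origin on the viewing circle.
    eye: -1 for the left eye, +1 for the right eye.
    The offset is half the interocular distance, along the tangent of the
    circle at this azimuth (a sketch of the idea, not the Cycles patch)."""
    half = 0.5 * eye * interocular
    return (-half * math.sin(azimuth), half * math.cos(azimuth))

# Looking straight ahead (azimuth 0), the eyes separate along +/- y:
right = eye_origin(0.0, 0.065, +1)
left = eye_origin(0.0, 0.065, -1)
```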

3-D Model “The White Room” courtesy of Jay-Artist, shared on

As a teaser, this is all you get for now. More on that later 😉