Are you stuck on Microsoft Visual Studio with Windows 10 Anniversary Edition, missing Qt Creator features such as rename refactoring?


Just like in Qt Creator, I need it, and I want it now!

I found an interesting free extension for Visual Studio named “Visual C++ Refactoring”. You can get it here.


Easy, right? However, if you get an error message because your .NET Framework is a different version, hear me out:


Dear visitor, welcome!

This week I visited the Blender Institute and decided to wrap up the multiview project. But since I had an Oculus DK2 with me I decided to patch multiview to support Virtual Reality gadgets. Cool, right?

Oculus DK2 and Gooseberry

Gooseberry Benchmark viewed with an Oculus DK2

There is something tricky about them. You can’t just render a pair of panoramas and expect them to work. The image would work great for the virtual objects in front of you, but the stereo eyes would be swapped when you look behind you.

How to solve that? Do you remember the 3D Fulldome Teaser? Well, the technique is exactly the same one. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there the software (Cycles) rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance.
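Schematically, the idea looks like this. This is my own 2D, top-down sketch of the per-pixel camera pair, not the actual Cycles code; the function and parameter names are made up for illustration:

```python
import math

def stereo_rays(longitude, interocular=0.065, convergence=1.8):
    """Per-pixel 'virtual' stereo pair, seen from above (2D top view).

    Each eye is offset half the interocular distance sideways from the
    panorama center, then toed in so both rays meet at the convergence
    distance along the viewing direction of this pixel.
    """
    half = interocular / 2.0
    view = (math.sin(longitude), math.cos(longitude))   # viewing direction
    right = (view[1], -view[0])                         # sideways vector
    toe_in = math.atan2(half, convergence)              # toe-in angle per eye
    rays = {}
    for name, sign in (("left", -1.0), ("right", 1.0)):
        origin = (sign * half * right[0], sign * half * right[1])
        angle = longitude - sign * toe_in               # rotate toward convergence
        rays[name] = (origin, (math.sin(angle), math.cos(angle)))
    return rays
```

Because the pair is rebuilt for every pixel, the eyes never swap, no matter which longitude you look at.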


Oculus barrel correction screen shader applied to a view inside the panorama

This may sound complicated, but it’s all done under the hood. If you want to read more about this technique, I recommend this paper by Paul Bourke on synthetic stereoscopic panoramic images. The paper is from 2006, so there is nothing new under the Sun.

If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.

Gooseberry Benchmark Panorama

Top-Bottom Spherical Stereo Equirectangular Panorama – click to save the original image

This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method. But in the meanwhile that name is growing on me.

I would also like to remark on the relevance of open projects such as Gooseberry. The always warm-welcoming Gooseberry team just released their benchmark file, which I ended up using for those tests. Being able to get a production-quality shot and run whatever multi-vr-pano-full-thing you may think of is priceless.


If you want to try to render your own Spherical Stereo Panoramas, I built the patch for the three main platforms.

* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. So if that’s the case, get a new Blender.

How to render in three steps

  1. Enable ‘Views’ in the Render Layers panel
  2. Change the camera type to Panoramic
  3. Set the panorama type to Equirectangular

And leave ‘Spherical Stereo’ marked (it’s on by default at the moment). Remember to post in the comments the work you did with it!
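For reference, the same three steps can also be set from Python. This is a sketch against the experimental builds above; property names such as `use_spherical_stereo` may differ in your Blender version:

```python
import bpy  # only available inside Blender

scene = bpy.context.scene
scene.render.use_multiview = True                 # 1. enable 'Views'

cam = scene.camera.data
cam.type = 'PANO'                                 # 2. panoramic camera
cam.cycles.panorama_type = 'EQUIRECTANGULAR'      # 3. equirectangular panorama

# 'Spherical Stereo' is on by default in these builds; shown here explicitly
cam.stereo.use_spherical_stereo = True
```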


Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video. But I can guarantee you that the overall impression from the Gooseberry team was super positive.

Also, this particular feature was the exact reason I was drawn to implementing multiview in Blender. All I wanted was to be able to render stereo content for fulldomes with Blender. In order to do that, I had to design a proper 3D stereoscopic pipeline for it.

What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a 2-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to the Multiview completion, we finally get the icing on the cake.

No, wait … the cake is a lie!


  • Multiview Spherical Stereo branch [link] *
  • Multiview: Cycles Spherical Stereo Support Official Patch [link] *
  • Gooseberry Production Benchmark File [link]
  • Support the Gooseberry project by signing up in the Blender Cloud [link]
  • Support further Blender Development by joining the Development Fund [link]

* Time traveller from the future, hi! If the branch doesn’t exist anymore, it means that the work was merged into master.

Nice Oculus

Thanks! This is not mine though 🙂 Oculus is one of the supported platforms of the Blender-VR project, to be presented at IEEE VR 2015 next week.

If you are interested in interactive virtual reality and need an open-source solution for your CAVE, multiple Oculus setup, or video wall, give Blender-VR a visit. I’m participating in the development of a framework built on top of the Blender Game Engine.

Also if Oculus feels like sending me my own Oculus, I wouldn’t mind. If you do, though, consider sending one to the Blender Foundation as well. I will feel bad when I take the device away from them next week.

Have a good one,


Due to the long review process the patch is not yet in Blender. That said, since there were enough people interested in this feature, I just updated the links above with a more recent build (on top of the current Blender 2.76 RC3).


The build now also supports regular perspective cameras. This is required for cube map VR renders. For this I also recommend an add-on that I was commissioned to build, to render or simply set up cube map renders [link].

Note: remember to change your camera pivot to center.

* Last build update: October 2nd 2015

Baking is a popular ‘technique’ to flatten your shading work into easy-to-use images (textures) that can be applied to your 3D models without any concern for lighting calculations. This can help game development, online visualization, 3D printing, archviz animations, and many other fields.


Koro, from Caminandes project, fully baked

Since last September I’ve been working part-time for the Blender Foundation to help implement game-related features in Blender. So far I have worked on bug fixes and a few nice features such as improvements to the Triangulate modifier, Photoshop PSD support and the Walk Navigation system. Then came December, and with it the possibility of tackling something new. We decided it was time to give baking a go.

Supported Maps

The Cycles renderer is based on physically based lighting calculations. That means the passes we can bake in Cycles are different from what you may be used to in the Blender Internal renderer.

Data Passes

  • Normal
  • UV
  • Diffuse/Glossy/Transmission/Subsurface/Emit Color

Light Passes

  • AO
  • Combined
  • Shadow
  • Diffuse/Glossy/Transmission/Subsurface/Emit Direct/Indirect


Koro Ambient Occlusion Bake Map


Koro Combined Bake Map

The maps above illustrate Ambient Occlusion and Combined baking. Ambient Occlusion can be used to light the game scene, while Combined emulates what you get out of a full render of your object, which can be used in shadeless engines.

The character baked here is Koro from the Caminandes project. Koro was kindly made available as CC-BY, so while I take no credit for making it, I did enjoy supporting their project and using Koro in my tests. Koro and all the other production files from Caminandes Gran Dillama are part of the uber-cool customized USB card you can buy to learn the nitty-gritty of their production, and to help support the project and the Blender Foundation.

Open Shading Language

Open Shading Language (OSL) is a shading language created and maintained by Sony Pictures Imageworks and already used by them in many blockbusters (The Amazing Spider-Man, MIB III, The Smurfs 2, …). It’s a great contribution from Sony to the industry, given that it was released under a permissive license, free to be implemented and expanded by any interested party.

Blender was the first 3D package outside of Sony to officially support OSL, and since November 2012 we can use OSL in a “Script Node” to create custom shaders. Blender uses OSL via Cycles. The “Script Node” was implemented by Brecht, Lukas, Thomas and … me (:

Thus, with baking support in Cycles we get for “free” a way to store the shaders designed with it. In the following example you see the Cell Noise sample script baked down to a texture. So even if your game engine has never heard of OSL, you can still benefit from it to make your textures and materials look more robust. How cool is that?

Open Shading Language Baking

Open Shading Language Baking

I Want to Try It

There are no official builds of this feature yet. However, if you are familiar with git and building Blender, you can get it from my github repository: clone the bake-cycles branch of the blender-git repository. Once you build it, UV unwrap the object you want to bake, select it and run the following script:

import bpy
# bake the selected object's Combined pass and save it as an external 512x512 image
bpy.ops.object.bake(type='COMBINED', is_save_external=True, filepath="/tmp/baked.png", width=512, height=512)

If you can’t build your own Blender, get a build on [link]. You can also follow my Blender Foundation Weekly Report to learn about the progress of this feature and to be informed of when the work is ready and merged upstream into the official Blender repository.

Missing Bits

There is still more work ahead for this project. Cycles baking is actually a small part of a big planned baking refactor in Blender, which includes Bake Maps and cage support. We decided on Cycles baking as a starting point because the idea was to use Cycles to validate the proposed refactor of the internal baking API.

That means Cycles baking may or may not hit Blender on its own anytime soon. There are bugs to be fixed and loose ends to be tied, so it’s not as if I’m anxiously wondering when this will land anyway (;

I would like to take the opportunity to thank Brecht van Lommel for all the help along this project, and the Blender Foundation for the work opportunity. I’m glad to be involved in a high-impact project such as Blender development.

Last but not least. If you work professionally with Blender and can benefit from features like this, consider donating to the Blender Foundation via the Development Fund page.

Best regards,
Dalai Felinto

Recently a fulldome producer needed a solution to stabilize panorama footage, and I ended up collaborating with Sebastian Koenig to make a free (GPL) add-on for Blender to accomplish the task.

There is a very nice post explaining how we came up with this project and the advantages of having a system like Blender Network around: [link]

The addon is on github:

Worth mentioning: this is pure stabilization, based on keeping one point steady and the angle between the two points constant across the footage.

For more advanced tracking a more robust system would be needed (e.g., selecting four floor points and a horizon point to keep the footage always up and facing the same direction, or some damping system to allow some rotation, …). But basically the client was happy with the solution, thus so were we.
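As a rough illustration of the two-point idea, here is a sketch. This is not the add-on's actual code, and it treats the frame as locally planar around the anchor point:

```python
import math

def stabilize_offsets(anchor, second, anchor_ref, second_ref):
    """Two-point stabilization offsets for one frame.

    anchor and second are the (x, y) positions of the two tracked
    markers in the current frame; anchor_ref and second_ref are their
    positions in the reference frame. Returns (pan, tilt, roll): the
    shift that pins the anchor point, plus the roll that keeps the
    angle toward the second point constant.
    """
    pan = anchor_ref[0] - anchor[0]     # horizontal shift to pin the anchor
    tilt = anchor_ref[1] - anchor[1]    # vertical shift to pin the anchor
    # angle from anchor to the second point, reference vs. current
    angle_ref = math.atan2(second_ref[1] - anchor_ref[1],
                           second_ref[0] - anchor_ref[0])
    angle_now = math.atan2(second[1] - anchor[1],
                           second[0] - anchor[0])
    roll = angle_ref - angle_now        # roll to keep that angle constant
    return pan, tilt, roll
```

Applying these three offsets per frame is exactly the "one point steady, constant angle" behavior described above.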

Here is a video showing how to use the tool (@6:48 shows before/after):

Maybe in the future, with some further interest and funding this can be expanded to a more complete solution. Meanwhile if someone wants to expand the solution, you are welcome to contribute on github 😉

Addon implementation based on the original work developed last year on Visgraf/IMPA by a different project/team (D. Felinto, A. Zang, and L. Velho): [link].



This video showcases the current snapshot of the multiview branch I’ve been working on.

Source Code:

Original Proposal:

For follow-ups on the development I will keep posting to the bf-committers mailing list. But I will try to keep my blog up to date as well.

If you like my desktop background, this is the cover of my upcoming “Game Development with Blender” book with Mike Pan. The book is in its final revision stage (checking the final PDFs about to be printed) and should ship soon. The pre-sale campaign on Amazon is still ongoing.

Have a good day!


Related Links:

I’m just back from the Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, a particular meeting was long overdue. I finally had a chance to meet Paul Bourke personally.

We collaborated (meaning: he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then there is not a fisheye-related question that I don’t bounce off him first. So, in between talks he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders in a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3D projectors at hand). Thankfully on my flight back I had the peace of mind to wrap that up.

3D model “The White Room” courtesy of Jay-Artist, shared on

As a teaser, this is all you get for now. More on that later 😉

In case Alice’s rabbit is right, it’s always good to keep your eyes on the current time. I was never a fan of wristwatches, and I use Blender in fullscreen (Alt+F11). So what to do?

You are right: an add-on to show the current time in the Info header. Download the file here, or copy and paste the code below. Have fun.

bl_info = {
    "name": "Clock",
    "author": "Dalai Felinto (dfelinto)",
    "version": (1,0),
    "blender": (2, 6, 6),
    "location": "Info header",
    "description": "Shows the current time",
    "warning": "",
    "wiki_url": "",
    "tracker_url": "",
    "category": "Useless"}

import bpy
import time

def header_info(self, context):
    t = time.localtime()
    self.layout.label("%02d:%02d:%02d" % (t.tm_hour, t.tm_min, t.tm_sec))

def clock_hack(context):
    # hack to force the Info header to redraw on every scene update
    for area in bpy.context.screen.areas:
        if area.type == 'INFO':
            area.tag_redraw()

def register():
    bpy.types.INFO_HT_header.append(header_info)
    bpy.app.handlers.scene_update_post.append(clock_hack)

def unregister():
    bpy.types.INFO_HT_header.remove(header_info)
    bpy.app.handlers.scene_update_post.remove(clock_hack)

if __name__ == "__main__":
    register()
Today was a meditative day to once again celebrate the passage of our Sun across the exactly-ish same point in the sky. Yup, I’m getting old, but hey, don’t we all? As my self-birthday gift I decided to face the hills from home to work with my laptop in the backpack (a big thing for me; I would always take the bus when I needed the laptop, which means almost every day).

Equirectangular Color Map Twisted

To make the self-celebration even more complete, I decided not only to work on my physical health, but also to give my mind some food for thought. In the past week, professor Adriano Oliveira was troubleshooting the Cycles fisheye camera for his fulldome audiovisual work. He noticed that the equidistant lens was far off from the expected result (for example, compare it with the BGE fisheye lens), and even the equisolid lens (which is great for simulating real fisheye lenses, like Nikon, Canon, …) wasn’t producing a perfect fisheye.

Equirectangular Color Map

We have been debating it, and in the end he found some nice online references with different formulas per lens type. So today I thought it was a good time to get down to the code. What you see next is the comparison between the wrong and the corrected equidistant fisheyes, the equirectangular test map I used (also known as the Blender UV Color Grid ;)) and something I’m passionate about now: using drawing software to do math. You can delete things, move them around, color them … it’s perfect 😉
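For the curious, the equidistant model is simply "distance from the image center proportional to the angle from the optical axis". A hedged sketch of that mapping, my own illustration rather than the Cycles code:

```python
import math

def equidistant_ray(u, v, fov=math.pi):
    """Map normalized fisheye image coordinates to a view ray.

    u, v are in [-1, 1]; for an equidistant ("fulldome") lens the
    distance r from the image center is directly proportional to the
    angle theta from the optical axis: theta = r * (fov / 2).
    """
    r = math.hypot(u, v)
    if r > 1.0:
        return None  # outside the fisheye image circle
    theta = r * fov / 2.0         # the equidistant lens equation
    phi = math.atan2(v, u)        # direction around the image center
    # optical axis along +z
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

With fov greater than 2*pi the same formula keeps working, which is how the mode can go beyond 360 degrees.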



So have fun with the fixed “fulldome” mode. Remember, the mode can go beyond 360 degrees if you want to make some really cool images 😉

Saving trees and abusing my tablet 😉

Now, something nice … I met prof. Adriano last year at the BlenderPRO (the Brazilian Blender conference). He may have exaggerated, but he told me that the main reason he went to the BlenderPRO was to meet me. It seems that it definitely paid off. I’m not saying I wouldn’t have fixed this bug if someone else had reported it. But it’s much more interesting to work with someone you have met.

And why am I saying that? Well, next week we have a new edition of the BlenderPRO. So if you can make it to Brasília, don’t think twice. It’s going to be an amazing event, and I hope to see you there.

And happy birthday to me 🙂

I’m reading a very interesting book on data visualization for science (Visual Strategies by Felice C. Frankel & Angela H. Depace). The book was highlighted in last month’s Nature magazine, and is indeed highly recommended for scientists or science communicators like myself ;).

Today Mitchell asked me to look at a patch he was reviewing and to which he was adding his own changes. The patch by Angus Hollands re-organizes the interface for debugging. Well, I couldn’t help giving a try at representing the numbers in a more direct way. I don’t know how hard it would be to implement those ideas; these are mockups only, for the sake of my creative exercise. Thoughts?

Original proposal by Angus Hollands (agoose77) with changes from Mitchell Stokes (Moguri)

Why not visualize the percentage?

A more radical approach

A quick proof of concept test for the following technique:

  • environment re-mapped into the couch (sofa) – using IBL Toolkit
  • shape-key deformed and animated geometry

The final goal is to have the mapping handled internally by the render engine (Luxrender, although I used Cycles for this test). That way the light support geometry (e.g. the ceiling) can be transformed as well and the shadows should dance around.

There are two key elements here that help produce this effect.

Sofa modeled using the background plate as reference

First of all we need to model the sofa geometry. For that we need to project the background into the 3D View as it will look in the final render. This is accomplished with a GLSL filter running on top of the 3D View.

Sofa UV mapped to the background panorama

The second part is to project the UVs to match the original image. This would be really tricky if the object were at the edge of the image, but in this case it is more doable. Both of these problems are handled by IBL Toolkit.
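That UV projection boils down to converting each vertex direction into equirectangular panorama coordinates. A minimal sketch of one common convention (the exact orientation IBL Toolkit uses may differ):

```python
import math

def panorama_uv(x, y, z):
    """UV of a 3D direction in an equirectangular panorama.

    Longitude maps to U and latitude maps to V, with (0.5, 0.5) at
    the panorama center. Assumes +z is up and +y is 'forward'.
    """
    longitude = math.atan2(x, y)                         # angle around the up axis
    latitude = math.asin(z / math.sqrt(x*x + y*y + z*z)) # elevation angle
    u = 0.5 + longitude / (2.0 * math.pi)
    v = 0.5 + latitude / math.pi
    return u, v
```

Projecting each sofa vertex through this mapping pins its texture to the matching spot in the background plate.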

(and yes, I know the animation could be better and I could have mapped only one of the pillows. This is a quick tech test though ;))