It is with a heavy heart that I have to announce the demise of my external Mac keyboard :/

I was calling an online store to complain about my air conditioner purchase (summer is upon us already) when I ended up dropping some chocolate ice cream on top of the keyboard. I could salvage what was left of the ice cream; however, I can no longer type 7UJM on the keyboard.

I may or may not use some of those keys for 99% of my passwords, so it was time to retire my old keyboard. Truth be told, I had it for way too long. Ever since the Cosmic Sensation project I had been in love with it. Considering that this was back in 2010, and the keyboard was super bendy, it was about time.

So why talk about it? Mourning aside, typing on a laptop with no external keyboard is tricky. The touchpad tends to get in the way more often than not. Especially in my case, since for some reason my touchpad is particularly bumpy.

Read More →

Are you stuck on Microsoft Visual Studio with Windows 10 Anniversary Edition, missing Qt Creator features such as rename refactoring?

[Image: msvc_refactor]

Just like in Qt Creator, I need it, and I want it now!

I found an interesting free extension for MSVC named “Visual C++ Refactoring”. You can get it here.

[Image: msvc_extension]

Easy, right? However, if you got an error message because your .NET Framework has a different version, hear me out:

Read More →

Match made in e-heaven

Story originally published in blender.org in July 28th, 2016.

Meet e-interiores. This Brazilian interior design e-commerce startup transformed its creation process in an entirely new fashion. This tale will show you how Blender made this possible, and how far we got.

We developed a new platform based on a semi-vanilla Blender, Fluid Designer, and our own pipelines. Thanks to the results we accomplished, e-interiores was able to consolidate a partnership with the giant Tok&Stok, providing a complete room design in 72 hours.

Read More →

Introduction

Sometimes when working with architecture visualization we want a material to be seamlessly repeatable, and to set its size based on the material's real-world dimensions.

For instance, let’s say we have a photo of a wood texture which corresponds to 2.0 x 0.1 meters.

If we want to re-use this texture for different objects we can’t rely on UV Coordinates to guarantee the correct real world dimensions.

So, how to do it?

To get this properly rendered you can use a node group that I prepared just for that:

  • Download this [sample .blend]
  • Import and add the “Architecture Coordinates” node group to your material
  • Link it to a Mapping node, with Scale: 2.0 (X) 0.1 (Y)
  • Link the Mapping node to your Image Texture node

Optionally you can change the Location (X, Y) and Rotation (Z) of the Mapping node.

Note: for this to work the object scale should be (1, 1, 1).


Incorrect Textures 🙁


Correct Textures 🙂

Sample File Explained

Note: the sample file requires you to allow running Python scripts, so that the drivers work.

[Image: screen]

This file has a cube object whose mesh is controlled by hooks, and the hooks are driven by custom properties on the “Origin” empty. This way you can play with different values without changing the object scale (which would affect the final result).

The test image has a 2 x 1 aspect ratio. If we pretend it was originally a 4.0 x 2.0m texture the whole image will be seen when the width and height of the cube are 4 and 2 respectively.

The Architecture Coordinates node group takes the Object coordinates and transforms them based on the face normal (i.e., whether the face is facing the X, Y, or Z axis).
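The idea boils down to a box-mapping trick. Here is a minimal Python sketch of my reading of it (an illustrative function, not the actual node tree): pick the two object-space coordinates perpendicular to the dominant axis of the face normal, then divide by the texture's real-world size so one tile spans exactly that many meters.

```python
def arch_coords(co, normal, size=(2.0, 0.1)):
    """Map an object-space point to UVs based on the face normal.

    co, normal: (x, y, z) tuples; size: texture dimensions in meters.
    """
    x, y, z = co
    nx, ny, nz = (abs(c) for c in normal)
    if nz >= nx and nz >= ny:    # face points along Z: use the XY plane
        u, v = x, y
    elif ny >= nx:               # face points along Y: use the XZ plane
        u, v = x, z
    else:                        # face points along X: use the YZ plane
        u, v = y, z
    # Divide by the real-world size, so 2.0m along U equals one tile
    return (u / size[0], v / size[1])

# A point 2m along X on a +Z-facing face covers exactly one tile in U:
print(arch_coords((2.0, 0.0, 0.0), (0, 0, 1)))  # (1.0, 0.0)
```

Since the lookup depends only on object-space position and face orientation, every object sharing the material tiles at the same real-world scale, which is exactly what UV coordinates alone cannot guarantee.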

 

Ta-da! The texture is properly set up regardless of the face direction.

I hope you find this useful, and if you have a different solution for this problem please let me know. Maybe this is something Cycles should have by default?

Note: this file was developed for Blender 2.77; it may not work in other versions.

SPOILER ALERT: The conclusions I reached here are wrong. Big time wrong. According to the internet, either there is no convergence distance (it is infinite) or it is 1.3m. That said, carry on if you want to read me babble on about math …

The following text is rather dull and technical. I'm basically dissecting the projection matrix I get from the Oculus SDK in order to guess which convergence distance is being used. Making a long story short, I found that for my setup, with an eye separation (interocular distance) of 6.5cm, the convergence distance is 4m.

This is twice as much as the classic “rule of thumb” of having a convergence distance 30× the interocular distance.
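A back-of-the-envelope version of the dissection: one common way to model an off-axis stereo projection is a symmetric frustum shifted horizontally by s = (e/2)·n/C at the near plane, which gives P[0][0] = n/half_width and P[0][2] = s/half_width, hence C = (e/2)·P[0][0]/P[0][2] (up to the sign, which differs per eye). The matrix entries below are made up to illustrate the formula; they are not the actual Oculus SDK values.

```python
e = 0.065        # interocular distance in meters

# Hypothetical projection-matrix entries (NOT real SDK output):
p00 = 1.0        # n / half_width, roughly a 90-degree horizontal FOV
p02 = 0.008125   # s / half_width, the frustum asymmetry term

# Convergence distance recovered from the asymmetric frustum:
convergence = (e / 2) * p00 / p02
print(convergence)   # 4.0 m

# The classic rule of thumb (30x the interocular distance) gives:
print(30 * e)        # 1.95 m, roughly half of the above
```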

Read More →

What if you want to copy and paste text back and forth between Blender and your operating system? Blender has limited integration when it comes to Font objects, and unfortunately none of the workarounds were satisfying for my picky taste.

So, what do you do when you are your own boss and want to use this nonexistent functionality in Blender? Well, you just stop doing everything else and hack the hell out of Blender's code 🙂

Blender Copy and Paste

Back in Blender 2.49 (around 2009) we could copy/paste the text from either the system clipboard or the internal (per object) text buffer. The reason behind this design was to allow for copy/paste of special formatting (e.g., bold, underline, …) when using it in Blender.

Seven years later, in the latest Blender (2.76), this functionality (system clipboard) is not even available or exposed to the user. To fix this I unified the old system clipboard and internal text buffer functionalities. Thus, if you copy/paste text from a font object it will be available in the system clipboard. And if the text was previously created within Blender, you will also get its original formatting.

Oh, did I mention it supports funky unicode characters? 😉

The patch is still under development and waiting for peer review, but it should be ready to be merged into master soon.

Update: The patch was committed, and it will be part of the upcoming Blender 2.77.

[Image: python-console]

Have you ever found yourself needing to change a .blend file remotely and VNC / Remote Desktop is not working?

In my case I finished rendering the left eye of a video, and wanted to do the same for the right one. I did it in parts due to the memory limit of the rendering station. And VNC is not working because … no idea. But it’s Friday and I won’t have physical access to the remote computer until next week.

Blender Interactive Console to the rescue!

$ ssh MY_REMOTE_COMPUTER
$ blender -b MYFILE.blend --python-console
(...)
(InteractiveConsole)
>>> import bpy
>>> bpy.context.scene.render.views['left'].use = False
>>> bpy.context.scene.render.views['right'].use = True
>>> bpy.ops.wm.save_mainfile()

Now all you need to do is resume your tmux session and kick off the render once again. For other Blender command-line options, try blender --help.

This post is obviously based on real events! Have a nice weekend 😉

If you read my blog you will know that I’m repeating myself. I can’t help stressing this enough though.

Part of the challenge of stereo movie making is to work in 3D as early as possible in your pipeline. This is the main reason the Multi-View implementation ranges from the 3D viewport all the way to the sequencer.

Grease Pencil and Oculus with Blender

VR (Virtual Reality) movie making is no different. Even more so, if we consider the uniqueness of the immersive experience.

So what if … What if we could preview our work in VR from the very first stroke of the storyboard?

Here I’m demoing the Oculus Addon I’ve been developing as part of an ongoing research at a virtual reality lab in Rio (Visgraf/IMPA).

Notice that I’m not even drawing in VR. I’m merely experiencing the work done by Daniel “Pepeland” Lara in his demo file.

The applications of this addon are various, but it mainly focuses on supporting HMDs (head-mounted displays) in the Blender viewport.

At the moment support is restricted to Oculus (Rift and DK2), and it excels on Windows, since the fastest direct mode is only supported by Oculus’s latest (Windows-only) SDK.

Links: