Introduction

Sometimes when working with architecture visualization we want a material to be seamlessly repeatable and to set its size based on the material's real-world dimensions.

For instance, let’s say we have a photo of a wood texture that corresponds to 2.0 x 0.1 meters in the real world.

If we want to reuse this texture on different objects we can’t rely on UV coordinates to guarantee the correct real-world dimensions.

So, how do we do it?

To get this properly rendered you can use a node group that I prepared just for that:

  • Download this [sample .blend]
  • Import and add the “Architecture Coordinates” node group to your material
  • Link it to a Mapping node, with Scale: 2.0 (X) 0.1 (Y)
  • Link the Mapping node to your Image Texture node

Optionally you can change the Location (X, Y) and Rotation (Z) of the Mapping node.

Note: for this to work, the object scale should be 1, 1, 1.
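If you prefer to script the setup, here is a minimal sketch of the same wiring in Python. The material and image names are placeholders, and it assumes the “Architecture Coordinates” group was already appended from the sample file. Note that in Blender 2.7x the Mapping node exposes its scale as a property, while newer versions use a ‘Scale’ input socket instead.

import bpy

mat = bpy.data.materials["WoodPlanks"]  # hypothetical material name
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# The node group appended from the sample file
coords = nodes.new("ShaderNodeGroup")
coords.node_tree = bpy.data.node_groups["Architecture Coordinates"]

# Mapping node scaled to the texture's real-world size: 2.0 x 0.1 meters
mapping = nodes.new("ShaderNodeMapping")
mapping.scale = (2.0, 0.1, 1.0)

tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images["wood.jpg"]  # hypothetical image name

links.new(coords.outputs[0], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], tex.inputs["Vector"])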

Incorrect Textures 🙁

Correct Textures 🙂

Sample File Explained

Note: the sample file requires you to enable “Run Python Scripts” for the drivers to work.


This file has a cube object whose mesh is controlled by hooks, and the hooks are driven by custom properties of the “Origin” empty. This way you can play with different values without changing the object scale (which would affect the final result).

The test image has a 2 x 1 aspect ratio. If we pretend it was originally a 4.0 x 2.0 m texture, the whole image will be visible when the width and height of the cube are 4 and 2 respectively.

The Architecture Coordinates node group takes the Object coordinates and transforms them based on the face normal (i.e., whether the face is facing the X, Y or Z axis).
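In pseudocode, the selection logic is roughly the following (a Python illustration of the idea, not the actual node tree):

def architecture_coords(p, n):
    # p: point in object space, n: face normal
    ax, ay, az = abs(n[0]), abs(n[1]), abs(n[2])
    if ax >= ay and ax >= az:
        return (p[1], p[2])  # face points along X: project onto the YZ plane
    if ay >= ax and ay >= az:
        return (p[0], p[2])  # face points along Y: project onto the XZ plane
    return (p[0], p[1])      # face points along Z: project onto the XY plane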

 

Ta-da! The texture is properly set up regardless of the face direction.

I hope you find this useful, and if you have a different solution for this problem please let me know. Maybe this is something Cycles should have by default?

Note: this file was developed for Blender 2.77; it may not work in other versions.

Dear visitor, welcome!

This week I visited the Blender Institute and decided to wrap up the multiview project. But since I had an Oculus DK2 with me I decided to patch multiview to support Virtual Reality gadgets. Cool, right?


Gooseberry Benchmark viewed with an Oculus DK2

There is something tricky about them, though. You can’t just render a pair of panoramas and expect them to work. The image would work great for the virtual objects in front of you, but the stereo eyes would be swapped when you look behind you.

How do we solve that? Do you remember the 3D Fulldome Teaser? Well, the technique is exactly the same. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there, the software (Cycles) rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance.
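To make that concrete, here is a small top-down (2D) Python sketch of the per-pixel eye placement; the function and its simplification to a single horizontal plane are mine, not Cycles code:

import math

def stereo_eye_ray(longitude, side, interocular=0.065, convergence=3.0):
    # View direction for this column of the equirectangular image
    view = (math.cos(longitude), math.sin(longitude))
    # Offset the eye along the axis perpendicular to the view direction
    half = side * interocular / 2.0  # side: -1 = left eye, +1 = right eye
    pos = (-view[1] * half, view[0] * half)
    # Aim the ray so both eyes converge at the convergence distance
    target = (view[0] * convergence, view[1] * convergence)
    ray = (target[0] - pos[0], target[1] - pos[1])
    length = math.hypot(ray[0], ray[1])
    return pos, (ray[0] / length, ray[1] / length)

Because the eye pair is rotated per pixel, the left/right offset follows your gaze around the full 360 degrees, which is exactly what fixes the swapped eyes behind you.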


Oculus barrel correction screen shader applied to a view inside the panorama

This may sound complicated, but it’s all done under the hood. If you want to read more about this technique, I recommend Paul Bourke’s paper on synthetic stereoscopic panoramic images. The paper is from 2006, so there is nothing new under the Sun.

If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.


Top-Bottom Spherical Stereo Equirectangular Panorama – click to save the original image

This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method, but in the meantime that name is growing on me.

I would also like to remark on the relevance of open projects such as Gooseberry. The always warm-welcoming Gooseberry team just released their benchmark file, which I ended up using for these tests. Being able to take a production-quality shot and run whatever multi-vr-pano-full-thing you may think of is priceless.

Builds

If you want to try rendering your own spherical stereo panoramas, I built the patch for the three main platforms.

* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. So if that’s the case, get a new Blender.

How to render in three steps

  1. Enable ‘Views’ in the Render Layer panel
  2. Change the camera type to Panorama
  3. Set the panorama type to Equirectangular

And leave ‘Spherical Stereo’ marked (it’s on by default at the moment). Remember to post in the comments the work you did with it!
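The same three steps can also be scripted. A minimal sketch with bpy follows; the spherical stereo toggle only exists in the patched build, so treat the property names as assumptions:

import bpy

scene = bpy.context.scene
scene.render.use_multiview = True             # 1. enable 'Views'
cam = scene.camera.data
cam.type = 'PANO'                             # 2. panoramic camera
cam.cycles.panorama_type = 'EQUIRECTANGULAR'  # 3. equirectangular panorama
cam.stereo.use_spherical_stereo = True        # on by default in the build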

 

Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video, but I can guarantee you that the overall impression from the Gooseberry team was super positive.

Also, this particular feature was the exact reason I was drawn towards implementing multiview in Blender. All I wanted was to be able to render stereo content for fulldomes with Blender. In order to do that, I had to design a proper 3D stereoscopic pipeline for it.

What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a 2-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to the Multiview completion, we finally get the icing on the cake.

No, wait … the cake is a lie!

Links

  • Multiview Spherical Stereo branch [link] *
  • Multiview: Cycles Spherical Stereo Support Official Patch [link] *
  • Gooseberry Production Benchmark File [link]
  • Support the Gooseberry project by signing up in the Blender Cloud [link]
  • Support further Blender Development by joining the Development Fund [link]

* Time traveller from the future, hi! If the branch doesn’t exist anymore, it means that the work was merged into master.

Nice Oculus

Thanks! This is not mine though 🙂 Oculus is one of the supported platforms of the Blender-VR project, to be presented at IEEE VR 2015 next week.

If you are interested in interactive virtual reality and need an open source solution for your CAVE, multiple Oculus setup, or video wall, give Blender-VR a visit. I’m participating in the development of a framework built on top of the Blender Game Engine.

Also, if Oculus feels like sending me my own Oculus, I wouldn’t mind. If you do, though, consider sending one to the Blender Foundation as well. I will feel bad when I take the device away from them next week.

Have a good one,
Dalai

Update:

Due to the long review process the patch is not yet in Blender. That said, since there were enough people interested in this feature, I just updated the links above with a more recent build (on top of the current Blender 2.76 RC3).

Update:

The build now also supports regular perspective cameras, which is required for cube map VR renders. For this I also recommend an add-on that I was commissioned to build, to render or simply set up cubemap renders [link].

Note: remember to change your camera pivot to center.
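In Python terms, that is a one-liner (a sketch, assuming an active scene camera):

import bpy

# Spherical stereo and cube map renders expect the stereo pivot at the center
bpy.context.scene.camera.data.stereo.pivot = 'CENTER'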

* Last build update: October 2nd 2015

Baking is a popular ‘technique’ to flatten your shading work down into easy-to-use images (textures) that can be applied to your 3D models without any concern for lighting calculations. This can help game development, online visualization, 3D printing, archviz animations, and many other fields.


Koro, from the Caminandes project, fully baked

Since last September I’ve been working part-time for the Blender Foundation to help implement game-related features in Blender. So far I have worked on bug fixes and a few nice features such as improvements to the Triangulate modifier, Photoshop PSD support and the Walk Navigation system. Then came December, and with it the possibility of tackling something new. We decided it was time to give baking a go.

Supported Maps

The Cycles renderer is based on physically based lighting calculations. That means the passes we can bake in Cycles are different from what you may be used to in the Blender Internal renderer.

Data Passes

  • Normal
  • UV
  • Diffuse/Glossy/Transmission/Subsurface/Emit Color

Light Passes

  • AO
  • Combined
  • Shadow
  • Diffuse/Glossy/Transmission/Subsurface/Emit Direct/Indirect


Koro Ambient Occlusion Bake Map


Koro Combined Bake Map

The maps above illustrate Ambient Occlusion and Combined baking. Ambient Occlusion can be used to light the game scene, while Combined emulates what you get out of a full render of your object, which can be used in shadeless engines.

The character baked here is Koro, from the Caminandes project. Koro was kindly made available as CC-BY, so while I take no credit for the making of it, I did enjoy supporting their project and using Koro in my tests. Koro and all the other production files from Caminandes: Gran Dillama are part of the uber-cool customized USB card you can buy to learn the nitty-gritty of their production, and to help support the project and the Blender Foundation.

Open Shading Language

Open Shading Language (OSL) is a shading language created and maintained by Sony Pictures Imageworks and already used by them in many blockbusters (The Amazing Spider-Man, MIB III, Smurfs 2, …). It’s a great contribution from Sony to the industry, given that it was released under a permissive license, free to be implemented and expanded by any interested party.

Blender was the first 3D package outside of Sony to officially support OSL, and since November 2012 we can use OSL in a “Script Node” to create custom shaders. Blender uses OSL via Cycles. The “Script Node” was implemented by Brecht, Lukas, Thomas and … me (:

Thus, with baking support in Cycles we get for “free” a way to store the shaders designed with it. In the following example you see the Node Cell Noise sample script from OpenShading.com. So even if your game engine has never heard of OSL, you can still benefit from it to make your textures and materials look more robust. How cool is that?
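For reference, hooking an OSL file into a material from Python looks roughly like this (a sketch; the material name and file path are placeholders, and OSL requires CPU rendering with the Open Shading Language option enabled):

import bpy

bpy.context.scene.cycles.shading_system = True  # enable OSL (CPU only)

mat = bpy.data.materials["CellNoise"]  # hypothetical material
mat.use_nodes = True
script = mat.node_tree.nodes.new("ShaderNodeScript")
script.mode = 'EXTERNAL'
script.filepath = "//node_cell_noise.osl"  # hypothetical path to the sample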

Open Shading Language Baking

I Want to Try It

There are no official builds with this feature yet. However, if you are familiar with git and building Blender, you can get it from my GitHub repository: clone the bake-cycles branch of the blender-git repository. Once you have built it, UV unwrap the object you want to bake, select it and run the following script:

import bpy
# Bake the combined pass and save it externally as a 512x512 image
bpy.ops.object.bake(type='COMBINED', is_save_external=True, filepath="/tmp/baked.png", width=512, height=512)

If you can’t build your own Blender, grab a build from GraphicAll.org. You can also follow my Blender Foundation weekly reports to learn about the progress of this feature and to be informed when the work is ready and merged upstream into the official Blender repository.

Missing Bits

There is still more work ahead for this project. Cycles baking is actually a small part of a bigger planned baking refactor in Blender, which includes Baking Maps and Cage support. We decided to start with Cycles baking because the idea was to use Cycles to validate the proposed refactor of the internal baking API.

That means Cycles baking may or may not hit Blender on its own any time soon. There are bugs to be fixed and loose ends to be tied, so I’m not anxiously wondering about when this will land anyway (;

I would like to take the opportunity to thank Brecht van Lommel for all the help along this project, and the Blender Foundation for the work opportunity. I’m glad to be involved in a high-impact project such as Blender development.

Last but not least: if you work professionally with Blender and can benefit from features like this, consider donating to the Blender Foundation via the Development Fund page.

Best regards,
Dalai Felinto