Hello all,
I’m pleased to announce that the latest version of Blender 3D is out. This is the collaborative effort of a team of developers, which I’m proud to be part of. The 2.65 edition is particularly relevant for the dome community thanks to complete support for equidistant fisheye lens rendering.

Equidistant fisheye 180°, used for fulldomes. Image by Adriano Oliveira

That includes a series of fixes since the last release, the most noticeable being the Equidistant Fisheye Lens fix, as I mentioned in an earlier post. This release not only benefits fulldome artists, but also anyone willing to experiment with the Equisolid Fisheye lens. The image below is what you see from within the working viewport.

And how simple is it to use? For those familiar with the builtin Cycles render engine, this is as easy as it gets. For the Equidistant fisheye lens, all you need to do is set the render dimensions to a square (e.g., 1024 x 1024) and set the field of view angle (usually 180°). For the Equisolid fisheye lens, all you need to do is set the lens size and one of the sensor dimensions. The other sensor dimension is taken from the render aspect ratio.
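If you prefer to script it, the snippet below does roughly the same thing from the Python console. It is only a sketch: the property paths follow the 2.6x API (the Cycles camera settings live under camera.data.cycles), so adjust if your version differs.

```python
# A minimal sketch of the same setup done from the Python console.
# Property paths follow the 2.6x API (Cycles camera settings under
# camera.data.cycles) and may differ in later versions.
import math
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
cam = scene.camera.data
cam.type = 'PANO'

# Equidistant fisheye: square render dimensions + field of view angle.
scene.render.resolution_x = 1024
scene.render.resolution_y = 1024
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = math.radians(180.0)

# Equisolid fisheye: lens size + one sensor dimension; the other sensor
# dimension follows from the render aspect ratio.
# cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
# cam.cycles.fisheye_lens = 10.5   # focal length in mm
# cam.sensor_width = 32.0          # mm
```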

Equisolid fisheye, 3D viewport preview. Image by Pataz Studio – www.patazstudio.com

For the complete release information, please visit the official Blender 2.65 Release Log.

For the Fisheye Lens specific info, check:

Blender is under 60MB to download, free, and more than capable of handling small to medium size productions. Go get it! I hope it helps more fulldome productions in the future.

Enjoy it,
Dalai

I’m just back from Siggraph Asia 2012. I was impressed by the people I met, the talks and courses I attended, and, why not, the places I visited. Singapore is a very interesting city for a tourist. Among the people I met, one particular meeting was long overdue. I finally had a chance to meet Paul Bourke in person.

We collaborated (meaning he helped me ;)) on the fisheye implementation for the Blender Game Engine back in 2009. Since then there hasn’t been a fisheye-related question that I don’t bounce off him first. So, in between talks he kindly shared his thoughts on stereoscopic rendering for domes. It took me a week to work through the problem, but here you can see the first real renders from a patched Blender with Cycles.

The formula is very simple; it’s just one of those problems that is really hard to debug (at least when you don’t have a dome with 3D projectors at hand). Thankfully, on my flight back I had the peace of mind to wrap that up.

3D model “The White Room” courtesy of Jay-Artist, shared on blendswap.com

As a teaser, this is all you get for now. More on that later 😉

Hello there.

I’m running some tests and came up with an interesting workaround for fulldome productions with Blender. I still can’t completely get rid of the seams, but they are pretty much unnoticeable.

Equiangular Fisheye / Truetheta (most common for dome projection):

“Mirror-wise” Fisheye (most popular for photography):

More on different kinds of lenses/projections here:
http://local.wasp.uwa.edu.au/~pbourke/miscellaneous/domefisheye/fisheye/

The method:

  • get a predeformed, UV-mapped mesh (sphere, plate, …)
  • a Map UV node (in the composite)
  • linked scenes, each with its camera made unique and rotated

It will only work nicely with Blender 2.49a (it relies on a Map UV node fix made after 2.49).
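For the curious, the mapping that the predeformed mesh encodes is the equiangular one: the distance from the image center grows linearly with the angle from the view axis. A rough sketch of that math (plain Python, illustrative names, not the actual 2.49 setup):

```python
# A rough sketch (plain Python, illustrative names only) of the equiangular
# mapping that the predeformed UV mesh encodes: the distance from the image
# center grows linearly with the angle from the view axis, and the resulting
# direction is what gets looked up in the 90-degree cube faces.
import math

def fisheye_to_direction(u, v, fov=math.pi):
    """u, v in [-1, 1]; returns a unit view direction (x, y, z),
    with +z along the camera axis (the dome zenith)."""
    r = math.hypot(u, v)
    if r > 1.0:
        return None                      # outside the fisheye circle
    theta = r * (fov * 0.5)              # equiangular: angle proportional to radius
    phi = math.atan2(v, u)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```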
To get an idea of how it works in practice, I’m posting here the file I used as well. This file contains:

  1. MainScene – the main scene; it’s not rendered, but it’s where the camera IPO is set up.
  2. MainSceneLeft/Right/Top/Bottom – linked scenes, each with a 90° FOV camera rotated to compose a cubemap.
  3. Uvs – a scene with the UV mesh used to remap the aforementioned scenes.
  4. Composite – a node composition using the UVs calculated in the Uvs scene to distort the scenes.

Test file:
http://www.dalaifelinto.com/blender/dome/dome_rendering.blend

Youtube videos:

Dear visitor, welcome!

This week I visited the Blender Institute and decided to wrap up the multiview project. But since I had an Oculus DK2 with me I decided to patch multiview to support Virtual Reality gadgets. Cool, right?


Gooseberry Benchmark viewed with an Oculus DK2

There is something tricky about them. You can’t just render a pair of panoramas and expect them to work. The image would work great for the virtual objects in front of you, but the stereo eyes would be swapped when you look behind you.

How do we solve that? Do you remember the 3D Fulldome Teaser? Well, the technique is exactly the same. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there the software (Cycles) rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance.
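If that’s hard to picture, here is a back-of-the-envelope sketch of the idea, seen from the top (plain Python, illustrative only, not the actual Cycles code):

```python
# A back-of-the-envelope top view of the idea. For each pixel column
# (longitude), the eyes sit half the interocular distance to each side of
# the viewing direction and are toed in so both rays meet at the
# convergence distance.
import math

def eye_ray(longitude, interocular, convergence, eye=+1):
    """eye = +1 for the right eye, -1 for the left.
    Returns (origin, direction) in the horizontal plane, rig at the origin."""
    half = interocular * 0.5
    toe_in = math.atan2(half, convergence)           # toe-in angle per eye
    # Eye position: offset sideways, perpendicular to the viewing direction.
    origin = (eye * half * math.sin(longitude),
              -eye * half * math.cos(longitude))
    # Ray direction: the viewing direction rotated by the toe-in angle.
    angle = longitude + eye * toe_in
    return origin, (math.cos(angle), math.sin(angle))
```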


Oculus barrel correction screen shader applied to a view inside the panorama

This may sound complicated, but it’s all done under the hood. If you want to read more about this technique I recommend this paper from Paul Bourke on Synthetic stereoscopic panoramic images. The paper is from 2006 so there is nothing new under the Sun.

If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.

Gooseberry Benchmark Panorama

Top-Bottom Spherical Stereo Equirectangular Panorama – click to save the original image

This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method, but in the meanwhile that name is growing on me.

I would also like to highlight the relevance of open projects such as Gooseberry. The always warm-welcoming Gooseberry team just released their benchmark file, which I ended up using for these tests. Being able to get a production-quality shot and run whatever multi-vr-pano-full-thing you may think of is priceless.

Builds

If you want to try to render your own Spherical Stereo Panoramas, I built the patch for the three main platforms.

* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. So if that’s the case, get a new Blender.

How to render in three steps

  1. Enable ‘Views’ in the Render Layers panel
  2. Change the camera to Panorama
  3. Set the Panorama type to Equirectangular

And leave ‘Spherical Stereo’ marked (it’s on by default at the moment). Remember to post in the comments the work you did with it!
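If you prefer to do the same from a script, it looks roughly like this. This is a sketch using the property names as they ended up in official Blender; the patched build may differ slightly.

```python
# The same three steps as a script -- a sketch using the property names as
# they ended up in official Blender; the patched build may differ slightly.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# 1. Enable 'Views' (multiview)
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'

# 2. and 3. Panorama camera with the Equirectangular type
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Stereo depth settings (spherical stereo is on by default in the build)
cam.stereo.use_spherical_stereo = True
cam.stereo.interocular_distance = 0.065   # meters, illustrative value
cam.stereo.convergence_distance = 1.95    # meters, illustrative value
```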


Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video, but I can guarantee you that the overall impression from the Gooseberry team was super positive.

Also, this particular feature was the exact reason I moved towards implementing multiview in Blender. All I wanted was to be able to render stereo content for fulldomes with Blender. In order to do that, I had to design a proper 3D stereoscopic pipeline for it.

What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a 2-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to the Multiview completion, we finally get the icing on the cake.

No, wait … the cake is a lie!

Links

  • Multiview Spherical Stereo branch [link] *
  • Multiview: Cycles Spherical Stereo Support Official Patch [link] *
  • Gooseberry Production Benchmark File [link]
  • Support the Gooseberry project by signing up in the Blender Cloud [link]
  • Support further Blender Development by joining the Development Fund [link]

* Time traveller from the future, hi! If the branch doesn’t exist anymore, it means that the work was merged into master.

Nice Oculus

Thanks! This is not mine though 🙂 Oculus is one of the supported platforms of the Blender-VR project, to be presented at the IEEEVR 2015 next week.

If you are interested in interactive virtual reality and need an open source solution for your CAVE, multiple Oculus setup, or video wall, give Blender-VR a visit. I’m participating in the development of a framework built on top of the Blender Game Engine.

Also if Oculus feels like sending me my own Oculus, I wouldn’t mind. If you do, though, consider sending one to the Blender Foundation as well. I will feel bad when I take the device away from them next week.

Have a good one,
Dalai

Update:

Due to the long review process the patch is not yet in Blender. That said, since there were enough people interested in this feature, I just updated the links above with a more recent build (on top of the current Blender 2.76 RC3).

Update:

The build now also supports regular perspective cameras. This is required for cube map VR renders. For this I also recommend an addon I was commissioned to build to render (or simply set up) cubemap renders [link].

Note: remember to change your camera pivot to center.
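If you would rather set up a single cube face by hand instead of using the addon, it boils down to something like the sketch below (illustrative values; the addon automates the six faces).

```python
# A sketch of a single cube-face setup done by hand (illustrative values;
# the addon linked above automates the six faces).
import math
import bpy

scene = bpy.context.scene
scene.render.resolution_x = 1024          # square output per face
scene.render.resolution_y = 1024
scene.render.use_multiview = True

cam = scene.camera.data
cam.type = 'PERSP'                        # regular perspective camera
cam.angle = math.radians(90.0)            # each cube face covers 90 degrees
cam.stereo.pivot = 'CENTER'               # the note above: pivot at center
```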

* Last build update: October 2nd 2015

Recently a fulldome producer needed a solution to stabilize panorama footage, and I ended up collaborating with Sebastian Koenig to make a free (GPL) addon for Blender to accomplish the task.

There is a very nice post explaining how we came up with this project and the advantages of having a system like Blender Network around: [link]

The addon is on github:
http://github.com/dfelinto/Panorama-Tracker

Worth mentioning, this is pure stabilization, based on keeping one point steady and keeping the angle between the two tracked points constant across the footage.

For more advanced tracking a more robust system would be needed (e.g., selecting four floor points and a horizon point to keep the footage always up and facing the same direction, or some damping system to allow a bit of rotation, …). But basically the client was happy with the solution, and thus so were we.
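For the curious, the core of the stabilization boils down to something like the sketch below (not the actual addon code): align the first tracked point with its reference direction, then use the second point only to cancel the remaining roll.

```python
# A rough sketch of the idea (not the addon code): find the rotation that
# brings the first tracked point back to its reference direction; a second
# rotation around that direction, computed the same way from point B,
# keeps the angle between the two points constant.
import math

def direction(u, v):
    """Equirectangular pixel (u, v in [0, 1]) to a unit vector on the sphere."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def align_rotation(current, reference):
    """Axis-angle rotation (axis, angle) that maps `current` onto `reference`."""
    axis = (current[1] * reference[2] - current[2] * reference[1],
            current[2] * reference[0] - current[0] * reference[2],
            current[0] * reference[1] - current[1] * reference[0])
    dot = sum(c * r for c, r in zip(current, reference))
    return axis, math.acos(max(-1.0, min(1.0, dot)))
```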

Here is a video showing how to use the tool (@6:48 shows before/after):


Maybe in the future, with some further interest and funding, this can be expanded into a more complete solution. Meanwhile, if someone wants to extend it, you are welcome to contribute on github 😉


Addon implementation based on the original work developed last year at Visgraf/IMPA by a different project/team (D. Felinto, A. Zang, and L. Velho): [link].

Cheers,
Dalai

Links:

This video showcases the current snapshot of the multiview branch I’ve been working on.

Source Code: http://github.com/dfelinto/blender/tree/multiview

Original Proposal: http://wiki.blender.org/index.php/User:Dfelinto/Stereoscopy

For follow-ups on the development I will keep posting to the bf-committers mailing list, but I will try to keep my blog up to date as well.

If you like my desktop background, this is the cover of my upcoming “Game Development with Blender” book with Mike Pan. The book is in its final revision stage (checking the final PDFs about to be printed) and should ship soon. The pre-sale campaign on Amazon is still ongoing.

Have a good day!

Dalai

Related Links:

What if you could work on a stereoscopic 3D animation and preview the work in the viewport? Hopefully that will soon be possible in Blender (in my version of Blender at least :p).

Click to see the complete interface – model courtesy of patazstudio.com

After finishing the dome-stereo support in Cycles (I got it all working now! 😉), I decided to investigate what it would take to support stereo in perspective (non-fisheye) mode. The full disclosure will come once things are in better shape, but I gotta say, it’s fun coding.

One thing I’m trying to figure out is what should be built into the Blender C code, and what should be implemented at the addon level. People have been doing stereo renders one way or another, so I’m confident that as long as we provide the bare bones of stereoscopic support, they will be happy. In the links below there is actually a really nice addon for Blender 2.6.

Cycles Stereo 3D Render – Mirrored Display – BMW model courtesy of mikepan.com

I have mixed feelings about stereo 3D movies. But at the last Siggraph Asia I attended the most spectacular workshop, “Constructing a Stereo Pipeline from Scratch”, by Disney stereographer Vladimir Sierra. It changed the way I see 3D movies, and reinforced for me the importance of attending those conferences whenever possible. Even nicer when you go to present a project 🙂

That said, I have never worked on a stereo 3D production and I don’t want to limit the possibilities here to my own experience. To help with that I’m counting on 3D artist Francesco Siddi to help design a nice workflow proposal.

Coming up next: a crowd-funding to get me a real 3D Display :p
(kidding, though I wouldn’t mind)

My own reference links:

  • NVidia presentation on Siggraph 2011 on Stereoscopy (pdf)
  • 3D Movie Making by Bernard Mendiburu (book)
  • Cinema Stereoscopico by Francesco Siddi (book – Italian)
  • Blender 2.6 Stereoscopic Rendering Addon by Sebastian Schneider (link)
  • 3D Fulldome – Teaser (link)

Today was a meditative day to once again celebrate the passage of our Sun across the same exactly-ish point in the sky. Yup, I’m getting old, but hey, don’t we all? As my birthday gift to myself I decided to face the hills from home to work with my laptop in the backpack (a big thing for me; I always take the bus when I need the laptop, which means almost every day).

Equirectangular Color Map Twisted

To make the self-celebration even more complete, I decided not only to work on my physical health, but also to give my mind some food for thought. In the past week, professor Adriano Oliveira was troubleshooting the Cycles fisheye camera for his fulldome audiovisual work. He noticed that the equidistant lens was far off from the expected result (for example, compare it with the BGE fisheye lens), and that even the equisolid lens (which is great for simulating real fisheye lenses, like Nikon, Canon, …) wasn’t producing a perfect fisheye.

Equirectangular Color Map

We have been debating that, and in the end he found some nice online references with the different formulas per lens type. So today I thought it was a good time to get down to the code. What you see next is the comparison between the wrong and the corrected equidistant fisheyes, the equirectangular test map I used (also known as the Blender UV Color Grid ;)), and something I’m passionate about now: using drawing software to do math. You can delete things, move them around, color them … it’s perfect 😉
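For reference, these are the standard formulas for the two lens types, written out as a tiny sketch you can run to eyeball the difference:

```python
# The two mappings being compared: r is the distance from the image center,
# theta the angle from the optical axis, f the focal length. A tiny sketch
# to eyeball both curves.
import math

def equidistant_r(theta, f):
    # "fulldome" mapping: radius grows linearly with the angle
    return f * theta

def equisolid_r(theta, f):
    # what real fisheye lenses (Nikon, Canon, ...) approximate
    return 2.0 * f * math.sin(theta * 0.5)

for deg in (0, 30, 60, 90):
    t = math.radians(deg)
    print(deg, round(equidistant_r(t, 10.5), 2), round(equisolid_r(t, 10.5), 2))
```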


So have fun with the fixed “fulldome” mode. Remember, the mode can go beyond 360 degrees if you want to make some really cool images 😉

Saving trees and abusing my tablet 😉

Now, something nice … I met prof. Adriano last year at the BlenderPRO (the Brazilian Blender conference). He may have exaggerated, but he told me that the main reason he went to the BlenderPRO was to meet me. It seems that it definitely paid off. I’m not saying I wouldn’t have fixed this bug if someone else had reported it, but it’s much more interesting to work with someone you have met.

And why am I saying that? Well, next week we have a new edition of the BlenderPRO. So if you can make it to Brasília, don’t think twice. It’s going to be an amazing event, and I hope to see you there.

And happy birthday to me 🙂

Cosmic Sensation

This is a big project I worked on back in 2010. It was so big and so nice, that it deserves its own page in my site.

Cosmic particles, sensors, dome and Blender, how not to love it?

Also, I often talk about this project, but I had yet to write it up nicely in a post, with pictures, …
For the time being I put together this page with links to the videos, the presentation at the Blender Conference 2010, an article on BlenderNation about it, an interview, …

Cosmic Sensation

Photos on Flickr, at Mike Pan’s account

Click on the image above for more pictures (or for a small selection, check the images available in the BConf2010 slides).

Making a long story short

Cosmic Sensation: “Experiencing Cosmic Rays with Blender in a Fulldome”. A Blender Game Engine visualization for a three-night event in Utrecht (Netherlands). More details on the official site or in this press release. Work done with digital artists Mike Pan and Martins Upitis.

Recording of the event


Small article explaining the project

http://www.blendernation.com/2010/09/28/experiencing-cosmic-rays-with-blender-in-a-fulldome/

The official website

http://www.experiencetheuniverse.nl/
http://www.cosmicsensation.nl/interview-with-dalai-felinto-blender-artist/

Blender Conference 2010 talk

Slides: http://www.blender.org/community/blender-conference/blender-conference-2010/speakers/#c7964
Presentation part 1 (YouTube ID: 7sQbJUOfedY#t=11m30s):
Presentation part 2:


Presentation part 3:

Hello there,

I’ll have to be brief because we are still in the middle of the conference and I need to rest for the last day tomorrow.

This year I decided not to submit any presentation to the Blender Conference (it has been three years in a row presenting projects; I thought people deserved a break from me). Nevertheless, I fell into the trap of telling Ton that I’m going to give a talk + a workshop at the BlenderPRO in Brazil, and I suggested that if he needed more content for the conference, I would gladly translate my material for it.

I didn’t hear back from him (or may have missed something). And guess what I found yesterday in the conference schedule? You are right: I was officially responsible for giving a one-hour presentation about Python and Blender for artists who haven’t touched code (or are scared of it).

Yesterday I couldn’t really work on the presentation (I went to the Rainbow Warrior inauguration party – the new Greenpeace boat – really nice). So today I had one hour to put the slides together.

presentation slides (2.2MB)

crossword sample file /only the script (< 1MB)

Once again, I apologize to the people who attended the talk and had to see a presentation done in such a rush (luckily I have been gathering examples from my previous projects). I hope people got something worth their time out of it ;)

Extra Extra

Another file I forgot/didn’t have time to show is the “image_change” add-on. The file with the documentation and the script can be found here:

texture_imagechanger.zip (133 kb) + pencil_test_sample.zip (12 MB). It’s an addon to help pencil-test artists (i.e., to have 2D animation loop textures controlled automatically and animatable). I still have to update the file to use the frame change callbacks, but that will come later (right now it’s using the “old” pydriver script link hack, which is worth checking out). The first take on this (pre-addon) was for the Detail Library Pencil Test.
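For anyone curious, a frame change callback version would look roughly like the sketch below (hypothetical names; the shipped file still uses the pydriver hack):

```python
# A rough sketch of the frame change callback version (hypothetical names;
# the shipped file still uses the pydriver script link hack).
import bpy

def update_pencil_frame(scene):
    """Swap the image on a texture based on the current frame, looping."""
    images = [img for img in bpy.data.images if img.name.startswith("pencil_")]
    tex = bpy.data.textures.get("PencilTest")   # assumed image texture name
    if images and tex is not None:
        tex.image = images[scene.frame_current % len(images)]

bpy.app.handlers.frame_change_post.append(update_pencil_frame)
```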

Additionally, although it is worth a separate post, I updated my file for offline dome rendering. I was showing it to some friends at the conference and may as well share it here: fulldome_rendering_pak.blend (< 1MB).

I believe Blender is actually ready to have a Python addon to handle the whole dome rendering automatically. I have no time to explore this idea right now, but if someone wants to give it a try and needs help, let me know.

If you have any questions, don’t hesitate to contact me. And follow me on twitter: @dfelinto

Extra, this is the Dr. Epilepsy demo I mentioned: