Now the troubleshooting part:
In case Alice’s rabbit is right, it’s always good to keep an eye on the current time. I was never a fan of wristwatches, and I do use Blender in fullscreen (Alt+F11). So what to do?
That’s right: an addon to show the current time in the Info header. Download the file here, or copy and paste the code below. Have fun.
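In case the link goes stale, here is a minimal sketch of what such an addon can look like. This is an illustration I wrote, not the original file: `draw_clock` and `current_time_text` are made-up names, and I’m assuming the 2.6x-era Python API where draw callbacks can be appended to `bpy.types.INFO_HT_header`.

```python
import time

try:
    import bpy  # only available inside Blender
except ImportError:
    bpy = None  # allows the helper below to run outside Blender too


def current_time_text():
    """Return the current time formatted as HH:MM, e.g. '14:05'."""
    return time.strftime("%H:%M")


if bpy is not None:
    def draw_clock(self, context):
        # Draw a simple label with the current time in the Info header.
        self.layout.label(text=current_time_text())

    def register():
        bpy.types.INFO_HT_header.append(draw_clock)

    def unregister():
        bpy.types.INFO_HT_header.remove(draw_clock)
```

The header redraws often enough in practice that the label stays reasonably current without any timer.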
This is a casual recording of the Map Range Node commit (available in Blender svn snapshots or in the future Blender 2.65).
We present different uses for this new node, including atmospheric effects, background blending compositing techniques, and a neat post-production stereoscopic 3D depth-based system (demonstrated in Tears of Steel). We also show the real time of the commit, and the feature becoming available on another computer – we hope this encourages more people to make their own Blender builds.
This new Blender feature was elaborated during the days prior to the BlenderPRO 2012, in Brasília.
In this video you see in order of appearance:
Flag of Brazil – http://en.wikipedia.org/wiki/Flag_of_Brazil
BlenderPRO 2012 – http://www.blender.pro.br/2012 – @BlenderProBr
Dalai Felinto – www.dalaifelinto.com – @dfelinto
Roberto Roch – thisroomthatikeep.blogspot.com.br – @roberto_roch
Sebastian König – www.3dzentrale.com – @s_koenig
Daniel Salazar – www.patazstudio.com – @ZanQdo
I would like to thank the Fundação Banco do Brasil for believing in and supporting Blender as a tool for digital, artistic and social inclusion. They are a major partner (aka sponsor) of the BlenderPRO this year. Thanks to their substantial support we were able to gather so many Blender professionals in the same place, which, as I hope this video demonstrates, is the necessary fuel for collaborative development.
I also would like to personally thank this year’s BlenderPRO organization team. The event hasn’t even started yet, but it’s already been a very interesting exchange of experiences, ideas, …
Today was a meditative day to once again celebrate the passage of our Sun across the same exactly-ish point in the sky. Yup, I’m getting old, but hey, don’t we all? As my birthday gift to myself I decided to face the hills from home to work with my laptop in my backpack (a big thing for me – I would always take the bus when I needed the laptop, which means almost every day).
To make the self-celebration even more complete, I decided not only to work on my physical health, but also to give my mind some food for thought. In the past week, professor Adriano Oliveira was troubleshooting the Cycles fisheye camera for his fulldome audiovisual work. He noticed that the equidistant lens was far off from what he expected (compare it, for example, with the BGE fisheye lens), and even the equisolid lens (which is great for simulating real fisheye lenses, like Nikon, Canon, …) wasn’t producing a perfect fisheye.
We had been debating that, and in the end he found some nice online references with different formulas per lens type. So today I thought it was a good time to get down to the code. What you see next is the comparison between the wrong and the corrected equidistant fisheyes, the equirectangular test map I used (also known as the Blender UV Color Grid ;)) and something I’m passionate about now: using drawing software to do math. You can delete things, move them around, color them … it’s perfect 😉
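For reference, the per-lens formulas being discussed map the angle θ from the optical axis to a radius on the sensor. Written out as a small script (this is my own summary of the standard textbook formulas, not the actual Cycles code; the 4.5mm focal length is just an example value):

```python
import math

def fisheye_radius(theta, f, mapping):
    """Radius on the sensor (same units as f) for angle theta (radians)
    from the optical axis, using the standard fisheye mapping formulas."""
    if mapping == "equidistant":
        return f * theta                      # r = f * theta
    if mapping == "equisolid":
        return 2.0 * f * math.sin(theta / 2)  # r = 2f * sin(theta / 2)
    raise ValueError("unknown mapping: %s" % mapping)

# At the edge of a 180-degree fisheye (theta = 90 degrees) the two
# mappings disagree noticeably, which is why mixing them up produces
# a visibly wrong image:
f = 4.5  # mm, an example focal length
edge = math.pi / 2
r_equidistant = fisheye_radius(edge, f, "equidistant")  # ~7.07 mm
r_equisolid = fisheye_radius(edge, f, "equisolid")      # ~6.36 mm
```

The difference grows toward the edge of the image circle, so a render that uses one formula while the reference footage follows the other looks fine in the center and drifts outward from there.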
So have fun with the fixed “fulldome” mode. Remember, the mode can go beyond 360 degrees if you want to make some really cool images 😉
Now, something nice … I met prof. Adriano last year at the BlenderPRO (the Brazilian Blender Conference). He may have exaggerated, but he told me that the main reason he went to the BlenderPRO was to meet me. It seems that it definitely paid off. I’m not saying I wouldn’t have fixed this bug if someone else had reported it. But it’s much more interesting to work with someone you’ve met.
And why am I saying that? Well, next week we have a new edition of the BlenderPRO. So if you can make it to Brasília, don’t think twice. It’s going to be an amazing event, and I hope to see you there.
And happy birthday to me 🙂
Sometimes I have this coding itch that doesn’t let me do anything until I’m done with developing an idea. I actually thought that I wouldn’t bother implementing the mockups I did earlier today. But on my way home I realized the implementation was actually trivial. With no further delay, the next screen is not a mockup, but a real screenshot from a patched Blender:
The complete patch is in the Blender Tracker, but the essence of the code is:
Any comments and feedback are still appreciated.
Time to sleep 😉
I’m reading a very interesting book on data visualization for science (Visual Strategies by Felice C. Frankel & Angela H. Depace). The book was highlighted in last month’s Nature magazine, and is indeed highly recommended for scientists and science communicators like myself ;).
Today Mitchell asked me to look at a patch he was reviewing and adding his own changes to. The patch, by Angus Hollands, re-organizes the interface for debugging. Well, I couldn’t help having a go at representing the numbers in a more direct way. I don’t know how hard it would be to implement those ideas; these are mockups only, for the sake of my creative exercise. Thoughts?
In less than half an hour I’m presenting a talk at the 10th anniversary edition of the Blender Conference. The presentation will be streamed online, and if you want to follow along with the slides I will use, you can download them here.
Dalai Felinto⋆, Aldo Zang† and Luiz Velho†
⋆ Fisheries Centre, UBC – Vancouver, Canada † Visgraf Laboratory, IMPA – Rio de Janeiro, Brazil
Slides: http://www.dalaifelinto.com/ftp/bconf2012.pdf (16MB)
Main page: http://w3.impa.br/~zang/blenderconf
To better appreciate the panorama images, you can download the following free panorama viewer apps for smartphones:
Yummy! This was trending on Google Plus, so I decided to give it a go. It’s very simple to do, it looks great, and it tastes accordingly. I couldn’t find pre-cooked bacon, so I had to pre-cook it myself before adding the eggs. It’s a bit tricky because I pre-cooked the strips in the muffin molds already, so they started to lose their ’roundness’. Use chopsticks to get them out of the molds 😉
Some time ago Paul Bourke sent me some images he captured with the Red Scarlet and a 4.5mm lens. The result is really impressive. He can get a recording in crystal clear 4K at 30fps. Below you can see one of his images:
Ok, that’s not really his original capture. I wanted to explore what it would be like to insert virtual elements into a fisheye image. It shouldn’t be much different from integrating synthetic elements into a panorama (the topic of some previous blog entries, and of a paper waiting for approval 😉 – more on that later this year). And as it turned out, it’s ‘straightforward’-ish enough.
First take a look at the original image:
This is a cropped image, expanded vertically to fill the 180° FOV (field of view). This arrangement of camera + lens + 4K doesn’t give you a full-frame fisheye nor a circular fisheye. As a curiosity, the Red Scarlet can capture a complete 180° fisheye circle if the photo is taken in 5K. However, you can’t get a 30fps movie capture at that resolution.
Be aware that Hugin has a bug in the calculation of the focal length multiplier for equisolid fisheye lenses (basically it’s using the full-frame fisheye calculation for all its fisheye modes). Actually, if you know someone involved in the Hugin/Panotools project, I would send them this patch. As far as I can tell the fix is along these lines. I couldn’t manage to compile Hugin though, so I don’t feel like sending a non-working patch to their tracker.
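To illustrate what the bug means in practice – this is my reading of it, not Hugin’s actual code, and it assumes the classic full-frame fisheye model is the equidistant one – take the image-circle radius of a 180° equisolid lens and invert it with each relation; the equidistant formula recovers a field of view that is noticeably too small:

```python
import math

# Angle theta from the optical axis maps to sensor radius r as:
#   equidistant: r = f * theta              (classic fisheye model)
#   equisolid:   r = 2f * sin(theta / 2)    (what real fisheye lenses do)
# Inverting each gives the full FOV implied by an image-circle radius.

def fov_from_radius_equidistant(r, f):
    return 2.0 * (r / f)                    # full FOV in radians

def fov_from_radius_equisolid(r, f):
    return 4.0 * math.asin(r / (2.0 * f))   # full FOV in radians

f = 4.5                                     # mm, example focal length
# Image-circle radius of a 180-degree equisolid lens (theta_max = 90 deg):
r = 2.0 * f * math.sin(math.radians(45.0))  # ~6.36 mm

fov_right = math.degrees(fov_from_radius_equisolid(r, f))    # 180.0
fov_wrong = math.degrees(fov_from_radius_equidistant(r, f))  # ~162
```

So feeding an equisolid lens through the wrong model silently shrinks (or, going the other way, inflates) the multiplier, which is exactly the kind of mismatch the eyeballed scale factor below compensates for.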
Back on topic … this is the image I got from Hugin (using 4.5 as the lens and 2.35 as the scale factor for equisolid – 2.35 was eyeballed because I couldn’t find the sensor size of the Red Scarlet’s 4K capture anywhere on the internet; and remember, once they fix the software the input will have to be different):
Once I got the full panorama, the rest was a piece of cake. This scene is perfect for the IBL Toolkit (the square in the front plane is practically screaming “Calibrate with me !!11!!”).
And a render from a different angle (where glitches are expected). I used the Project UV option of the IBL Toolkit to project the corresponding UVs of the panorama onto the subdivided meshes.
I’m planning to present a complete framework for working with panoramas and virtual elements at the Blender Conference this year. Even though this is based on my work with Aldo Zang (using Luxrender, not Blender), I think it can help inspire possible solutions for Blender. So fingers crossed for the presentation to be accepted, and I hope we can make it interesting. The original paper (submitted to CLEI 2012) goes by the name:
Production framework for full panoramic scenes with photo-realistic augmented reality
So stay tuned (and enjoy the summer – Vancouver is finally sunny o/)
(Thanks Paul Bourke for authorizing the re-use of his image; consider giving his website a visit – it’s one of those corners of the internet that will keep you busy for a long time.)