Welcome, I’m your Oracle. What would you like to know?

Those promising first lines announce what is to come: a virtual avatar who will walk you through a journey of knowledge and discovery.

This was our first take, a prototype if you will, at creating ways to communicate global data on the ocean's possible futures and the scientific models behind the predictions, as part of the NF-UBC Nereus Program.

If you want to learn more about the science behind the project and its background, check the official press release:
http://www.publicaffairs.ubc.ca/2012/01/03/the-oracle-meets-nereus-predicting-the-future-ocean/

 

What is the Oracle?

“We use a 3D gaming engine as an interface for the scientific models in order to present a science-based view of how the oceans once looked, how they look now, and how they may look in the future depending on our actions.” V. Christensen

This interactive presentation is the result of a few years of research and some trial and error in finding better ways to communicate our science. We realized that a very powerful way of communicating quantitative differences between scenarios is a split screen with a continuous camera. The principle is to keep a constant baseline against which we can compare the simulated future scenario.

For this installation we have a single yes/no question (“do you want to limit fishing for top predators?”) and thus two possible outcomes. We see this as an open-ended framework we can feed with new questions and visualizations according to the topics of the time. At some point we can also use the globe to explore the geographic aspects of the issues.

Or we may change everything and start from scratch. It’s really hard to tell at this point, but so far the response has been quite positive.
A screen capture of the whole interaction can be seen on YouTube:

 

What is going on under the hood?

Blender – an open source 3D tool – was used to produce all the graphic elements of this application. The application is built on top of the Blender Game Engine – a 3D game engine that is part of Blender. All the models and animations were made in Blender, and the same goes for the videos.


Animations

Oscar Baechler, an artist from Seattle, joined us for two weeks to work on the Oracle. His role covered character concept, modelling, texturing, rigging and animation. It was quite a relief to have him on the team so I could focus on the programming, the rest of the artwork, the videos …

The animation workflow was built on top of Papagayo. This is a standalone application that helps convert recorded dialogue into animated mouth shapes over time, and it works quite well for lip sync. The original Blender add-on was designed to control shape keys. That was a bit of overkill and doesn't work in the game engine, so for this project I had to change it to apply the transformations straight to hardcoded bone channels. Part of the patch has already been committed back to Blender. Soon Oscar and I will create documentation for this workflow and share a sample character ready to use.
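To give an idea of the approach, here is a minimal sketch of how a Papagayo/MOHO export can drive pose-bone channels directly. The real add-on does more than this; the bone names and phoneme mapping below are made up for illustration.

    import bpy

    # Hypothetical phoneme-to-bone mapping; the production rig uses its own bones.
    PHONEME_POSES = {
        "AI": ("jaw", 0.4),
        "O": ("lips_pucker", 0.6),
        "MBP": ("lips_close", 1.0),
        "rest": ("jaw", 0.0),
    }

    def import_papagayo(filepath, armature_name="oracle_rig"):
        """Keyframe pose-bone channels from a Papagayo/MOHO .dat export."""
        arm = bpy.data.objects[armature_name]
        with open(filepath) as f:
            lines = f.read().splitlines()
        for line in lines[1:]:  # skip the "MohoSwitch1" header
            if not line.strip():
                continue
            frame, phoneme = line.split()
            bone_name, amount = PHONEME_POSES.get(phoneme, PHONEME_POSES["rest"])
            bone = arm.pose.bones[bone_name]
            bone.location.y = amount  # write straight to the bone channel
            bone.keyframe_insert("location", frame=int(frame))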

For cut scenes in the game engine I controlled the playback of the animations (for the camera and the oracle) through an Action Actuator in Property mode. The property used (frame) is updated with the current position of the dialogue being played. The audio can be played with the Python module audaspace or with a Sound Actuator. The key here is to use the current time of the audio multiplied by the animation frame rate: object["frame"] = time * 30.0
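In practice that amounts to something like the sketch below, assuming audaspace playback and 30 fps animations; the file path, property and function names are only illustrative.

    import aud
    from bge import logic

    ANIM_FPS = 30.0  # must match the frame rate the animations were made at

    def start_dialog(cont):
        """Start the narration and keep the handle so we can query its position."""
        own = cont.owner
        device = aud.device()
        sound = aud.Factory(logic.expandPath("//audio/oracle_intro.ogg"))
        own["audio"] = device.play(sound)

    def sync_frame(cont):
        """Run every logic tick: the Action Actuator (Property mode) reads 'frame'."""
        own = cont.owner
        handle = own.get("audio")
        if handle is not None and handle.status:
            own["frame"] = handle.position * ANIM_FPS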

Look at me

In order to have the oracle looking at the camera, she has a neck bone with a Damped Track constraint. For every animation we set a different influence factor according to how much we want her to look straight at us. At the time this was not supported in the game engine, and neither was controlling the influence of the constraint. Despite all the rush we could afford to allocate some time for development. After a few copy-and-pastes I expanded the Armature Actuator to support Set Influence and the Damped Track constraint. The patch with a test build is waiting for peer review here.
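With the patched actuator, driving the influence per animation looks roughly like this; the actuator and property names are invented for the example, and the function is meant to be run from a Python controller on the armature.

    def update_look_at(cont):
        """Blend how much the oracle's neck tracks the camera for this animation."""
        own = cont.owner  # the armature object
        act = cont.actuators["look_at"]  # Armature Actuator in Set Influence mode
        act.influence = own.get("look_factor", 1.0)
        cont.activate(act)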

Click on the image to see the complete Logic Brick setup of the oracle.

Videos

The videos have a clever setup. The three scenarios (present / future yes / future no) are in fact one single file. I'm using Python to switch the current scenario through a simple button in the interface (we have got to love Blender's Python capabilities). By doing so, the multiple particle systems (i.e. reef and small fish) have their population changed according to the scientific data feeding the visuals. The camera border is also set (so we render only what we need), and the colour of the water and other post-processing settings (i.e. composite nodes) change as well. For the big fish and the turtle we are simply hiding the ones we don't need for the current scenario.
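The gist of that switcher, in heavily simplified form. The object, particle and node names below are placeholders and the small-fish counts are invented; only the turbidity values come from the scenarios listed further down.

    import bpy

    SCENARIOS = {
        # small_fish counts are illustrative; turbidity follows the scenario notes below
        "present":    {"small_fish": 4000, "turbidity": 0.5, "hide": []},
        "future_yes": {"small_fish": 2000, "turbidity": 1.0, "hide": []},
        "future_no":  {"small_fish": 8000, "turbidity": 0.2, "hide": ["big_shark", "turtle"]},
    }

    def apply_scenario(name):
        cfg = SCENARIOS[name]
        # particle populations follow the data feeding the visuals
        bpy.data.particles["small_fish"].count = cfg["small_fish"]
        # hand-placed big fish and the turtle are simply hidden when not needed
        managed = ("big_shark", "small_shark", "turtle")
        for obj in bpy.data.objects:
            for prefix in managed:
                if obj.name.startswith(prefix):
                    obj.hide_render = prefix in cfg["hide"]
        # post-processing: e.g. a compositor node driving the water turbidity
        tree = bpy.context.scene.node_tree
        if tree and "turbidity" in tree.nodes:
            tree.nodes["turbidity"].inputs[0].default_value = cfg["turbidity"]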

This almost works out of the box. We had, however, to patch Blender to force the particle system to render exactly what you see on the screen (usually only a small fraction of the particles is displayed, to avoid overloading the graphics card during the pre-render work). We could simply have used different particle systems with different population numbers. The problem is that once you change a single particle the whole arrangement turns out different. So what we needed was simply to show more or less of one specific arrangement of a particle system.
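The idea, roughly, is to vary the particles' display percentage over a fixed arrangement instead of changing the count. The property name below is from the 2.6-era Python API as far as I recall; in stock Blender it only affects the viewport, which is exactly what the patch changes.

    import bpy

    def set_visible_fraction(settings_name, fraction):
        """Show more or less of one fixed particle arrangement.

        Changing ParticleSettings.count would reshuffle the whole distribution;
        changing the draw percentage keeps the arrangement stable.
        """
        psettings = bpy.data.particles[settings_name]
        psettings.draw_percentage = int(fraction * 100)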

We have been using this patch for quite some time now, but I have yet to work with Janne Karhu (Blender particle developer) to push it into trunk.

Future Yes: more big fish, fewer small fish, turbidity 1.0

Future No: fewer big fish, more small fish, turbidity 0.2

If you want to see the videos individually, please visit:

Blender in the news

I was going to post a video from CBC Canada covering the event and airing a good part of the Oracle presentation. However, the video is no longer available. I wrote to the news network to see if I can get hold of it to re-share here. In the meantime I gathered some links:

http://www.publicaffairs.ubc.ca/2012/02/18/window-into-worlds-future-oceans-unveiled-by-nf-ubc-nereus-team/

http://www.publicaffairs.ubc.ca/2012/01/03/the-oracle-meets-nereus-predicting-the-future-ocean/

http://planetsave.com/2012/02/18/the-nereus-project-predicting-the-future-of-the-worlds-oceans-video/

 

Credits:

The Oracle is a NF-UBC Nereus Program product, designed to inform about management topics that are important for the world ocean.

Concept and Direction: Villy Christensen
Production: Jeroen Steenbeek and Dalai Felinto
Sound: Jeroen Steenbeek
Narration: Rhona Govender
Animation: Dalai Felinto and Oscar Baechler

The Nereus – Predicting the Future Ocean program is a scientific cooperation between The Nippon Foundation and The University of British Columbia.

NF-UBC Nereus Program
Director: Villy Christensen
Co-Director: Yoshitaka Ota

We thank the Global Shark Conservation Campaign of The Pew Environment Group for cooperation on production of the animated shark videos.

Scientific Advisor: Villy Christensen
Production: Jeroen Steenbeek
Visualization: Dalai Felinto and Mike Pan

AAAS, February 2012

 

The destiny of our oceans is in your hands …

Dalai Felinto
NF-UBC Visualization Expert
3D Artist / Software Architect

[if you have any questions or comments, please express yourself]

3 Thoughts on “Meet your Oracle”

  1. Pingback: Meet your Oracle – Scientific Visualization | BlenderNation

  2. Thanks for this post! Great to see your work and for such an interesting venue.
    Loved the UI and the anims; would like to further examine your GameLogic boards for the BGE (blender game engine).
    For the web, would this be better to be released in Unity (browser plug-in download req’d)? Please compare/contrast your thoughts.

    • Thanks. I’m glad you liked it. The idea is to deploy to the platform that best fits the project at the time of release. Whether that is BGE, Unity or even pure HTML5 is still too early to say.

      But it’s nice to use BGE for at least the prototyping phase. It worked pretty well.
