The book in a nutshell:

  • Pages: 448
  • Figures: 317
  • Files: 100
  • Release Date: June 20th, 2013
  • Publisher: Cengage Learning

Where to buy the book?

Where can I get the sample files?

Some people reported problems finding the companion files. The official link is www.cengageptr.com/downloads, but if you want a direct link you can go here.

How to get a copy for evaluation?

  • If you are considering adopting this material for a classroom, request a copy directly from Cengage.
  • If you want a preview of chapter one, download it here (or use the “Look Inside” option on Amazon.com).

What does the book cover?

The book is organized in the following chapters:

  1. Blender in a Nutshell
  2. First Game
  3. Logic Bricks
  4. Animation
  5. Graphics
  6. Physics
  7. Python Scripting
  8. Workflow and Optimization
  9. Publishing and Beyond
  10. Case Studies

We start by covering the basics of Blender for newcomers, followed by a simple game project from start to finish. That should give the reader a perspective on the components of the Blender game engine that matter for making your own projects.

The following chapters are self-contained, each one with its own approach. Most of them have tutorials and short projects focused on the tools presented. Finally, chapter 10 gives the reader a perspective on real-world usage of the Blender game engine, presenting 10 projects from different artists around the world, explained by the authors themselves.

What version of Blender does this book cover?

The book covers Blender 2.66a fully.

Will there be an ebook?

Yes. So far it is available for Kindle (and in the USA), but other options and regions should be available shortly. For the Kindle version, get it here.

About the authors:

Mike Pan is a CG generalist who started using Blender 10 years ago, before it was open sourced. Mike’s interest in Blender includes everything from special effects to compositing, and from real-time graphics to scripting. He has given talks at the Blender Conference in Amsterdam, hosted workshops at the View Conference in Turin and Blender Workshop in Vancouver, and conducted a three-day Blender course in Kerala, India. Mike is currently the lead programmer for a two-year project at Harvard Medical School to develop biomolecular visualization software using Blender. Before that, he worked at the University of British Columbia with Dalai on a marine ecosystem visualization project. Mike lives in always-raining Vancouver, Canada. You can find him at mikepan.com.

Dalai Felinto, who is currently living in Vancouver, Canada, was born in sunny Rio de Janeiro, Brazil. He has been using Blender since beginning his undergraduate studies in Architecture and Urban Planning in 2003. His participation in the Blender Community includes papers, workshops, and talks presented at events such as BlenderPRO in Brazil, Che Blender in Argentina, Blender Conference in Amsterdam, View Conference in Turin, BlenderVen in Venezuela, and Blender Workshop in Canada. He has contributed patches and code to Blender since version 2.47. Dalai uses Blender and the game engine in his work as a science communicator at the University of British Columbia, Canada. However, his day job doesn’t stop him from doing freelance Blender projects around the world. His latest works have been for Italy, England, and the Netherlands. Dalai’s dream career move is to land a job in the movie industry, working at Pixar. Follow his adventures at dalaifelinto.com.

Bonus Question: How did this book help the game engine?

One of the reasons it took us almost three years to complete this project was to make sure it would cover all the latest developments in the game engine.

It worried us that some areas of the BGE (Blender Game Engine) could, and should, use a revamp. I’m one of the BGE developers, so we made the decision to take the book writing and some needed development side by side. That includes a lot of changes in the UI, support for Unicode and TTF fonts, the removal of texface properties in favour of per-material settings, and lots of bug fixes.

It wasn’t an easy call, because we knew the delay could harm the book sales. However, my first concern was making sure we were proud of the engine we were writing about. And even though we both make a living out of working with Blender and the game engine, we knew there was room for some pressing improvements.

We finally settled on 2.66a. And before handing the manuscript to the publisher, we made sure everything was updated during the author review phase. From there on, readers can follow the release notes of each new Blender version and they will be fine.

The book companion files (over a hundred) also helped to test the BGE itself. Just to illustrate: we were short on time prior to the 2.66 release and couldn’t dedicate time to test the official release (somehow, finishing a book takes time (: ). We then worked closely with other developers to make 2.66a stable as far as animation, UV materials, and multi-platform support go. The result? Between 2.66 and 2.66a alone, 15 bugs were fixed by the BGE developers (some by myself directly, others by fellow programmers). And all the book files work in 2.66a as they were originally conceived, on all major platforms (Linux, Mac, and Windows).

We hope you enjoy the book!

Feel free to drop us a line with any feedback or comments you may have.

Not only beautiful, but one of the few places in the world where you can bike around, enjoy the view, and do some coding without any safety worries. I will certainly miss that.

Now, what interesting timing. In Brazil, riots are taking place for reasons I not only vouch for, but would love to take part in. If your local news is not covering any of this, do check the links below for more information.

As a curious contrast, the last riots in Vancouver were thanks to … the Stanley Cup (a hockey competition). Were people protesting the abusive spending on stadium renovations, the privatization of public space, the lack of any return for the population? No, my friends, the “protests” were due to the poor performance of the Canucks in the cup. What?!? Yup, go figure.

Meanwhile in Brazil:

For a “tinkering” developer there is no satisfaction like putting your code into production and having it work out of the box. I’ve been coding the 3-D stereo support for the multiview branch with no stereoscopic display, so today was the first time I could see it in action… and it works 😉

I tested top-bottom, side-by-side, and interlaced (windowed and fullscreen). For interlaced windowed mode, the “swap left-right images” option is particularly important.

Caminandes – 3-D still courtesy of caminandes.com

The one thing I didn’t test is the pageflip functionality. I came to the realization that my laptop doesn’t support 120Hz displays. I heard it’s working though, so I’m at ease.

I would like to express my gratitude to Dr. Maria Lantin, director of the Stereographic 3D Centre at Emily Carr University of Art + Design, for so kindly opening the doors and letting me play with their “toys”. The same goes for her lab research team, in particular Alan Goldman, Denise Quesnel and Sean Arden, for taking the time to show me their latest (extremely cool) projects and for helping to set me up with the 3-D display and projector. And last but not least, thanks to my friend Dr. Barry Po for connecting me with them (thanks Barry!).

For my own records: with the BenQ projector they are using a 3D-XL 3D Projector Adapter from Optoma to convert side-by-side or top-bottom inputs to time-sequential format.

And if you like the Llama, make sure to check the short movie Caminandes.

Related Links:

I got a bit tired of the back-end coding for the Multi-View branch and decided to tackle the front end for a change.

Anaglyph Mode for 3-D (Multi-View) – Los Padres OpenEXR [link]

Stereo Display Options

Blender has a very modern drawing system (nicknamed Triple Buffer) which takes control of the buffer swapping routines (instead of relying on the graphics card’s front/back buffer handling). That allows Blender to redraw the UI really efficiently. It also made the front-end implementation a breeze.

Now in User Preferences you can set the 3-D display you will be using with Blender. At some point I may make it a per-window option, but for now it affects all open windows.

The next thing you need is a 3-D (Multi-View) image. You can simply render your own images (make sure the render views are named “left” and “right”) or download a few OpenEXR samples.
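
For reference, this is roughly how the setup looks from Python. It is only a minimal sketch, assuming the property names of the official Blender multiview API; the branch discussed here is still a work in progress, so the exact names may differ.

import bpy

# sketch: enable multi-view rendering and inspect the render view names
# (assumes the official multiview property names; the branch may differ)
scene = bpy.context.scene
scene.render.use_multiview = True        # turn multi-view rendering on
scene.render.views_format = 'STEREO_3D'  # the default "left"/"right" pair

# the Image Editor expects views named "left" and "right" for 3-D images
for view in scene.render.views:
    print(view.name, "camera suffix:", view.camera_suffix)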

3-D View

With no 3-D display set, when you open a 3-D image you should see the individual views in the Image Editor drop-down. When a 3-D display is set, however, you will see a new “3-D” option. Once this is selected, you can take full advantage of your 3-D gear.

The following are samples of the other current display options. Be aware that the image I’m using doesn’t converge into a nice stereo 3-D photo. It is in fact intentionally produced to show very different left and right images, to make sure the code is working (programmers, go figure).

Side-by-Side

Top-Bottom

Interlaced

And not that you asked, but this was a great weekend for my 3-D philia. Iron Man 3 was a nice movie, and yesterday I attended two seminars at SID – Display Week 2013, which turned out to be quite inspiring talks, with the bonus of seeing some jaw-dropping 3-D displays. I’m actually going there again tomorrow for the exhibit booths, to see if I can clarify some pending questions I have. I guess I should thank Queen Victoria for the long weekend 😉

Related Links:

I’ve been writing too much about coding, programming and other cool boring topics 😉

For a change I decided to share my current “desktop image”. This was part of the visualization I built for the AAAS 2013, for the Nereus Program sections.

[Desktop image: aquaculture visualization]

The curves, the spheres, and the conformance of the visuals to an invisible translated sinusoid were all done with a Python addon I wrote for Blender. I will write more about this project one day.
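
The gist of the placement logic is easy to sketch. The snippet below is not the actual addon, just a minimal illustration of the idea: scatter spheres along a translated sinusoid, with made-up values for the amplitude, wavelength, and offset.

import bpy
from math import sin, pi

# sketch only (not the original addon): place spheres along a translated
# sinusoid; all numbers are illustrative placeholders
AMPLITUDE = 1.0
WAVELENGTH = 4.0
COUNT = 20
OFFSET = (0.0, 0.0, 0.0)  # translation of the invisible guide curve

for i in range(COUNT):
    x = i * 0.5
    y = AMPLITUDE * sin(2 * pi * x / WAVELENGTH)
    bpy.ops.mesh.primitive_uv_sphere_add(
        location=(OFFSET[0] + x, OFFSET[1] + y, OFFSET[2]))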

The slightly yellow background (as opposed to pure white) is thanks to the great book “Visual Strategies: A Practical Guide to Graphics for Scientists and Engineers” by Felice C. Frankel and Angela H. DePace.

This video showcases the current snapshot of the multiview branch I’ve been working on.

Source Code: http://github.com/dfelinto/blender/tree/multiview

Original Proposal: http://wiki.blender.org/index.php/User:Dfelinto/Stereoscopy

For follow-ups on the development I will keep posting to the bf-committers mailing list, but I will try to keep my blog up to date as well.

If you like my desktop background: it is the cover of my upcoming “Game Development with Blender” book, written with Mike Pan. The book is in its final revision stage (checking the final PDFs about to be printed) and should ship soon. The pre-sale campaign on Amazon is still ongoing.

Have a good day!

Dalai

Related Links:

What if you were working on a stereoscopic 3-D animation and wanted to preview the work in the viewport? Hopefully that will soon be possible in Blender (in my version of Blender, at least :p ).

Click to see the complete interface – model courtesy of patazstudio.com

After finishing the dome-stereo support in Cycles (I got it all working now! 😉 ), I decided to investigate what it would take to support stereo in perspective (non-fisheye) mode. The full details will come once things are in better shape, but I gotta say, it’s fun coding.

One thing I’m trying to figure out is what should be built into the Blender C code, and what should be implemented at the addon level. People have been doing stereo renders one way or another, so I’m confident that as long as we provide the bare bones of stereoscopic support, they will be happy. Among the links below there is actually a really nice addon for Blender 2.6.

Cycles Stereo 3D Render – Mirrored Display – BMW model courtesy of mikepan.com

I have mixed feelings about stereo-3D movies. But at the last SIGGRAPH Asia I attended the most spectacular workshop, “Constructing a Stereo Pipeline from Scratch”, by Disney stereographer Vladimir Sierra. It changed the way I see 3D movies, and reinforced for me the importance of attending those conferences whenever possible. Even nicer when you go to present a project 🙂

That said, I have never worked on a stereo-3D production and I don’t want to limit the possibilities here to my own experience. To help with that, I’m counting on 3D artist Francesco Siddi to help design a nice workflow proposal.

Coming up next: a crowd-funding to get me a real 3D Display :p
(kidding, though I wouldn’t mind)

My own reference links:

  • NVidia presentation on Siggraph 2011 on Stereoscopy (pdf)
  • 3D Movie Making by Bernard Mendiburu (book)
  • Cinema Stereoscopico by Francesco Siddi (book – Italian)
  • Blender 2.6 Stereoscopic Rendering Addon by Sebastian Schneider (link)
  • 3D Fulldome – Teaser (link)

Disclaimer: This tutorial is tailored to the cluster system I have access to. The output from the commands is unique to each system, and the same goes for the cluster commands. Also, since OpenCL is still not fully supported in Blender, this guide focuses on CUDA rendering only. Anyway, use it wisely, and refer to your own cluster documentation for the equivalents of qsub, interactive testing, nvidia-smi, … Feel free to drop a line if this helps you or if you have any complementary information.

Introduction

Blender has no command-line way to set the render device Cycles should use. That means you may not be able to take full advantage of your GPU renderfarm out of the box.

This is not a problem if you use the same computer to save the file and to render it. However, if you need to render your file on a remote computer (in this case a renderfarm), you may not have access to the same hardware available on the target machine, so you can’t rely on the ‘render device’ settings that are saved with your .blend.

Luckily we can use Python to set the proper rendering settings before producing your image.

Getting the List of GPUs

The SSH terminal I use for login doesn’t have access to any GPU, so it doesn’t work for testing. Taking advantage of the qsub system available on the cluster, the first step is to ask for an interactive session to get a list of the available GPUs. In an SSH session on your cluster, do:

$ qsub -q interactive -I -V -l walltime=01:00:00,nodes=1:gpus=3
qsub: waiting for job 125229.parallel-admin to start
qsub: job 125229.parallel-admin ready

At this point the shell terminal is back and we can ask Blender what devices it can see.
First, copy the code below and paste it into a new file called available_devices.py:

import bpy, _cycles
# list every compute device Cycles can see on this node
print(_cycles.available_devices())
# intentionally invalid: the TypeError lists the valid device identifiers
bpy.context.user_preferences.system.compute_device = 'BLABLABLA'

To use this script in Blender, run:
$ blender -b -P available_devices.py

Here this will return the following list: (‘Intel Xeon CPU’, ‘Tesla M2070’, ‘Tesla M2070’, ‘Tesla M2070’, ‘Tesla M2070 (3x)’)

And also the following intentional error:
TypeError: bpy_struct: item.attr = val: enum “BLABLABLA” not found in (‘CUDA_0’, ‘CUDA_1’, ‘CUDA_2’, ‘CUDA_MULTI_2’)

This “error” is only there to tell us the internal name of the CUDA device we want to use. In my case I want to render with the ‘Tesla M2070 (3x)’, so I should set system.compute_device to ‘CUDA_MULTI_2’ (the CUDA entries appear in the same order in both lists, so matching them up is straightforward).
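
If you prefer not to rely on the intentional error, the same idea can be written a bit more defensively, as in the sketch below: it tries the device identifier found on this cluster (‘CUDA_MULTI_2’) and falls back to CPU rendering when that identifier doesn’t exist on the node (the assignment then raises the same TypeError shown above).

import bpy, _cycles

# sketch: try the CUDA device found on this cluster, fall back to CPU
# rendering if it isn't available on the current node
system = bpy.context.user_preferences.system
print(_cycles.available_devices())
try:
    system.compute_device_type = 'CUDA'
    system.compute_device = 'CUDA_MULTI_2'
except TypeError as err:
    print("Device not available on this node:", err)
    system.compute_device_type = 'NONE'  # CPU rendering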

Rendering

Now to create the actual setup script. Paste the following code into a new file called cuda_setup.py (replace CUDA_MULTI_2 with the device you want to use on your system):

import bpy
# select CUDA as the compute device type, then pick the specific device
bpy.context.user_preferences.system.compute_device_type = 'CUDA'
bpy.context.user_preferences.system.compute_device = 'CUDA_MULTI_2'

And to use this script in Blender and test render your file:

$ blender -b Nave.blend  -o //nave_###.png -P ~/cuda_setup.py -f 1

Remember, the order of the arguments matters. Any argument added after -f will be parsed only after the render is over. For the complete list of available arguments, visit the Blender Wiki.

For the final render you need to use the above command as part of a qsub job dispatch file (keeping the -P cuda_setup.py part). Since Cycles only sees the devices of the node it runs on, you need to split your job into batches so that an instance of Blender runs on every node; a rough sketch follows below. A full treatment of job dispatching is outside the scope of this guide, though.
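
For the record, here is a rough Python sketch of what such a batch submission could look like. The PBS resource line mirrors the interactive request used earlier; the paths, frame range, and batch size are placeholders, so adapt everything to your own scheduler and project.

# sketch: submit one qsub job per batch of frames (Python 3);
# all values below are placeholders for illustration
import subprocess

BLEND = "~/Nave.blend"
SETUP = "~/cuda_setup.py"
START, END = 1, 250  # full frame range of the animation
BATCH = 50           # frames per node

for first in range(START, END + 1, BATCH):
    last = min(first + BATCH - 1, END)
    job = "\n".join([
        "#!/bin/bash",
        "#PBS -l walltime=04:00:00,nodes=1:gpus=3",
        "cd $PBS_O_WORKDIR",
        "blender -b {0} -o //nave_###.png -P {1} -s {2} -e {3} -a".format(
            BLEND, SETUP, first, last),
    ])
    # with no script file given, qsub reads the job script from stdin
    subprocess.run(["qsub"], input=job.encode())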

Just to be Sure

In case you think the GPUs might not be in use, you can do the following. First, connect via SSH to the interactive node you were using for the test. Then run the NVIDIA SMI program:

$ nvidia-smi

[Screenshot: nvidia-smi output]

As you can see, there is no GPU power being spared here. Sorry, but you can’t mine bitcoins here 😉

Wrapping Up

This tutorial is intended for my own records, but also to help anyone stuck in the same situation – though you will likely have to adjust some things to your own setup. I tested this with Blender 2.65a running on CentOS, on a cluster based on the HP ProLiant SL390 server architecture.

Happy 2013!

 

Hello all,
I’m pleased to announce that the latest version of Blender 3D is out. This is the collaborative effort of a team of developers, of which I’m proudly a part. The 2.65 release is particularly relevant for the dome community due to its complete support for equidistant fisheye lens rendering.

Equidistant fisheye 180°, used for fulldomes. Image by Adriano Oliveira

That includes a series of fixes since the last release, the most noticeable being the equidistant fisheye lens fix I mentioned in an earlier post. This release not only benefits fulldome artists, but also anyone willing to experiment with the equisolid fisheye lens. The image below is what you see from within the working viewport.

And how simple is it to use? For those familiar with the builtin Cycles render engine, this is as easy as it gets. For the equidistant fisheye lens, all you need to do is set the render dimensions to a square (e.g., 1024 x 1024) and set the field of view angle (usually 180°). For the equisolid fisheye lens, all you need to do is set the lens size and one of the sensor dimensions; the other sensor dimension is taken from the render aspect ratio. A Python version of the same setup is sketched below the image.

Equisolid fisheye, 3d viewport preview. Image by Pataz Studio – www.patazstudio.com
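
For reference, the same settings can also be applied from Python. The snippet below is just a sketch using the camera properties exposed by the Cycles addon, and it assumes the scene already has an active camera.

import bpy
from math import radians

# sketch: set up a Cycles fisheye camera from Python
# (assumes the scene has an active camera)
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
cam = scene.camera.data
cam.type = 'PANO'

# equidistant fisheye (fulldome): square render dimensions + field of view
scene.render.resolution_x = 1024
scene.render.resolution_y = 1024
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = radians(180)

# equisolid fisheye instead: lens size + one sensor dimension
# cam.cycles.panorama_type = 'FISHEYE_EQUISOLID'
# cam.cycles.fisheye_lens = 10.5
# cam.sensor_width = 32.0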

For the complete release information, please visit the official Blender 2.65 Release Log.

For the Fisheye Lens specific info, check:

Blender is under 60 MB to download, free, and more than capable of handling small to medium-sized productions. Go get it! I hope it can help more fulldome productions in the future.

Enjoy it,
Dalai

Err, not really; back up often, kids ;). To add to my recent tragic “Windows 8 destroyed my OSX partitions” story, I had actually ordered an external hard drive, but it arrived one day after my HD got messed up.

Now, time for some good news: I just fully recovered my HD. I would like to thank everyone for all the help I got here, on G+, and by email. Really, thanks big time.

So here is what I did (in case you are here googling about the same problem). I’m assuming that if you got this far into the internet looking for help, you are already as doomed as I was, so I won’t even bother warning and re-warning you that you are better off seeking professional help.

1) 2012 is the year of the penguin. Well, not quite, but Linux still rocks. I had no Mac-to-Mac cable at hand, so I had to resort to a rescue live CD. The party wouldn’t be complete without an OS threesome, after all.

So I first tried Ubuntu Rescue Remix, but it didn’t come with the application I was advised to try (namely gdisk), so I used it only for diagnostics and the dump backup. In the end I downloaded the GParted live CD, which comes with a nice X11 environment btw (though you will inevitably end up using the console, so don’t get too pumped up ;)).

2) dd if=/dev/sda of=/media/usb7/sda_backup
First things first: jokes aside, it’s never too late for a backup. I started with a byte-for-byte backup of my 500 GB drive. It takes time, as in, really. Thirty hours later I knew there was nothing I could do to make things worse. If anything went wrong, I could reverse the dd if/of arguments and restore my HD to the broken state I was getting familiar with. And there is nothing like Mario U to endure 30 more hours, if it comes to that 😉

*) Following everyone’s suggestions I tried gdisk, but my system was too corrupted. It had no internal backup (what you get from using ‘c’ in gdisk), and the error message I got (something about overlapping partition sectors) was stopping me from making any change to the partition table and writing it back. I thought it could be a conflict between the MBR Windows tried to write and the GPT created by the Mac, so on to a more radical next step:

3) dd if=/dev/zero of=/dev/sda bs=512 count=1
WARNING: This will wipe the first 512 bytes of your disk, namely the boot sector. Note that this may not even be necessary, but I was already on the verge of resetting my system. I went back to gdisk to see if that worked, and in fact the MBR was gone. I quit gdisk because it wasn’t its turn anymore. Welcome the star of the night:

4) It’s fdisk magic time 🙂 I don’t know what to say. A few [enter]s and I had my HD back up and running. This console-based program (which comes with GParted, OSX, and any Linux as far as I know) is great. It printed the partition types correctly, listed all the files in the Windows partition, and even allowed me to copy them out if needed. Since my Windows install had nothing special, I turned to the OSX partition and, although file listing isn’t supported there, just asking it to fix the partition solved the problem. I’m not kidding, it took me less than five minutes.

Final considerations: I just wish the people at the AppleCare phone line could have walked me through that. They pretty much left me hopeless, to the point that I didn’t even bother dropping my computer off at an Apple Store. If I had the time, I would restore my broken backup (from step 2) to the laptop to see if the Apple Store crew could have fixed it :p

And that’s all. I hope this turns out to be useful to someone. And again, thanks everyone for the help. Very much appreciated. And yes, this post has tons of smileys. That’s what I looked like through the entire day 😀