My love for photomatching goes back a long way.
Back in 2007 I did this project using the fantastic SketchUp Photo Match:

I used 20 photographs, a blueprint of a cross section, and a blueprint of the original floor design. I was then hired to draw the façade in AutoCAD for a study on the preservation and historical registration of this building.

Ever since, I have felt that Blender was far from catching up with tools designed with architects in mind.
Today I ran into an add-on for Blender that may help reduce this gap.

BLAM is a camera calibration tool for Blender that you can find here:

My original plan for tonight (to code support for green-magenta anaglyph glasses in the Blender Game Engine :)) would clearly have to wait. It’s time to test the tool!

I followed the steps of the video tutorial – BLAM Video Tutorial
If you want to try it yourself, this is the picture I used:
University of Seattle
It’s a picture from the University of Seattle. I traveled to Seattle last year and really enjoyed the university campus (and the Battlestar Galactica exhibition at the Space Needle alone made the trip worthwhile).


  1. adding axes is nice and intuitive, but it would be nice to tweak the curves for fine-tuning while watching the 3D change (as a live ‘estimate camera focal length and orientation’ mode)
  2. an option to automatically set the image as the camera background image would be nice.
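For the curious, estimating a focal length from a photo like this usually rests on a classic result: if v1 and v2 are the vanishing points of two perpendicular horizontal directions, measured relative to the principal point, orthogonality forces v1·v2 = -f². A minimal sketch of that calculation (the function name and layout are my own illustration – I haven’t checked BLAM’s exact implementation):

```python
import math

def focal_from_vanishing_points(v1, v2, principal_point=(0.0, 0.0)):
    """Estimate the focal length (in pixels) from the vanishing points of
    two perpendicular directions, given in image coordinates."""
    x1, y1 = v1[0] - principal_point[0], v1[1] - principal_point[1]
    x2, y2 = v2[0] - principal_point[0], v2[1] - principal_point[1]
    d = -(x1 * x2 + y1 * y2)  # orthogonality gives v1 . v2 = -f^2
    if d <= 0:
        raise ValueError("vanishing points not consistent with orthogonal directions")
    return math.sqrt(d)
```

In practice the two vanishing points would come from intersecting the two families of parallel building edges the user traces – in BLAM’s case, with the Grease Pencil.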


My workflow:

  a. look at the picture, find where I want the origin to be, and move the camera until that point sits at the scene origin [0, 0, 0]
  b. add a mesh and collapse all vertices (so we end up with a single vertex at the origin)
  c. extrude this vertex along the Z axis
  d. change the Pivot Point to 3D Cursor
  e. select the camera and rotate it until a vertical line matches the reference line I created in (c).
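The last step – rotating the camera until a vertical edge lines up – is eyeballed above, but the correction it converges on is simple to state: the angle of the projected edge measured from screen-space vertical. A tiny sketch (my own illustration, not part of BLAM or the workflow above):

```python
import math

def roll_correction(p0, p1):
    """Angle in degrees to roll the view so that the 2D segment p0 -> p1
    becomes vertical in screen space. Hypothetical helper for illustration."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    # angle of the segment measured from the vertical (+Y) axis
    return math.degrees(math.atan2(dx, dy))
```

A segment leaning 45° to the right, for instance, needs a 45° roll to stand upright.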
Some comments:
  1. It would be nice to have a Transform Orientation that follows the ‘view distortion’.
  2. Actually, all axes could be visible in real time in the 3D View (drawn with bgl – Blender’s OpenGL wrapper), perhaps even clickable, completely removing the need for the UV view step.
  3. Also important would be a way to quickly apply the camera image to the selected object (or selected face).
  4. I sketched in some geometry to match the building, but my tests ended here. Why? Because my picture clearly has far too much distortion for this workflow.
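For context on why distortion breaks the match: vanishing-point calibration assumes a pinhole camera, where straight building edges project to straight lines, while a real wide-angle lens bends them radially. A minimal sketch of the usual Brown radial model (my own illustration, not BLAM code; k1 and k2 are per-lens coefficients):

```python
def distort(point, k1, k2=0.0):
    """Apply simple Brown radial distortion to a point in normalized
    image coordinates. Illustrative model, not BLAM's implementation."""
    x, y = point
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial scaling grows with distance
    return (x * scale, y * scale)
```

With a negative k1 (barrel distortion, typical of wide lenses) the edges of the frame get pulled inward, so lines traced near the borders no longer converge to consistent vanishing points.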

In this file you will also find the 3D geometry with the UV Project and Subdivide modifiers (before and after applying them) – grab it here (600 KB)

Grease Pencil over the 3D background image


I liked what I saw. It looks like the development is heading in a good direction.

The reconstruction part (which is the most important one) doesn’t seem to work well with highly distorted pictures. I do like how the current solution interacts with Blender tools (e.g. Grease Pencil).

edit: it seems I helped to spot a bug with vertically oriented images. See the comment from the BLAM developer here. I have yet to re-test the new version of the add-on.

Even though I’m biased towards SketchUp, I can see myself getting used to this new workflow. It will be nice to see whether we can get a solution that works well with multiple cameras and quick integration with camera-projection functionality.


Dalai Felinto

4 Thoughts on “BLAM – Photomatching in Blender”

  1. It’s nice having this available in Blender! Even with the camera tracking already in place.

  2. wewe on May 11, 2012 at 5:08 am said:

    I use this version:

    Help! Error! This plug-in throws an AttributeError: ‘Mesh’ object has no attribute ‘faces’.
    The Google solution is

    Requesting an update! Please!

  3. admin on May 18, 2012 at 11:05 pm said:

    Hi Wewe,
    I’m not a maintainer of the BLAM project. Have you tried contacting the developer through the BlenderArtists forum thread?

  4. Oh! Yes, I also left a message in the project’s thread. Thank you for your reply!
