
Motion Tracking


Motion tracking is a relatively new feature in Blender. It is still under development; as of August 2013 it supports basic 2D motion tracking, 3D motion tracking, and camera solving. It is already usable in production, as demonstrated by the open movie "Tears of Steel".

Getting started

Motion tracking has been included since the Blender 2.61 release. It is enabled by default on all platforms and works out of the box.

Here are brief descriptions of the motion tracking tools currently available in Blender:

Supervised 2D tracking

There is no single algorithm that works for all kinds of footage, feature points, and motions. Such an algorithm could be constructed, but it would be very slow and could still fail, so the practical way to perform 2D tracking is to choose the tracking algorithm and its settings manually. The current defaults work well for general footage which is not too blurry and where feature points are not strongly deformed by perspective.

Improving 2D tracking is on the TODO list, but it is not a high priority at the moment. If you are unsure about the algorithms and settings and do not want to read this document in detail, just experiment with the settings until you find a combination that works for your footage.
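To illustrate what a 2D tracker does under the hood, here is a minimal, hypothetical sketch of the template-matching idea: a small patch (the pattern) around a marker is searched for in the next frame, and the best-matching offset becomes the new marker position. Blender's tracker (libmv) is far more sophisticated, but the core idea is the same. Function names and the plain-list image representation are illustrative, not Blender's API.

```python
def ssd(patch_a, patch_b):
    """Sum of squared differences between two equally sized patches."""
    return sum(
        (a - b) ** 2
        for row_a, row_b in zip(patch_a, patch_b)
        for a, b in zip(row_a, row_b)
    )

def extract(image, x, y, size):
    """Cut a size x size patch with top-left corner at (x, y)."""
    return [row[x:x + size] for row in image[y:y + size]]

def track_point(prev_frame, next_frame, x, y, size=3, search=2):
    """Return the position in next_frame whose patch best matches
    the pattern at (x, y) in prev_frame.

    The caller must keep the search window inside the frame."""
    pattern = extract(prev_frame, x, y, size)
    best = (float("inf"), x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            score = ssd(pattern, extract(next_frame, nx, ny, size))
            best = min(best, (score, nx, ny))
    return best[1], best[2]
```

This brute-force search is what makes a "supervised" workflow necessary: the pattern size, search area, and matching model all depend on the footage, which is why Blender exposes them as per-track settings.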

Manual lens calibration using grease pencil and/or grid

All cameras record distorted video; this is unavoidable given how optical lenses work. To solve camera motion accurately, the exact focal length and the "strength" of the distortion are needed.

Currently, the focal length can only be obtained from the camera's settings or from EXIF information -- there are no tools inside Blender that can estimate it. There are, however, tools which help find approximate values to compensate for distortion. One is a fully manual grid tool: a regular grid is deformed by the distortion model, and the deformed cells show where straight lines would appear in the footage. You can also use the grease pencil for this: draw a line over something that should be straight in the footage using the poly line brush, then adjust the distortion values until the grease pencil matches the lines in the footage.

To calibrate your camera more accurately, use the grid calibration tool from OpenCV. OpenCV uses the same distortion model, so the resulting coefficients should carry over without problems.
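The distortion being calibrated here is radial: a straight line in the world maps to a curve in the footage, and calibration means finding the polynomial coefficients that straighten it again. A minimal sketch of a polynomial radial distortion model of the family Blender/libmv use (coefficients K1, K2, K3), with the inverse computed by fixed-point iteration since no closed form exists; this is an illustration of the model, not Blender's implementation:

```python
def distort(x, y, k1, k2=0.0, k3=0.0):
    """Apply radial distortion to normalized camera coordinates (x, y).

    (0, 0) is the optical centre; r is the distance from it."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    return x * scale, y * scale

def undistort(xd, yd, k1, k2=0.0, k3=0.0, iterations=20):
    """Invert distort() by fixed-point iteration."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
        x, y = xd / scale, yd / scale
    return x, y
```

Note that points farther from the optical centre are displaced more, which is why the grid tool's cells bend most near the frame edges.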

Camera motion solving

Although there is no difference between solving camera motion and object motion from a mathematical point of view, only camera solving is currently supported. It also still has some limitations, such as unsupported solving of tripod motion and of dominant-plane motion (footage where all trackable features lie in a single plane). Resolving these limitations is planned for the future.

Basic tools for scene orientation and stabilization

After solving, the reconstructed scene needs to be oriented within the 3D scene for convenient compositing. There are tools to define the floor, the scene origin, and the X/Y axes to perform scene orientation.
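Conceptually, these orientation tools build a new coordinate basis from solved marker positions. A hypothetical sketch, assuming one marker is chosen as the origin, one lies on the +X axis, and one lies elsewhere on the floor; the function names and the right-handed Z-up convention are illustrative, not Blender's internals:

```python
def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = dot(v, v) ** 0.5
    return tuple(x / n for x in v)

def floor_basis(origin, x_point, floor_point):
    """Unit X, Y, Z axes of a scene whose floor is the XY plane."""
    x_axis = normalize(sub(x_point, origin))
    z_axis = normalize(cross(x_axis, sub(floor_point, origin)))
    y_axis = cross(z_axis, x_axis)  # completes the right-handed basis
    return x_axis, y_axis, z_axis
```

Transforming the whole reconstruction by this basis (and the chosen origin) is what makes the solved camera line up with the floor of the 3D scene.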

If the camera footage needs stabilizing to make the final result look nicer, 2D stabilization can help. It stabilizes the video from the camera, compensating for camera jumps and tilt.
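The idea behind 2D stabilization can be sketched briefly: a track's per-frame positions describe the camera's jitter, and shifting each frame by the opposite of the track's displacement from its first position cancels that jitter. Blender can also average several tracks and compensate rotation; this hypothetical sketch shows translation from a single track only:

```python
def stabilization_offsets(track):
    """track: list of (x, y) marker positions, one per frame.

    Returns the per-frame (dx, dy) each frame should be shifted by
    so the tracked point stays where it was on the first frame."""
    ref_x, ref_y = track[0]
    return [(ref_x - x, ref_y - y) for x, y in track]
```

Applying each offset to its frame pins the tracked feature to a fixed screen position, which is exactly the "locked-down" look stabilization produces.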

Basic nodes for compositing scene into real footage

Several new nodes were added to the Compositor to make compositing a rendered scene into footage easier: easy-to-use nodes for 2D stabilization, distortion, and undistortion.

Not implemented tools

Some tools are not available in Blender yet but are on the TODO list, so there is currently no support for things such as rolling-shutter filtering, object motion solving, or motion capture. You can, however, try to approximate them using the tools that are already implemented.

Tools and properties for Motion Tracking

See Motion Tracking Tools and Properties for more details about the tools and properties used in the motion tracking workflow.
