Making of Clear Skies 3: Animation Perspective Part 2.

This post is going to quickly go over the two methods I used to actually get the animations into Source.

Method 1: BVH

Ipisoft exports to the BVH format, which is one of the standard motion capture formats. It stands for Biovision Hierarchy, and it was developed by Biovision, who went bust a while ago. This is the best format in my opinion, because it is very easy to read and parse.

A BVH file has two parts:

  1. The hierarchy itself, which describes the skeleton and its initial pose.
  2. The transformations that the skeleton goes through to animate it.

The hierarchy starts with the ROOT node, which for humans is usually the centre of gravity. The hierarchy itself is recursive: each node contains data relevant to it (like position, rotation), and further subnodes.
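For illustration, the start of a BVH file looks roughly like this (joint names and offsets here are invented, but the keywords are standard):

```
HIERARCHY
ROOT Hips
{
    OFFSET 0.00 0.00 0.00
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Chest
    {
        OFFSET 0.00 5.21 0.00
        CHANNELS 3 Zrotation Xrotation Yrotation
        ...
    }
}
MOTION
Frames: 120
Frame Time: 0.033333
0.0 36.5 0.0 -3.1 7.9 1.2 ...
```

Each line in the MOTION section is one frame, listing a value for every channel declared in the hierarchy, in order.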

The position of a node is defined by its offset from its parent, not by an absolute position in the world. If it were absolute, a node wouldn’t move when its parent moved – you’d move your arm, but your hand would remain on the desk.

The rotation of a node is likewise defined relative to its parent.
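The parent-relative scheme can be sketched in a few lines of Python (joint names and offsets are invented for the example, and rotations are omitted – real BVH also applies each joint’s rotation to its children):

```python
# Sketch: a joint's world position is its parent's world position
# plus its own local offset (rotations omitted for brevity).

def world_position(joint, skeleton):
    """Walk up the parent chain, summing local offsets."""
    x, y, z = skeleton[joint]["offset"]
    parent = skeleton[joint]["parent"]
    if parent is not None:
        px, py, pz = world_position(parent, skeleton)
        x, y, z = x + px, y + py, z + pz
    return (x, y, z)

# A tiny three-joint "arm" (made-up numbers):
skeleton = {
    "shoulder": {"offset": (0.0, 10.0, 0.0), "parent": None},
    "elbow":    {"offset": (5.0, 0.0, 0.0),  "parent": "shoulder"},
    "hand":     {"offset": (4.0, 0.0, 0.0),  "parent": "elbow"},
}

print(world_position("hand", skeleton))  # (9.0, 10.0, 0.0)

# Move the shoulder, and the hand follows automatically:
skeleton["shoulder"]["offset"] = (0.0, 12.0, 0.0)
print(world_position("hand", skeleton))  # (9.0, 12.0, 0.0)
```

This is exactly why your hand doesn’t stay on the desk: its stored data never changes, only its parent’s does.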

There are plenty of existing tools for reading and manipulating this data, so I didn’t really need to write another one. It was a good job too – I was struggling with the XSI Mod Tool.

Softimage XSI is the suite that Valve originally used for modelling & animating Half-Life 2. In 2004, when Half-Life 2 was released, there was a free, cut-down version of the tool for modders, called the XSI Mod Tool.

Valve also created a whole bunch of addons for it, to export stuff to the Source Engine.

Since Softimage was bought out by Autodesk, the free versions have become less and less useful: the tools are out of date and features get locked off. You can find the old versions around, but it’s difficult, and they are not well supported.

I’ll do a quick tutorial on it here anyway.

Autodesk Softimage Mod tool 7.5

You’ll need to find the ValveSource addons for Softimage, which are either on the net, or in the Source SDK folder.

Open the XSI scene file “SDK_Male_reference.scn”, which can be found in “C:\Program Files (x86)\Steam\steamapps\USERNAME\sourcesdk_content\hl2\modelsrc\humans_sdk\XSI_scn_files” or similar.

You should get this chap:


He’s very low in IQ and poly count, but he’s lovable just the same, because he’s ideal for testing stuff out.

Press “8” on the keyboard to bring up the Scene Explorer, and middle-click on the “Valve Biped” node.

Middle-clicking recursively selects everything inside the node.

Go to Animate->MOTOR->Tag Rig

This will bring up the Tag Rig dialogue. It defaults to a human biped and has entries for all the major bones.

You need to go through each dialogue, click the bone name button (e.g. “cog”), and select the corresponding bone on the human skeleton.

Here is an example:

I’ve chosen centre of gravity (cog) as Bip01_Pelvis. Most of them are easy enough to work out, but get one wrong and the animation will be… funky.

Once you’ve finished tagging the rig, SAVE it – it will save you a vast amount of time.

You then need your BVH.

There is a large library of free ones here.

Tagging the BVH rig

Go to Animate->MOTOR->Mocap to Rig

You’ll get this dialogue:

You need to

  1. Select BVH as the format.
  2. Select a BVH file.
  3. Create a tag template. This will launch the same window as previously; this time, you want to tag the BVH you’ve just loaded. Once done, save this one too.
  4. Select the target model. This should already be the ValveBiped; if not, select it from the Scene Explorer.
  5. Apply.

I’ve uploaded my Valve Biped tag file and my BVH tag file here, as well as a mocap file I took from the site mentioned earlier. Link:

Now, by the power of Grayskull, you’ll animate the Valve Biped:

This is rendered from Softimage, because while making this post I couldn’t be bothered to export the skeletal animation into the Source Engine.

I’ll cover how to actually get this into game next post. This was meant to be quick, but oh well. I still need to cover the other method I used 🙂

Making of Clear Skies 3: Animation Perspective.

Clear Skies is a feature-length machinima series from Ian Chisholm that has won a load of awards. You can find them all here:

It’s filmed in the Source Engine, using the unbelievably powerful but oft-incomprehensible SDK, and in Eve Online. All the indoor bits involving people are done in Source, and all the space scenes are filmed in Eve.

When the second part had finished filming, I was introduced to Ian. He wasn’t sure about making a third, but he did mention the frustrations he was having with the limited animation set of the Source Engine.

I told him that I thought it was possible to create our own animations, even to use Motion Capture to do the hard work for us.

It even turned out to be possible to use completely markerless MoCap.

Shoot3d (now known as Ipisoft) has a markerless system with a unique advantage: it can export directly to SMD, Source’s uncompiled model / skeletal animation format, making it much easier to import new animations into the game.
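For reference, SMD is a plain-text format. A skeletal animation file looks roughly like this (the bone names follow Valve’s biped naming convention, but the numbers here are invented):

```
version 1
nodes
0 "ValveBiped.Bip01_Pelvis" -1
1 "ValveBiped.Bip01_Spine" 0
end
skeleton
time 0
0  0.000000 0.000000 38.000000  0.000000 0.000000 1.570796
1  0.000000 2.500000 0.000000   0.000000 0.000000 0.000000
end
```

The nodes section lists each bone with its id and parent id (-1 for the root), and the skeleton section gives each bone’s position and rotation per frame – structurally very close to what BVH stores, which is why the direct export is so convenient.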

Before I tried my hand at doing my own Mocap, I took some BVH files and tried mapping them to a Source Engine Human Skeleton myself, and testing the results out in game.

This turned out to be a daunting task. I knew very little about skeletal animation, and getting the mappings right took some time. I had a small amount of help from, who gave me some pointers in getting the BVH working correctly.

My first success can be found here:

Once I’d got this process working, I tried out Ipisoft’s Mocap software.

For this I used:

  • An old mini-DV camera, with bad interlacing
  • A cluttered barn
  • Myself

Which was unfortunate, because they recommend using none of these things.

They recommend:

  • Multiple HD cameras
  • An uncluttered environment with little to nothing in the background
  • Somebody who is not me.

Undaunted, I tried anyway.

The chap visible in the video is Pope, who helped me out on the early tests. He was responsible for herding cats on Clear Skies 2.

We also made him do the robot:

Ipisoft did a really good job of interpreting the image, and its results can be seen here:

Actually adding this stuff to a model in HL2 is neither easy nor straightforward: in some cases you have to decompile the model, which breaks all kinds of stuff – most importantly the facial flexes.

Not ideal when you’re relying on the power of the Source Engine to bring the models to life.

Still, the earliest result I could find is here:

Not bad, for a beginning. More to follow, on the rest of the process.