Making of Clear Skies 3: Animation Perspective.

Clear Skies is a feature-length machinima series from Ian Chisolm that has won a load of awards. You can find them all here:

It’s filmed in the Source Engine (using the unbelievably powerful but oft-incomprehensible SDK) and in Eve Online. All the indoor scenes involving people are done in Source; all the space scenes are filmed in Eve.

When the second part had finished filming, I was introduced to Ian. He wasn’t sure about making a third, but he did mention the frustrations he was having with the limited animation set of the Source Engine.

I told him that I thought it was possible to create our own animations, and even to use motion capture to do the hard work for us.

It even turned out to be possible to use completely markerless MoCap.

Shoot3d (now known as Ipisoft) has a markerless system with a unique advantage: it can export directly to SMD, Source’s uncompiled model and skeletal animation format, which makes it much easier to import new animations into the game.
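For context, SMD is a plain-text format: a skeletal animation file is just a bone list followed by per-frame bone transforms. A minimal sketch of the layout (the bone names and numbers here are illustrative, not taken from the actual Clear Skies files):

```
version 1
nodes
0 "ValveBiped.Bip01_Pelvis" -1
1 "ValveBiped.Bip01_Spine" 0
end
skeleton
time 0
0  0.0 0.0 38.0  0.0 0.0 0.0
1  0.0 2.5 0.0   0.0 0.0 0.0
end
```

Each node line is `<id> "<bone name>" <parent id>`, and each skeleton line is `<bone id> posX posY posZ rotX rotY rotZ` for that frame, which is why a tool that writes SMD directly saves so much conversion pain.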

Before I tried my hand at my own MoCap, I took some BVH files, tried mapping them to a Source Engine human skeleton myself, and tested the results out in game.

This turned out to be a daunting task. I knew very little about skeletal animation, and getting the mappings right took some time. I had a small amount of help from someone who gave me some pointers on getting the BVH working correctly.
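The core of that mapping work is just a renaming table between the two skeletons. As a hedged sketch (the BVH joint names and the exact ValveBiped bone names below are assumptions based on common conventions, not the project's actual files — check the decompiled model for the real skeleton):

```python
# Sketch: renaming BVH joint names to Source Engine (ValveBiped) bone names.
# Both sides of this table are assumptions -- actual names vary by BVH source
# and by model.

BVH_TO_SOURCE = {
    "Hips":       "ValveBiped.Bip01_Pelvis",
    "Spine":      "ValveBiped.Bip01_Spine",
    "Neck":       "ValveBiped.Bip01_Neck1",
    "Head":       "ValveBiped.Bip01_Head1",
    "LeftArm":    "ValveBiped.Bip01_L_UpperArm",
    "RightArm":   "ValveBiped.Bip01_R_UpperArm",
    "LeftUpLeg":  "ValveBiped.Bip01_L_Thigh",
    "RightUpLeg": "ValveBiped.Bip01_R_Thigh",
}

def remap_bvh_line(line: str) -> str:
    """Rewrite a 'ROOT <name>' or 'JOINT <name>' line from a BVH
    hierarchy to use the Source bone name, leaving other lines alone."""
    parts = line.split()
    if len(parts) == 2 and parts[0] in ("ROOT", "JOINT"):
        parts[1] = BVH_TO_SOURCE.get(parts[1], parts[1])
        return " ".join(parts)
    return line

print(remap_bvh_line("JOINT LeftArm"))
```

In practice the renaming is the easy half; the BVH rest pose and bone orientations also have to be reconciled with the Source skeleton's, which is where most of the time went.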

My first success can be found here:

Once I’d got this process working, I tried out Ipisoft’s MoCap software.

For this I used:

  • An old mini-DV camera, with bad interlacing
  • A cluttered barn
  • Myself

Which was unfortunate, because they recommend using none of these things.

They recommend:

  • Multiple HD cameras
  • An uncluttered environment with little to nothing in the background
  • Somebody who is not me.

Undaunted, I tried anyway.

The chap visible in the video is Pope, who helped me out on the early tests. He was responsible for herding cats on Clear Skies 2.

We also made him do the robot:

Ipisoft did a really good job of interpreting the footage, and its results can be seen here:

Actually adding this stuff to a model in HL2 is neither easy nor straightforward: in some cases you have to decompile the model, which breaks all kinds of things, most importantly the facial flexes.

Not ideal when you’re relying on the power of the Source Engine to bring the models to life.

Still, the earliest result I could find is here:

Not bad, for a beginning. More to follow, on the rest of the process.




Leaving Tenerife

Finished my diving trip now, and I’m flying back to sunny Manchester tomorrow afternoon.

I’ve managed to do 7 dives this week, which is three fewer than I should have done. I had to can a dive this morning because of an equipment malfunction, and I missed a day due to blocked sinuses.

The dives I have done, however, have been very fun. I’ll post some pictures and a more complete log of what I’ve been doing soon.

-Tenerife Scuba, fueled by caffeine.


Made it to Tenerife, and learned a few things along the way:

  1. Don’t forget to check in online when flying Ryanair. They WILL charge you for your foolishness.
  2. Get travel insurance BEFORE leaving the country. Most travel insurers will not offer diving insurance on its own; I solved this by signing up with Dive Assure on a seven-day policy. We’ll see how this pans out.
  3. Don’t dive with the remnants of a cold. No matter how much you think you’re better. You’re not, and your sinuses will punish you for it. Idiot.

I might get to go on a night dive this week. Excited and nervous about it. I might get a cheap underwater camera at some point, and stress test it.