Raytracer update

When I started this blog, it was intended to be a way to write up what I’m doing, and to easily share some of the cool stuff with my friends. I don’t like just sending images and builds over Skype file transfer, and I like having a record. Unfortunately, my laziness gets in the way a lot.

I’ve made a massive amount of progress with the ray tracer, but haven’t put up any images here. Here’s one:

Stanford bunny and dragon. Render time: 60 seconds.

This image here is the payoff. I’ve implemented reflections, OBJ loading, KD trees, and diffuse lighting. There’s still a lot to do, though.

The KD tree is what makes all the difference. The dragon is a very, very expensive model, containing 871,414 polygons.

Without any kind of acceleration structure, it takes well over 24 hours to render. (I tried it as a test, to see how long, but gave up after a day.)
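To see why that brute-force render never finished, here’s a rough count of intersection tests. Only the polygon count comes from the post; the render resolution is my assumption for illustration:

```java
// Back-of-the-envelope cost of rendering the dragon with no acceleration
// structure: every primary ray is tested against every triangle.
class BruteForceCost {
    // Number of ray-triangle intersection tests for one frame,
    // ignoring shadow and reflection rays entirely.
    static long tests(long width, long height, long triangles) {
        long primaryRays = width * height;
        return primaryRays * triangles;
    }

    public static void main(String[] args) {
        // 871,414 polygons is the dragon's count from the post;
        // the 800x600 resolution is an assumption for illustration.
        System.out.println(tests(800, 600, 871_414L));
    }
}
```

Even at a modest resolution that’s over 4 × 10¹¹ ray-triangle tests for primary rays alone, while a KD tree replaces most of them with roughly log₂(n) node visits per ray – hence the difference between a day and a minute.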

Unfortunately, I won’t be able to get much more speed out of Java.

I ported the Raytracer to Java in the first place because I needed to do a complete rebuild, and I’m better at rapid prototyping in Java than I am in C++. It would have been a better learning exercise to do it in C++, but I’m frequently impatient, and I wanted to see cool stuff without having to fumble too much with the language.

Now, I’m probably going to port it back at some point, and eventually use OpenCL to get my GPU to do more of the hard work, but in the meantime there are some more things I’d like to try out.

Partial list:

  • Path tracing.
  • Distributed ray tracing.
  • Fresnel equations (yeah, yeah, this should already have been done.)
  • Programmable camera, so I can do flyby videos.

I’ve already started work on the Path Tracing, with mixed results.

Perlin Noise

Shamus Young was messing around with Perlin Noise, which turned out to be normal noise. I dug around in an old VM to find this image:

Blue and white marble effect. Sort of.

The massive sphere in the background is procedurally textured with the Perlin noise function described in this paper: http://mrl.nyu.edu/~perlin/paper445.pdf

It’s not the best example I ever produced, but it’s the only build of my old raytracer that I could find. Shame on me for poor version control.

The source code for it, in Java, is on Ken Perlin’s website here: http://mrl.nyu.edu/~perlin/noise/

To be honest, it’s been a very long time since I looked at that stage of my ray tracer. I ditched procedural texture generation because it took way too long, but the basic idea is that you call the “noise” function with X, Y, Z coordinates as arguments, and it returns a noise value derived from the initial permutation list. Or something. These days I comment my own code a lot better, precisely because I can’t remember how this stuff works anymore.

When you’ve got the returned noise amount, you can do cool shit with it. I found a couple of functions around that could turn it into a marble-style texture, but you can do a lot more with it.
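For a feel of how that fits together, here’s a compact Java sketch in the spirit of Perlin’s reference implementation. Note two of my own simplifications: the fixed permutation table from his code is swapped for a seeded shuffle, and the `marble()` constants are just one illustrative choice:

```java
import java.util.Random;

// A minimal sketch of Perlin-style gradient noise plus a marble mapping.
// Perlin's reference implementation uses a fixed 256-entry permutation
// table; this version seeds a shuffled one instead (a simplification).
final class PerlinSketch {
    static final int[] p = new int[512];
    static {
        int[] perm = new int[256];
        for (int i = 0; i < 256; i++) perm[i] = i;
        Random rng = new Random(42); // fixed seed: deterministic noise
        for (int i = 255; i > 0; i--) {
            int j = rng.nextInt(i + 1);
            int t = perm[i]; perm[i] = perm[j]; perm[j] = t;
        }
        for (int i = 0; i < 256; i++) p[i] = p[i + 256] = perm[i];
    }

    // Returns a value in roughly [-1, 1]; exactly zero at integer lattice points.
    static double noise(double x, double y, double z) {
        int X = (int) Math.floor(x) & 255,
            Y = (int) Math.floor(y) & 255,
            Z = (int) Math.floor(z) & 255;
        x -= Math.floor(x); y -= Math.floor(y); z -= Math.floor(z);
        double u = fade(x), v = fade(y), w = fade(z);
        // Hash the 8 corners of the surrounding unit cube via the table.
        int A = p[X] + Y, AA = p[A] + Z, AB = p[A + 1] + Z,
            B = p[X + 1] + Y, BA = p[B] + Z, BB = p[B + 1] + Z;
        // Trilinearly interpolate the 8 corner gradient contributions.
        return lerp(w,
            lerp(v, lerp(u, grad(p[AA], x, y, z), grad(p[BA], x - 1, y, z)),
                    lerp(u, grad(p[AB], x, y - 1, z), grad(p[BB], x - 1, y - 1, z))),
            lerp(v, lerp(u, grad(p[AA + 1], x, y, z - 1), grad(p[BA + 1], x - 1, y, z - 1)),
                    lerp(u, grad(p[AB + 1], x, y - 1, z - 1), grad(p[BB + 1], x - 1, y - 1, z - 1))));
    }

    static double fade(double t) { return t * t * t * (t * (t * 6 - 15) + 10); }
    static double lerp(double t, double a, double b) { return a + t * (b - a); }
    static double grad(int hash, double x, double y, double z) {
        int h = hash & 15; // pick one of 16 gradient directions
        double u = h < 8 ? x : y,
               v = h < 4 ? y : (h == 12 || h == 14 ? x : z);
        return (((h & 1) == 0) ? u : -u) + (((h & 2) == 0) ? v : -v);
    }

    // One of those "marble style" mappings: a sine warped by noise.
    static double marble(double x, double y, double z) {
        return Math.sin(x * 5.0 + 4.0 * noise(x, y, z));
    }
}
```

The marble effect comes from feeding the noise into a sine: the noise term perturbs the sine’s phase, bending what would otherwise be perfectly straight bands into veins.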

On Digital Distribution

I hated Steam. Note the tense – I used to hate Steam, and I used to be a pirate.

I’ll explain.

Back when Half-Life 2 came out, I lived with my parents. I was still at school, and we lived on a farm in the middle of nowhere, in rural Wales.

It was a 40-minute bus journey to school in the mornings. (It could be worse – the fact that I got a bus at all was amazing.)

We lived in an ADSL black spot. Initially, there was a distance cap on broadband – if you lived too far from the telephone exchange, you couldn’t get it. I think the max was two miles. We lived around seven miles from the exchange.

The next problem was the exchange itself – it was the only one of its kind in the country. Back in the eighties, or thereabouts, BT bought a single exchange from some American telephony company, to try out.

For whatever reason, they put it in rural Wales, where traffic was low, and many houses didn’t even bother with a phone – they’d trek a mile or two to a phone box.

This thing, this obsolete and foreign monstrosity, was not compatible with DSL. The local loop could not be un-bundled, for reasons that I was never clear on.

Finally, we had a DAC on our line. A DAC was (is? I seriously doubt there are any still around) a money-saving device for BT. If BT couldn’t be arsed to run a new line to a new house, or to an old house that was only just being connected up, they’d put a digital-to-analogue converter on the line, effectively turning it into two lines.

This meant two things.

  1. We could only get half-speed dial-up internet. 26.4 kb/s. That was all we got. Browsing the internet became an exercise in patience.
  2. It made ADSL impossible. BT would have to install a new line.

Now that I’ve set the stage, I’ll explain my early experiences with Steam.

Steam, and biscuits.

I’d been waiting for Half-Life 2 since it was featured in PC Gamer, around two years previously.

I’d built a PC from scratch, solely for this game. (I had 1 GB of RAM. WOW!)

I’d preordered the special edition, which came in a biscuit tin and had a t-shirt with it.

Of course, I already knew about Steam, and I knew I’d never be able to download the game from there. I’d assumed that buying the boxed copy meant that I could just activate it on Steam, then I’d be on my merry way.

Yeah. Right.

I didn’t quite get the game on release. I think I got it a day later, or something.

There had already been a minor patch, a measly 4 meg or something. Damned if I can remember. The point was that after Steam had finished unlocking the game, it wanted to download this patch.

I’m sure you remember the earliest days of Steam: server outages, weird connection bugs, and the awesome “This game cannot be started”.

All of these problems were multiplied by my awful internet. Eventually, what I did was pack up my PC and persuade my Dad to cart me and the computer down to the house of a friend who had ADSL.

The cycle went like this:

  1. Go to friend’s house.
  2. Download HL2 updates.
  3. Return home.
  4. Start Steam, without dialing the internet.
  5. Get “Steam could not be started in offline mode” error.
  6. Go online, just to let it connect.
  7. Find out there has been an update between leaving my friend’s and getting home.
  8. Goto 1.

Of course, it wasn’t this bad. It just seems that way. Steam’s offline mode was and is a bit naff – even if you’d tried to force offline mode, it still might refuse.

Eventually, I gave in. I started to browse the depths of some unsavoury forums, trying to find a no-CD-type fix that would cut Steam out entirely.

When I found one, it was glorious. I could enjoy my game in peace, with none of this idiotic always-on DRM to get in my way. I thought I’d never use Steam again – after all, it only had one game I cared about, and I’d cracked that.


Sore Feet

I’ve just done the Bogle, a 55-mile walk around Greater Manchester. I completed it in 24 hours and 30 minutes, and now I ache. Everywhere. It’d be easier to list the places that DON’T hurt, so I’ll do that:

  • My eyebrows.

At times my pace slowed to around 1 mph. That was awful. Other times, I went on a massive adrenaline and sugar high. That was excellent.

Here’s the route:

The route consisted of two loops, both starting from the North Campus Students’ Union. The south loop started on Friday 2nd March at 7 pm, and we were back there by around 7 am the next day. Then came the northern loop, where most people drop out.

That was one of the most physically and mentally strenuous things I’ve ever done. It would have been so easy to stop at any point and ring the support team, who would have whisked me away to a place with no pain…however, I appear to be a masochist.

I’ve tried this twice before, and failed it both times, so now that I’ve done it I don’t really feel inclined to do it again.

Oh, and this is me just over half way ’round:

Yes, three cups of tea.

Making of Clear Skies 3: Animation Perspective Part 2.

This post is going to quickly go over the two methods I used to actually get the animations into Source.

Method 1: BVH

Ipisoft exports to the BVH format, which is one of the standard motion capture formats. It stands for Biovision Hierarchy, and it was developed by Biovision, who went bust a while ago. This is the best format in my opinion, because it is very easy to read and parse.

A BVH file has two parts:

  1. The hierarchy itself, which describes the skeleton and initial pose.
  2. The transformations that the skeleton goes through to animate it.

The hierarchy starts with the ROOT node, which for humans is usually the centre of gravity. The hierarchy itself is recursive: each node contains data relevant to it (like position, rotation), and further subnodes.

The position of a node is defined by its offset from its parent, not an absolute position in the world. If it were absolute, it wouldn’t move when its parent moved. You’d move your arm, but your hand would remain on the desk.

The rotation of a node is likewise defined relative to its parent.
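That parent-relative layout is easy to sketch in code. This Joint class is my own illustration of the idea, not the real BVH channel structure (real nodes also carry rotation channels and the MOTION data):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a BVH-style joint: each node stores its offset
// relative to its parent, so its world position is the sum of offsets
// up the chain to ROOT.
class Joint {
    final String name;
    final double[] offset; // x, y, z relative to the parent joint
    final Joint parent;    // null for the ROOT node
    final List<Joint> children = new ArrayList<>();

    Joint(String name, Joint parent, double ox, double oy, double oz) {
        this.name = name;
        this.parent = parent;
        this.offset = new double[]{ox, oy, oz};
        if (parent != null) parent.children.add(this);
    }

    // Walk up to ROOT, accumulating offsets: move the parent, and every
    // descendant moves with it – which is exactly why offsets are relative.
    double[] worldPosition() {
        double[] pos = {0, 0, 0};
        for (Joint j = this; j != null; j = j.parent)
            for (int i = 0; i < 3; i++) pos[i] += j.offset[i];
        return pos;
    }
}
```

If the hand stored an absolute position instead, shifting the hips would leave it floating behind – the desk problem from above.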

There’s a massive number of tools for reading and manipulating this data, so I didn’t really need to write another one. It was a good job too – I was struggling with the XSI Mod Tool.

XSI Softimage is the suite that Valve originally used for modelling & animating Half-Life 2. In 2004, when Half-Life 2 was released, XSI had a free, cut-down version of the tool for modders. It was called the XSI Mod Tool.

Valve also created a whole bunch of addons for it, to export stuff to the Source Engine.

Since Softimage was bought out by Autodesk, the free versions have become less and less useful: the tools are out of date and features get locked off. You can still find the old versions around, but it’s difficult and they are not supported much.

I’ll do a quick tutorial on it here anyway.

Autodesk Softimage Mod tool 7.5

You’ll need to find the ValveSource addons for Softimage, which are either on the net, or in the Source SDK folder.

Open the XSI scene file “SDK_Male_reference.scn”, which can be found in “C:\Program Files (x86)\Steam\steamapps\USERNAME\sourcesdk_content\hl2\modelsrc\humans_sdk\XSI_scn_files” or similar.

You should get this chap:


He’s very low in IQ and poly count, but he’s lovable just the same, because he’s ideal for testing stuff out.

Press “8” on the keyboard to bring up the scene explorer, and middle-click on the “Valve Biped” node.

Middle-clicking recursively selects everything inside the node.

Go to Animate->MOTOR->Tag Rig

This will bring up the tag rig dialogue. It defaults to a human biped, and has entries for all the major bones.

You need to go through each dialogue, click the bone name button, e.g. “cog”, and select the corresponding bone on the human skeleton.

Here is an example:

I’ve chosen Bip01_Pelvis as the centre of gravity (cog). Most of them are easy enough to work out, but get one wrong and the animation will be… funky.

Once you’ve finished tagging the rig, SAVE it – it will save you a vast amount of time.

You then need your BVH.

There is a large library of free ones here.

Tagging the BVH rig

Go to Animate->MOTOR->Mocap to Rig

You’ll get this dialogue:

You need to

  1. Select BVH as the format.
  2. Select a BVH file.
  3. Create a tag template. This will launch the same window as previously. This time, you want to tag the BVH you’ve just loaded. Once done, save this one too.
  4. Select the target model. This should already be the ValveBiped; if not, select it from the Scene Explorer.
  5. Apply.

I’ve uploaded my Valve Biped tag file and my BVH tag file here, as well as a mocap file I took from the site mentioned earlier. Link: http://daft-ideas.co.uk/rnd/Clear%20Skies/mocapeg/

Now, by the power of greyskull, you’ll animate the Valve Biped:

This is rendered from Softimage, because while making this post I couldn’t be bothered to export the skeletal animation into the Source Engine.

I’ll cover how to actually get this into the game in the next post. This was meant to be quick, but oh well. I still need to cover the other method I used 🙂

Making of Clear Skies 3: Animation Perspective.

Clear Skies is a feature-length machinima series from Ian Chisolm that won a load of awards. You can find them all here: http://www.clearskiesthemovie.com/

It’s filmed in the Source Engine, using the unbelievably powerful but oft-incomprehensible SDK, and Eve Online. All the indoor bits, involving people, are done with Source, and all the space scenes are filmed in Eve.

When the second part was done filming, I was introduced to Ian, who wasn’t sure about making a third, but he did mention the frustrations he was having with the limited animation set of the Source Engine.

I told him that I thought it was possible to create our own animations, even to use Motion Capture to do the hard work for us.

It even turned out to be possible to use completely markerless MoCap.

Shoot3d (now known as Ipisoft) has a markerless system with a unique advantage: it can export directly to SMD, Source’s uncompiled model / skeletal-animation format, making it much easier to import new animations into the game.

Before I tried my hand at doing my own Mocap, I took some BVH files and tried mapping them to a Source Engine Human Skeleton myself, and testing the results out in game.

This turned out to be a daunting task. I knew very little about skeletal animation, and getting the mappings right took some time. I had a small amount of help from http://www.youtube.com/user/mm3guy, who gave me some pointers in getting the BVH working correctly.

My first success can be found here:

Once I’d got this process working, I tried out Ipisoft’s Mocap software.

For this I used:

  • An old mini-DV camera, with bad interlacing
  • A cluttered barn
  • Myself

Which was unfortunate, because they recommend using none of these things.

They recommend:

  • Multiple HD cameras
  • An uncluttered environment with little to nothing in the background
  • Somebody who is not me.

Undaunted, I tried anyway.

The chap visible in the video is Pope, who helped me out on the early tests. He was responsible for herding cats on Clear Skies 2.

We also made him do the robot:

IPISoft did a really good job of interpreting the footage, and its results can be seen here:

Actually adding this stuff to a model in HL2 is neither easy nor straightforward: in some cases you have to decompile the model, which breaks all kinds of stuff – most importantly, the facial flexes.

Not ideal when you’re relying on the power of the Source Engine to bring the models to life.

Still, the earliest result I could find is here:

Not bad, for a beginning. More to follow, on the rest of the process.




Leaving Tenerife

Finished my diving trip now, and I’m flying back to sunny Manchester tomorrow afternoon.

I’ve managed to do 7 dives this week, which is three fewer than I should have done. I had to can a dive this morning because of an equipment malfunction, and I missed a day due to blocked sinuses.

The dives I have done, however, have been very fun. I’ll post some pictures and a more complete log of what I’ve been doing soon.

-Tenerife Scuba, fueled by caffeine.


Made it to Tenerife, and learned a few things along the way:

  1. Don’t forget to check in online when flying Ryanair. They WILL charge you for your foolishness.
  2. Get travel insurance BEFORE leaving the country. Most travel insurers will not offer diving insurance on its own. I solved this by signing up with Dive Assure on a seven-day policy. We’ll see how this pans out.
  3. Don’t dive with the remnants of a cold. No matter how much you think you’re better. You’re not, and your sinuses will punish you for it. Idiot.

I might get to go on a night dive this week. Excited and nervous about it. I might get a cheap underwater camera at some point, and stress test it.