Storytelling in Spec Ops: The Line

Warning: This post contains spoilers for Spec Ops: The Line

In most games, the designers have done all they can to disguise the rails – “rails” here being a metaphor for linear storytelling. Linear storytelling is not inherently bad, but it often seems that way when “you”, or more accurately the character you control, are forced into a decision that the player finds idiotic. This breaks immersion.

Good examples of rails can be seen throughout the Half-Life series, where there are few points at which you feel you’ve been forced into making a stupid decision. (Well – maybe Gordon didn’t want to jump blindly into a teleporter and go to the hostile alien world of Xen. But he did anyway…because he was told to.)
Every step of your journey is utterly predetermined, but this often goes unnoticed or seems like emergent behaviour. That makes it all the more jarring when you are forced to jump into a prisoner transport pod that immobilizes you and whose direction you can’t control.

Hop in! It’ll take you to a fun place filled with lightning!

A bad example is Mass Effect 2, where you never get the option to tell Cerberus to go stick their idiocy where it hurts; instead you bumble along following the orders of a guy you have every reason to distrust and hate.

You can’t change anything. Whatever you choose will lead to the next fight scene or set piece. Mass Effect has the worst kind of railroading, because it offers you some choices about who lives, or who you shag, but none about how your character behaves in-story. Image stolen from Three Panel Soul

Spec Ops: The Line works differently. As already mentioned, most games do their best to present you with the illusion of choice; they try to disguise the rails. Spec Ops instead gives you the illusion of having no choice, and disguises your choices. The player thinks they are on a rail, but there are many places where it can be ignored.

The one that stood out for me was the point in the game where Lugo is hanged by angry locals (“angry” doesn’t really do their state of mind justice – the only remaining drinking water in Dubai has been destroyed, and it is all your fault).

Lugo is down, and Walker does his best to revive him. Useless. He’s dead. Adams is surrounded by the mob, who are shouting threats, throwing rocks, getting closer and closer. Adams wants vengeance. There’s no justice; he wants to open fire and gun down the civilians. He’s begging you to make a decision, and I started to worry that he’d just start shooting if I didn’t do something.

At this point I was not thinking in terms of “Shooting civilians might be a fail state”, I wasn’t worrying about the game any more. The only thing going through my head was I WILL NOT DO THIS AGAIN. I fired in the air, hoping to drive them away. It worked, and Adams and Walker could continue.

I didn’t think anything of it until I spoke to a friend who finished the game after me.
He said that he’d had to put the game down at this point; he found it too depressing that the game forced you to gun down yet more civilians.

This works heavily in the game’s favour. By disguising the fact that you ever had a choice at all, the game lets you do what feels natural, without ever breaking immersion.

Another example of this is how you deal with the “test” that Konrad sets up. This one more obviously had a choice involved, but even here you can go off the rails (Konrad’s rails, anyway – Konrad is the GM at this point, in a game-within-a-game).

Konrad asks you to choose between two prisoners.

The man on the right is a civilian, who stole water. A capital offence, as Konrad remarks. The man on the left is one of Konrad’s own men, who was sent to bring in the civilian for punishment (we all know that soldiers are extremely good at civilian crowd control). During the arrest he killed five more people: the man’s family.

I shot the sheriff soldier (but I did not shoot the deputy)

Later I found out that there were ways around this – you could have attacked the snipers instead, or shot the ropes (triggering an attack by the snipers).

When I first got to this bit, I assumed that it was just the start of a long line of “tests” that Konrad would dream up to try and persuade you that he was right, and that his way was the only way to ensure the survival of as many people as possible. I was surprised, then, to find that this was it, really; Konrad didn’t have any more moralising to do (well, sort of – I’ll get to that in a separate post).

We’re going to build a UAV

My friend and I want to build an unmanned aerial vehicle. Unlike the Predator drones, it will not be armed with missiles.

This is an educational project for us; I’ve never done any proper “bare metal” coding before, or even very much hardware-based stuff. Matt, on the other hand, is great at the hands-on, practical side, but his coding is lacking.

Between us, we should have enough knowledge to fuck up in new and interesting ways.

I’m going to let Matt explain what kit we’ve got, what each bit of it is for, and how he’s fitting it together. I’m going to cover the coding side, and the maths, and how we’re going to get it to fly itself. Or Matt will probably cover that last part; he is doing an engineering degree. I’ll be taking the theory he gives me, and implementing it :).

To start with, we’ll be doing some “basic” stabilisation stuff. We’ll get the plane to try and stay as level as possible, automatically correcting itself except when receiving input from the remote.
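As a very rough sketch of what “stay level” might mean in code, here’s a hypothetical proportional controller. This is not our actual flight code – the class name, gain value, and dead-zone threshold are all my own illustrative choices:

```java
// Minimal proportional ("P") controller sketch for wing levelling.
// Hypothetical: gain and threshold values are illustrative, not real flight code.
public class LevelController {
    private final double gain; // proportional gain: how aggressively to correct

    public LevelController(double gain) {
        this.gain = gain;
    }

    // rollDegrees: current roll angle from the IMU (0 = level).
    // pilotInput: stick deflection from the remote, in [-1, 1].
    // Returns an aileron command; pilot input overrides the auto-levelling.
    public double aileronCommand(double rollDegrees, double pilotInput) {
        if (Math.abs(pilotInput) > 0.05) { // outside the dead zone: pilot is in control
            return pilotInput;
        }
        return -gain * rollDegrees; // push back towards level
    }
}
```

A real autopilot would use a full PID loop (adding integral and derivative terms) to avoid oscillation, but the proportional term alone captures the basic “correct towards level unless the pilot says otherwise” idea.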

Oh – and we don’t have a plane yet.

Raytracer update

When I started this blog, it was intended to be a way to write up what I’m doing, and to easily share some of the cool stuff with my friends. I don’t like just sending images and builds over Skype file transfer, and I like having a record. Unfortunately, my laziness gets in the way a lot.

I’ve made massive amounts of progress with the Ray Tracer, but not put up any images here. Here’s one:

Stanford bunny and dragon. Render time: 60 seconds.

This image is the payoff. I’ve implemented reflections, OBJ loading, KD trees, and diffuse lighting. There’s still a lot to do, though.

The KD tree is what makes all the difference. The dragon is a very, very expensive model, containing 871,414 polygons.

Without any kind of acceleration structure, it takes well over 24 hours to render. (I tried, as a test, to see how long; I gave up after a day.)
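The speedup comes from a cheap ray-versus-bounding-box test that lets the tree discard whole groups of polygons at once, instead of testing every triangle. A minimal “slab” test, in the spirit of what happens at each node of an acceleration structure (my own sketch, not the raytracer’s actual code):

```java
// Ray vs axis-aligned bounding box "slab" test.
// A KD tree (or BVH) uses a test like this to skip entire subtrees of
// polygons without ever doing a single ray-triangle intersection.
public class AabbTest {
    // origin/dir: the ray; min/max: opposite corners of the box.
    // Returns true if the ray hits the box at any non-negative distance.
    public static boolean hits(double[] origin, double[] dir,
                               double[] min, double[] max) {
        double tNear = Double.NEGATIVE_INFINITY;
        double tFar = Double.POSITIVE_INFINITY;
        for (int axis = 0; axis < 3; axis++) {
            // Note: a zero dir component gives +/-Infinity here, which the
            // comparisons below handle for rays off the box's face planes.
            double invD = 1.0 / dir[axis];
            double t0 = (min[axis] - origin[axis]) * invD;
            double t1 = (max[axis] - origin[axis]) * invD;
            if (t0 > t1) { double tmp = t0; t0 = t1; t1 = tmp; }
            tNear = Math.max(tNear, t0);
            tFar = Math.min(tFar, t1);
            if (tNear > tFar || tFar < 0) return false; // slabs don't overlap: miss
        }
        return true;
    }
}
```

With ~871k triangles, turning “test everything” into “test log-depth boxes, then a handful of triangles” is the difference between a day and a minute.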

Unfortunately, I won’t be able to get much more speed out of Java.

I ported the Raytracer to Java in the first place because I needed to do a complete rebuild, and I’m better at rapid prototyping in Java than I am in C++. It would have been a better learning exercise to do it in C++, but I’m frequently impatient, and I wanted to see cool stuff without having to fumble too much with the language.

Now, I’m probably going to port it back at some point, and eventually use OpenCL to get my GPU to do more of the hard work, but in the meantime there are some more things I’d like to try out.

Partial list:

  • Path tracing.
  • Distributed ray tracing.
  • Fresnel equations (yeah, yeah, this should already have been done.)
  • Programmable camera, so I can do flyby videos.

I’ve already started work on the Path Tracing, with mixed results.

Perlin Noise

Shamus Young was messing around with Perlin Noise, which turned out to be normal noise. I dug around in an old VM to find this image:

Blue and white marble effect. Sort of.

The massive sphere in the background is procedurally textured with the Perlin noise function described in this paper:

It’s not the best example I ever produced, but it’s the only build of my old raytracer that I could find. Shame on me for poor version control.

The source code for it, in Java, is on Ken Perlin’s website here:

To be honest, it’s a very long time since I’ve looked at that stage of my ray tracer. I ditched the procedural generation of textures because it took way too long, but the basic idea is that you use the “noise” function, which takes X, Y, Z coordinates as arguments and returns the “next” noise point from the initial permutation list. Or something. These days I comment my own code a lot better, precisely because I can’t remember how this stuff works any more.

When you’ve got the returned noise amount, you can do cool shit with it. I found a couple of functions around that could turn it into a marble-style texture, but you can do a lot more with it.
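To give a rough idea of the marble trick: you take regular sine stripes and distort them with a few octaves of noise (“turbulence”). The sketch below uses a cheap hash-based stand-in so it runs on its own – it is NOT real Perlin noise, and all the constants are my own; swap in Ken Perlin’s reference implementation for smooth results:

```java
// Marble-style texture from a noise function.
// turbulence() sums several octaves of noise; the sine stripes are then
// distorted by it. noise() here is a cheap integer-hash stand-in, NOT real
// Perlin noise - use Ken Perlin's reference implementation for real use.
public class Marble {
    // Deterministic pseudo-noise in [0, 1] from integer lattice coordinates.
    static double noise(int x, int y, int z) {
        int n = x * 73856093 ^ y * 19349663 ^ z * 83492791;
        n = (n << 13) ^ n;
        n = n * (n * n * 15731 + 789221) + 1376312589;
        return (n & 0x7fffffff) / (double) 0x7fffffff;
    }

    // Sum octaves of noise at doubling frequencies and halving amplitudes.
    static double turbulence(double x, double y, double z, int octaves) {
        double sum = 0, amp = 1, freq = 1;
        for (int i = 0; i < octaves; i++) {
            sum += amp * noise((int) (x * freq), (int) (y * freq), (int) (z * freq));
            amp *= 0.5;
            freq *= 2;
        }
        return sum;
    }

    // Classic marble: sine stripes along x, perturbed by turbulence.
    // Returns an intensity in [0, 1] that you can map to a colour ramp.
    static double marble(double x, double y, double z) {
        return Math.abs(Math.sin(x * 0.1 + 5.0 * turbulence(x, y, z, 4)));
    }
}
```

Mapping the returned intensity onto a blue-to-white colour ramp gives roughly the effect in the image above.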

On Digital Distribution

I hated Steam. Note the tense – I used to hate Steam, and I used to be a pirate.

I’ll explain.

Back when Half-Life 2 came out, I lived with my parents. I was still at school, and we lived on a farm in the middle of nowhere, in rural Wales.

It was a 40-minute bus journey to school in the mornings. (It could have been worse – the fact that I got a bus at all was amazing.)

We lived in an ADSL black spot. Initially, there was a distance cap on broadband – if you lived too far from the telephone exchange, you couldn’t get it. I think the max was two miles. We lived around seven miles from the exchange.

The next problem was the exchange itself – it was the only one of its kind in the country. Back in the eighties, or thereabouts, BT bought a single exchange from some American telephony company, to try out.

For whatever reason, they put it in rural Wales, where traffic was low and many houses didn’t even bother with a phone – they’d trek a mile or two to a phone box.

This thing, this obsolete and foreign monstrosity, was not compatible with DSL. The local loop could not be un-bundled, for reasons that I was never clear on.

Finally, we had a DAC on our line. A DAC was (is? I seriously doubt there are any still around) a money-saving device for BT. If BT couldn’t be arsed to run a new line to a new house, or to an old house that was only just being connected up, they’d put a digital-to-analogue converter on the line, effectively turning it into two lines.

This meant two things.

  1. We could only get half-speed dial-up internet. 26.4 kbit/s. That was all we got. Browsing the internet became an exercise in patience.
  2. It made ADSL impossible; BT would have had to install a new line.

Now that I’ve set the stage, I’ll explain my early experiences with Steam.

Steam, and biscuits.

I’d been waiting for Half-Life 2 since it was featured in PC Gamer, around two years previously.

I’d built a PC from scratch, solely for this game. (I had 1 GB of RAM. WOW!)

I’d preordered the special edition, which came in a biscuit tin with a t-shirt.

Of course, I already knew about Steam, and I knew I’d never be able to download the game from there. I’d assumed that buying the boxed copy meant that I could just activate it on Steam, then I’d be on my merry way.

Yeah. Right.

I didn’t quite get the game on release. I think I got it a day later, or something.

There had already been a minor patch, a measly 4 MB or something. Damned if I can remember. The point was that after Steam had finished unlocking the game, it wanted to download this patch.

I’m sure you remember the earliest days of Steam. There were server outages, weird connection bugs, and the awesome “This game cannot be started”.

All of these problems were multiplied by my awful internet. Eventually, what I did was pack up my PC and persuade my Dad to cart me and the computer down to a friend’s house, where they had ADSL.

The cycle went like this:

  1. Go to friend’s house.
  2. Download HL2 updates.
  3. Return home.
  4. Start Steam, without dialling up.
  5. Get the “Steam could not be started in offline mode” error.
  6. Go online, just to let it connect.
  7. Find out there has been an update between me leaving my friend’s and getting home.
  8. Goto 1.

Of course, it wasn’t quite this bad. It just seems that way. Steam’s offline mode was, and is, a bit naff – even if you’d tried to force offline mode, it still might refuse.

Eventually, I gave in. I started to browse the depths of some unsavoury forums, trying to find a no-CD-type fix that would cut Steam out entirely.

When I found one, it was glorious. I could enjoy my game in peace, with none of this idiotic always-on DRM getting in my way. I thought I’d never use Steam again – after all, it only had one game I cared about, and I’d cracked that.


Sore Feet

I’ve just done the Bogle, which is a 55 mile walk around Greater Manchester. I completed it in 24 hours and 30 minutes, and now I ache. Everywhere. It’d be easier to list the places that DON’T hurt, so I’ll do that:

  • My eyebrows.

At times my pace slowed to around 1MPH. That was awful. Other times, I went on a massive adrenaline and sugar high. That was excellent.

Here’s the route:

The route consisted of two loops, both starting from the North Campus Students’ Union. The southern loop started on Friday 2nd March at 7 pm, and we were back by around 7 am the next day. Then came the northern loop, where most people drop out.

That was one of the most physically and mentally strenuous things I’ve ever done. It would have been so easy to stop at any point and ring the support team, who would have whisked me away to a place with no pain…however, I appear to be a masochist.

I’ve tried this twice before, and failed it both times, so now that I’ve done it I don’t really feel inclined to do it again.

Oh, and this is me just over half way ’round:

Yes, three cups of tea.

Making of Clear Skies 3: Animation Perspective Part 2.

This post is going to quickly go over the two methods I used to actually get the animations into Source.

Method 1: BVH

iPi Soft’s software exports to the BVH format, which is one of the standard motion capture formats. It stands for Biovision Hierarchy, and it was developed by Biovision, who went bust a while ago. In my opinion this is the best format, because it is very easy to read and parse.

A BVH file has two parts:

  1. The hierarchy itself, which describes the skeleton and initial pose.
  2. The transformations that the skeleton goes through to animate it.

The hierarchy starts with the ROOT node, which for humans is usually the centre of gravity. The hierarchy itself is recursive: each node contains data relevant to it (like position, rotation), and further subnodes.

The position of a node is defined by its offset from its parent, not by an absolute position in the world. If it were absolute, a node wouldn’t move when its parent moved: you’d move your arm, but your hand would remain on the desk.

The rotation of a node is likewise defined relative to its parent.
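In code, that parent-relative scheme means a joint’s world position comes from walking up the chain and accumulating offsets. A minimal sketch (ignoring rotations for brevity – a real BVH player composes the per-joint rotations too; the class and field names are my own):

```java
// Minimal BVH-style joint node: parent-relative offsets only.
// Real BVH handling also composes each joint's rotation; omitted here.
public class Joint {
    final Joint parent;    // null for the ROOT node
    final double[] offset; // position relative to the parent joint

    public Joint(Joint parent, double ox, double oy, double oz) {
        this.parent = parent;
        this.offset = new double[] { ox, oy, oz };
    }

    // World position = sum of offsets from the root down to this joint,
    // which is exactly why moving a parent moves every child with it.
    public double[] worldPosition() {
        double[] base = (parent == null) ? new double[3] : parent.worldPosition();
        return new double[] {
            base[0] + offset[0],
            base[1] + offset[1],
            base[2] + offset[2]
        };
    }
}
```

Shift the root’s offset and every descendant’s world position shifts with it, so the hand follows the arm.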

There are a massive number of tools for reading and manipulating this data, so I didn’t really need to write another one. That was a good job, too – I was struggling with the XSI Mod Tool.

Softimage XSI is the suite that Valve originally used for modelling and animating Half-Life 2. In 2004, when Half-Life 2 was released, XSI had a free, cut-down version of the tool for modders. It was called the XSI Mod Tool.

Valve also created a whole bunch of addons for it, to export stuff to the Source Engine.

Since Softimage was bought out by Autodesk, the free versions have become less and less useful: the tools are out of date and features get locked off. You can find the old versions around, but it’s difficult, and they are not supported much.

I’ll do a quick tutorial on it here anyway.

Autodesk Softimage Mod tool 7.5

You’ll need to find the Valve Source add-ons for Softimage, which are either on the net or in the Source SDK folder.

Open the XSI scene file “SDK_Male_reference.scn”, which can be found in “C:\Program Files (x86)\Steam\steamapps\USERNAME\sourcesdk_content\hl2\modelsrc\humans_sdk\XSI_scn_files” or similar.

You should get this chap:


He’s very low in IQ and poly count, but he’s lovable just the same, because he’s ideal for testing stuff out.

Press “8” on the keyboard to bring up the scene explorer, and middle-click on the “Valve Biped” node.

Middle-clicking recursively selects everything inside the node.

Go to Animate->MOTOR->Tag Rig

This will bring up the tag rig dialogue. It defaults to a human biped, and has entries for all the major bones.

You need to go through each dialogue, click the bone name button, e.g. “cog”, and select the corresponding bone on the human skeleton.

Here is an example:

I’ve chosen Bip01_Pelvis as the centre of gravity (cog). Most of them are easy enough to work out, but get one wrong and the animation will be…funky.

Once you’ve finished tagging the rig, SAVE it…it will save you a vast amount of time.

You then need your BVH.

There is a large library of free ones here.

Tagging the BVH rig

Go to Animate->MOTOR->Mocap to Rig

You’ll get this dialogue:

You need to

  1. Select BVH as the format.
  2. Select a BVH file.
  3. Create a tag template. This will launch the same window as previously; this time, you want to tag the BVH you’ve just loaded. Once done, save this one too.
  4. Select the target model. This should already be the ValveBiped; if not, select it from the Scene Explorer.
  5. Apply.

I’ve uploaded my Valve Biped tag file and my BVH tag file here, as well as a mocap file I took from the site mentioned earlier. Link:

Now, by the power of Grayskull, you’ll animate the Valve Biped:

This is rendered from Softimage, because while making this post I couldn’t be bothered to export the skeletal animation into the Source Engine.

I’ll cover how to actually get this into the game in the next post. This was meant to be quick, but oh well. I still need to cover the other method I used 🙂