Very small dogs.

My parents’ bitch, “Ted”, just whelped. It was her first litter, and she had three live pups. Since the father is called Bill, they watched the film “Bill & Ted’s Excellent Adventure” in preparation. The dogs didn’t pay any attention.

After the first two plopped out (the runt was DOA), Ted apparently lost interest in the whole thing, and stopped. They took her down to the vet’s, who scanned her and saw that neither of the remaining two was even in the birthing canal. Lazy sods. After being given an injection, Ted managed to have them both in the back of the Land Rover.

So – the three puppies are called Rufus, Missey and Socrates. I look forward to the day when I can shout “SOCRATES” across a public park.

Going home to Wales on Friday to examine these microscopic dogs in person, and probably make “daaawww” noises.

Here they are:


UAV, reading sensor data Part II

In our previous adventure, Matt and I defeated the dragon and rescued a fake princess who turned into a giant spider. I mean: we managed to get some meaningful data from both the gyroscope and the accelerometer. We also made some pretty graphs.

The next step is for us to compare the outputs from the gyro and the accelerometer. This is currently impossible, because the accelerometer output is already transformed into meaningful readings (degrees), while the gyro output is just a load of “rate of change” readings.

We can get the orientation readings from the Gyro like this:

  //Create variables for outputs
  float xGyroRate, yGyroRate, zGyroRate, xGyroRate2, yGyroRate2, zGyroRate2;
  long Time;
  //Read the x, y and z output rates from the gyroscope & correct for some inaccuracy; convert to seconds
  xGyroRate = (gyroReadX())/57.5;
  yGyroRate = (gyroReadY())/57.5;
  zGyroRate = (gyroReadZ())/57.5;
  //Determine how long it's been moving at this 'rate', in seconds
  Time = (millis()-previousMillis);
  //Multiply rates by duration
  xGyroRate2 = -(xGyroRate/Time)/4;
  yGyroRate2 = -(yGyroRate/Time)/4;
  zGyroRate2 = -(zGyroRate/Time)/4;
  //Add to cumulative figure
  if (xGyroRate2 > gyroLPF || xGyroRate2 < -gyroLPF)
    CumulatGyroX += xGyroRate2;
  if (yGyroRate2 > gyroLPF || yGyroRate2 < -gyroLPF)
    CumulatGyroY += yGyroRate2;
  if (zGyroRate2 > gyroLPF || zGyroRate2 < -gyroLPF)
    CumulatGyroZ += zGyroRate2;

Now that we’ve done this, we can measure an axis from both the gyro and the accelerometer at the same time, and overlay them on top of one another in gnuplot.

Like so:


Yeah, that didn’t really work.

Ok, so there was something wrong with that. We tweaked the code some more, and got something a bit better:

No, that's worse.

Much worse.

Finally, we figure out where we went wrong with the code. We don’t have the old versions, so we can’t show you our idiotic mistakes. After fixing it, we get something much closer to what we were expecting:

Roll measurement from Gyro and Accelerometer


We can see from this graph that both sensors are outputting more or less the same thing. However, the gyro measurements are off by a bit right from the start, and slowly drift into a more noticeably incorrect orientation. If we used just the gyro, we’d end up with the plane upside down. There are also “spikes” in the accelerometer readings, at around samples 700 and 1300 – these are probably noisy readings from the sensor.

To get an idea of how much the gyro readings drift, we turned on all the gyro sensors and then left the board still for a few seconds:

Gyro drift in three axes.


This is obviously not going to work long term – we can’t be certain of either sensor. We need a way to combine the readings from both sensors, or face the consequences.

UAV, reading sensor data

A long time before we did our session where we controlled some servos, we had a session where we were reading sensor data.

We’ve tried this a number of times, but this instance was where we actually started to get worthwhile data, and we could interpret it correctly.

We are currently reading two sensors to find orientation data: the 3 axis gyroscope and the 3 axis accelerometer.

I’ll cover the accelerometer first.

Reading Accelerometer Data

We’re using a 3-axis accelerometer, which, if read correctly, can tell us the pitch and roll of the aircraft (actually, just the pitch and roll of the sensor, but the sensor will be built into the aircraft).

Each axis in the accelerometer (we’ll call them accX, accY and accZ) tells us how much pull that sensor is experiencing (at rest, this is due to gravity). We only need one axis to start with, to measure pitch.

In this crappy diagram, the rectangle represents our accelerometer. It’s measuring apparent gravity. As the accelerometer has its pitch increased (rotated clockwise, in this case), the apparent gravity being measured in that axis will decrease.

Theta represents the angle between the apparent gravity and actual gravity.

The trigonometry to calculate pitch is pretty easy:

apparent gravity = cos (theta) * g

or, calling the reading from our axis “acc”:

acc = cos (theta) * g

The relationship between theta and pitch is very simple: pitch is just 90 degrees minus theta.

So, rearranging

acc = cos (theta) * g

gives

theta = acos (acc / g)

And since cos (theta) = sin (90° − theta) = sin (pitch), the actual pitch is just:

pitch = asin (acc / g)

//Accelerometer Read and Output Section
  float xAccRate, yAccRate, zAccRate;
  double pitch, roll;
  xAccRate = (accReadX());
  yAccRate = (accReadY());
  zAccRate = (accReadZ());
  double measured_g = sqrt((xAccRate*xAccRate)+(yAccRate*yAccRate)+(zAccRate*zAccRate));
  roll = (atan2(xAccRate/128,zAccRate/128))*(180/3.141);
  pitch = (atan2(yAccRate/128,zAccRate/128))*(180/3.141);

Reading Gyro Data

Gyro data is actually a lot easier to read, in some ways. All the gyro is doing is measuring the rate of change since the last reading.

Therefore, all we need to do is integrate over a period of time to get the rotation in a certain axis.


//Gyro Read & Output Section
  //Create variables for outputs
  float xGyroRate, yGyroRate, zGyroRate, xGyroRate2, yGyroRate2, zGyroRate2;
  long Time;
  //Read the x, y and z output rates from the gyroscope & correct for some inaccuracy; convert to seconds
  xGyroRate = (gyroReadX())/57.5;
  yGyroRate = (gyroReadY())/57.5;
  zGyroRate = (gyroReadZ())/57.5;
  //Determine how long it's been moving at this 'rate', in seconds
  Time = (millis()-previousMillis);
  //Multiply rates by duration
  xGyroRate2 = -(xGyroRate/Time)/4;
  yGyroRate2 = -(yGyroRate/Time)/4;
  zGyroRate2 = -(zGyroRate/Time)/4;
  //Add to cumulative figure
  if (xGyroRate2 > gyroLPF || xGyroRate2 < -gyroLPF)
    CumulatGyroX += xGyroRate2;
  if (yGyroRate2 > gyroLPF || yGyroRate2 < -gyroLPF)
    CumulatGyroY += yGyroRate2;
  if (zGyroRate2 > gyroLPF || zGyroRate2 < -gyroLPF)
    CumulatGyroZ += zGyroRate2;

Pretty simple 🙂

Some pretty graphs

Of course, that code took us a long time to actually write. And we may have borrowed some from elsewhere, I can’t for the life of me remember. Once we’d got it working, we plumbed the serial output into GNUPlot, so that we could get a visual representation of what was going on.

The general idea was to tilt the board by 90 degrees in one axis, then return it to its original position. This would produce half a sine wave, e.g.:


After mucking about for a bit with minicom, I used this command to capture the serial output:

sudo minicom --capture mincap

The serial output just looks something like this:

ID: 69
2 0 0
-32 -4 -27
-1 -1 1
0 -1 1
0 0 1
0 -1 0
0 1 0
0 1 0
1 -1 0
1 -1 0
0 -2 0
0 -1 0

Once we’d got that, I spent yet more time mucking around in gnuplot. Actually, I jest. It was pretty easy to plot a graph.

The first graph we got was this:


We’d done something very obvious and simple wrong.

Spot the deliberate mistake.

Upon seeing this, we retired to the pub. Over a pint or three, we realised what it was we’d done wrong.

The Arduino board has 16-bit integers, meaning an int has a maximum value of 32,767. If you try to do

int i = 32767 + 1;

then i would have the value -32768 – the number wraps around to the most negative value a 16-bit integer can hold.

And that’s exactly what was happening. We’d used an Integer, because occasionally I’m as thick as a brick sandwich.

We weren’t doing anything with the data yet to convert it to degrees, we weren’t even sure how big the numbers would be, and we were just adding it all up.

Here’s another graph showing the same thing, but with the dots joined up:


After we started using the (entirely more sensible) long data type instead, we got what we’d originally expected:


So, as predicted, we got ourselves half a sine wave. If we tilt it by 90 degrees in one direction, then back to flat, then 90 degrees in the other direction, we get the full sine wave:


If the graph is a bit lumpy, it’s because Matt had the shakes.

In the next post, I’ll talk about working out the actual orientation, and sensor drift.  We’re also going to compare the output from the accelerometer and gyro.

Arduino: Controlling Servos

Last Sunday, Matt came to visit. He brought his MAN TIN along, which is filled with microcontrollers and breadboards and miscellaneous wires and a machine that goes “Ping”.

We’ve not done any UAV work for a very long time, in fact the last time we did any was last Autumn. This has actually been my favourite session to date, because we made things move!

We’re going to be using servos to drive our control surfaces on the UAV, and this was a sort of Proof of Concept in making the things work.

We built the board, and hooked up the servos. The arduino program actually has some sample code for servos built in, so, not knowing how they work, we just hooked up that sample code.

All it does is make the servo rotate 180 degrees in one direction, then 180 degrees in the other direction, over and over again.

Here’s the code from the Arduino website. It’s not identical to ours, because we changed the pins, and then experimented with the speeds a bit. The speed was easy enough to alter: just change the delay between moves.

// Sweep
// by BARRAGAN <>
// This example code is in the public domain.

#include <Servo.h>

Servo myservo;  // create servo object to control a servo
                // a maximum of eight servo objects can be created

int pos = 0;    // variable to store the servo position

void setup()
{
  myservo.attach(9);  // attaches the servo on pin 9 to the servo object
}

void loop()
{
  for(pos = 0; pos < 180; pos += 1)  // goes from 0 degrees to 180 degrees
  {                                  // in steps of 1 degree
    myservo.write(pos);              // tell servo to go to position in variable 'pos'
    delay(15);                       // waits 15ms for the servo to reach the position
  }
  for(pos = 180; pos >= 1; pos -= 1) // goes from 180 degrees to 0 degrees
  {
    myservo.write(pos);              // tell servo to go to position in variable 'pos'
    delay(15);                       // waits 15ms for the servo to reach the position
  }
}

Here’s a video of it actually working:

Half-Life 2: Point Insertion Part I


Recently, Valve released Half-Life 2 and its episodes for Linux. For me, this is more or less a dream come true, something I’ve been waiting and wanting and wishing for since I started using Linux in 2006.

I’ve tried Wine, multiple times, and it’s always been a bit too quirky, with too many peculiar bugs and foibles. This is not to say that I don’t find Wine impressive. Wine is probably one of the more audacious projects I’ve ever seen – an attempt to reverse engineer one of the most obtuse black boxes ever, the Windows API.

So now that Half-Life 2 has been natively released for Linux, I decided it was time to do a complete replay of the game. I’ve played the game a lot, but I don’t think I’ve actually sat down and gone end to end for a long time. I always skip bits, replay certain sections and then stop.

This time I’m going to share my experiences, via pretentious blog posts, probably on THE INTERNET (but not necessarily – I’ll accept requests to have pages posted to you, if you first send me a written request via the Royal Mail).

I’m doing this not because I think I’ve got anything interesting to say, or anything new. Everything has been more or less said about the game at this point, and if I ejaculate all over it I don’t think it will make much difference to the world at large.

No, I’m doing this because I want to and because I’m a mouthy git, and because this is the internet I can do what I like.

Technical Specifications

I’m playing this on a Samsung Series 7 Chronos, which is an ultrabook type thing. It’s got:

Xubuntu 13.04 x64

Intel Core i5


nVidia geforce 630M

Although the nVidia bit is actually Optimus, and I get way better performance out of the Intel bit than I do with the discrete nVidia chip. Still working on that.

And away we go!
Continue reading

On Linux Adoption

Today something very exciting happened. Half-Life 2 and its episodes are now available for Linux.

This year has been great for Linux gaming, and I’m managing to move further and further away from Windows as my primary OS.

My laptop dual boots Windows 8 and Xubuntu 13.04, and my desktop dual boots Windows 7 and Arch Linux.

On my laptop, I never boot Windows at all any more. I bought it as a lightweight gaming machine that I could use when I travel for work, and I’ve found that most of the games I’m playing at the moment have Linux releases (Kerbal Space Program is currently my most played).

My desktop I’m currently playing Far Cry 3 on, so obviously that needs Windows, but now that I’ve got Left 4 Dead 2 and Counter-Strike Source, and with Garry’s Mod on the way too, my reasons for staying in Windows are shrinking.

I announced my pleasure to a “community” of people I hang out with on Skype quite often, and there I faced the usual amount of disdain and apathy.

But I also encountered an argument against Linux that I’ve not actually come across before, and while it is indeed born of stupidity (and demonstrably wrong in some cases), it nonetheless intrigued me.

It went like this:

There is no killer app for linux

That’s not what they said, but that’s what they meant. They argued that there was no incentive for them to actually use Linux because every tool they might use on Linux already had a windows version.


That’s a seriously weird thing to argue. Generally, you want to reach as wide an audience as possible with your software. Locking yourself to a particular platform, unless you are the platform vendor, is going to be suicide.

Unless you are a particularly niche product, or you are building bespoke software for a client that is running a particular OS, why would you want to lock yourself to a platform? Cross platform development has never been easier, and more and more software companies are starting to bring out versions of their software for more than just Windows.

Leaving Linux out of it for a moment, I’d consider most consumer software these days to be multi-platform. Good grief, even Microsoft release Office for Mac.

Apple release iTunes for Windows! Why on earth would they do this? Wouldn’t releasing it for Mac only make more people use Macs?

No. Because while a lot of people can afford an overpriced undersexed mp3 player, they can’t also afford a brand new computer just to use the damn thing. And Apple know this, hence the release of iTunes for Windows.

So, that aside, very few people – average computer users, at least – are going to switch to Linux because of a particular program they need that has been released for Linux and no other platform.

His next argument was: “well, I don’t think linux is ready yet. No one distributes it on new computers”.


And the concept is not new. Dell have been doing this for a few years now.

The next argument was “Well only computer geeks and nerds will use it”.

I think he seriously underestimated how many geeks & nerds exist on the planet. There are millions of us!
Do we not deserve to choose? If your average consumer can choose between Windows and Mac, can’t we choose something different?

If you go by the numbers in the Steam Hardware and Software survey, around 1.3% of Steam users are currently on Linux. Steam users peak at around 5.3 million a day. That’s nearly 70,000 people using Steam on Linux (at the time of the last hardware survey). The actual numbers may differ.

Valve obviously thinks its worth releasing games for us. If you look at the average price breakdown by OS for the humble bundle, you can see that Linux users might be a minority, but they’re a minority that pays well for games:


The argument that only nerds will use it… well. My girlfriend prefers Linux over Windows for a lot of things, and only really stays on Windows for some academic software and iTunes.

My sister used Ubuntu, but only until she immersed her laptop in tea. She got on with it.

My parents use it.

My brother uses it. He’s the first to admit he’s not technically minded.

A lightweight distro works extremely well for people who only use their laptop as a facebook machine.

Come to think of it, the guy was probably trolling me. He succeeded.

Anyway, this post gave me something to do while waiting for Half-Life 2 to download.


Grub EFI Dual boot errors

I recently decided to install Xubuntu on my desktop, having got fed up with Arch Linux. Arch Linux will make its return, but I was having too much trouble with the AMD legacy drivers. When I’ve upgraded my graphics card (hint – to one that doesn’t have shit Linux support), I’ll move back.

In the meantime, I had some problems with Grub.

To begin with, the install did not recognise my Windows partitions at all, but I could still boot from the EFI menu to Windows. I would rather have grub though, instead of having to hammer away at the F8 key on every cold boot.

According to the Ubuntu page about UEFI, you should use boot-repair.

I used boot repair, and the output from it can be found here.

This updated grub, found the Windows partition, and added Windows as an option (actually 2 options for Windows. Don’t know why, but there are multiple EFI files for Windows on the EFI partition).

I got this error:

error: no such file or device: 16E0-4903
error: no such file or device: 16E0-4903
error: file '/EFI/Microsoft/Boot/bootmgfw.efi' not found
Press any key to continue

The grub entry for Windows looked like this:

menuentry "Windows 7" {
search --fs-uuid --no-floppy --set=root 16E0-4903
chainloader (${root})/EFI/Microsoft/Boot/bkpbootmgfw.efi
}

I checked the EFI partition, and sure enough the file is there. I checked the UUID of sda1 (the EFI partition), and it was correct.

I tried to fix it myself, by using the hard drive name instead of the UUID, but that gave errors about partition type.

Turned out I needed to add this:

menuentry "Windows 7" {
insmod part_gpt
search --fs-uuid --no-floppy --set=root 16E0-4903
chainloader (${root})/EFI/Microsoft/Boot/bkpbootmgfw.efi
}

I found the information here.

I don’t know why that line wasn’t added, but there we are.


So, I added new RAM to my build today.

I had to do a CMOS reset to get the settings right, more fool me.

What I wasn’t expecting was this, when I booted up:

Yep. Origin relies on your system clock to tell you if the game is released or not. I wonder if this works the other way around: if you’ve pre-loaded something, can you play it if you mess with the system clock?

Ink Part 2

Canny Edge Detection

The first stage for me is going to edge detection. There are many edge detectors, but I’m going to use the Canny Edge Detector, because I’m vaguely familiar with it and it’s quite well regarded.

Here’s the image I’m going to use for the initial development:

This handsome face *cough* is about to be melted down and turned into a bunch of squiggly lines. It was taken with my webcam in a partially darkened hotel room. On a Sunday. I also look a little shocked for some reason.

Step 1: Noise Reduction

To start with I’m going to blur my image. This might seem a bit counter-intuitive, but it’s actually very helpful in cutting down the amount of noise in the picture. I mean, look at it. There are randomly coloured pixels all over the place, due to poor lighting and a poor webcam. This kind of thing is going to cause interference in the various algorithms I’ll be using. I’ll show off why in a bit.

I’m going to use a very basic “box blur” kernel, not a fancy Gaussian one. Mainly because it’s easier and I’m lazy.

I’ll be using a 3×3 convolution kernel, like this:

Box Blur Kernel


This gets applied to an image by moving the centre of the kernel along each pixel in the image, multiplying each kernel element by the pixel underneath it, and summing the results to produce the new pixel value.

Obviously, you have to deal with edge cases. I’ve taken the easy way out and _not_ dealt with them. I’m only blurring the pixels that are not at the very edge, which leaves an unprocessed strip around the image, one pixel thick. Visually, this doesn’t matter – I can leave it there or reduce the image size by 2 pixels in the x and y directions.

Original image has been desaturated and blurred, to try to cut down on the noise.


Step 2: Get Gradient Intensities:

I’m going to use a basic Sobel filter to perform edge detection. Sobel detects rapid intensity changes in a specific direction. In fact, you need a Sobel filter per direction:


This filter is applied in the same way as the box-blur filter described earlier. Here’s the output:

Gradient intensities in the horizontal direction

The test image with intensities in the Y direction calculated

These two result images are actually from before adding the box-blur filter. I’m doing this whole writeup in the wrong order.

The two gradient intensity images are summed together to get the final gradient result, using this very simple formula:

|G| = |Gx| + |Gy|

That’s a shortcut for the full equation:

G = sqrt(Gx² + Gy²)

This equation, applied to every pixel in both images, gives this result:

A basic gradient intensity image. By itself, this doesn’t do much, but I like to get tangible output from these algorithms. It’s a good visual lesson for what I just did.

That’s it for now. In the next post I do on this, I’ll be doing:

  • Edge direction calculations
  • Thin edge detection
  • Edge following


Overambitious Project: Ink

I’ve got that itch again. The one that sits at the back of my head going “you haven’t done any cool programming in ages, don’t you think you should?”

It’s an urge to go and work on a personal coding project, where I can experiment with stuff I find interesting, and can set my own pace and goals. This has become especially important recently, since I don’t even do any coding for my job any more. I’m moving over to a managerial/everyman/”guy who knows stuff” role, so it’s not just a creative urge, it’s an urge to keep my “skills” sharp.

*Disclaimer: I’m not an amazing programmer, but I enjoy problem solving and I like to think I’ve built some cool stuff*

This project is going to be partly a rehash of my Computer Vision project at University – at least, it will use some of the same algorithms and technologies, and a large part of it will be about extracting a “useful” feature set from a set of images. What I do with that feature set is going to be very different though, and things I considered to be “useful” in my old project will probably be very different.

Here’s the abstract from my dissertation:

Reconstruction of 3D Models From 2D Images.
This project is about trying to recognise “interesting” features or points in two dimensional images, and then attempting to find corresponding features in a different image of the same scene. The coordinates of these points in different images can be used to generate 3D coordinates using Ullman’s theorem. This paper explorers a variety of options for detecting features, and several different methods for matching these features over different images. The variables used to generate and extract features are thoroughly tested, in order to find the best settings for the system. The results are then compared to the initial project specification to see if the system can operate as needed.

Now, this is not what I intend to do here at all. I just posted it anyway.


Ink, Project goal:

Write a program that will take video, and turn it into what looks like a series of hand-drawn ink sketches.

A still might look like this:

An ink drawing I found on the internet somewhere.

The animation should have enough flaws in it to make it seem hand-drawn, and it will probably have a much lower frame rate than the original video.

Part of my inspiration for this is Raymond Briggs’ “The Snowman”


I love the way the crosshatching works on this. The animation is low frame rate, the background is mostly static, but the movements are complex and the shading is wonderful.


So I want to achieve that programmatically. You heard me. Any artist who reads this will probably tell me I’m removing all the soul from the animation, and they’d probably be right, but this is a project that I want to do.

Another inspiration is “A Scanner Darkly”, a film that I love for many, many reasons, not least because it’s adapted from a Philip K. Dick novel, and all adaptations of his work to film have been fantastic in their own ways. Even Total Recall is brilliant, but not for the same reasons Blade Runner is.

Well, maybe not.

But aside from the brilliant plot (which, to be honest, was mostly about following stoners around. And paranoia. And government surveillance. And pharmacological conspiracies. And insanity. And psychotic breaks), it was produced in a unique way.

It was filmed normally, then every frame was redrawn partly by hand and then animated the rest of the way. Visually, it’s stunning, but the animation is so realistic that it creates a bizarre disconnect in your head while you watch it, and in places you can forget it’s animation at all. It very much suits the subject matter of the film.


And then there’s this clip, which I’m including for no reason other than the fact that it’s funny. Sort of:


Your sins will be read to you ceaselessly, in shifts, throughout eternity. The list will never end. 

So I want to try and build something similar to the system they used here. (They called it “rotoshop”, and they never released the program). It won’t be so fully-featured (damn, there’s an enterprise-y word. And another one! Auugh, what’s happened to my vocabulary?), because I will probably run into problems and get bored or frustrated. Besides, I’m not trying to accomplish the same thing, but I suspect some of the methodology will be similar.

So, here it goes.