FreeNAS Corral and Docker

This post was written before FreeNAS Corral got memory-holed. I've not decided on my course of action yet. I like running my services in Docker far more than I ever liked jails and the Linux VM, so going backwards is not really an option.


I recently upgraded my FreeNAS 9.3 box to FreeNAS 10, FreeNAS Corral. The upgrade was very smooth, although I did panic at one point when it took 500 years to boot back up.

I knew that the old plugin system had been removed, and that Docker support was the new way to get your plugins working. I had several plugins I needed to replace:

  • Owncloud
  • Transmission
  • Linux VM

The Linux VM plugin, or jail, was what let me run an Ubuntu server to host Squeeze Server. The VirtualBox-based Linux VM is not available any more, so the Squeeze Server would have to find a new home. Fortunately, there is a Docker image for Squeeze Server.

Owncloud

Getting an Owncloud container running was pretty trivial. I did the following:

Create a new Dataset, for Owncloud Data to live in

I gave ownership of it to the "www" user and group.

Create the Owncloud Docker Container

I needed to do the following (a rough docker run equivalent is sketched after the list):

  • Give it a name
  • Map the Dataset to the container path
  • Set the Container to Bridged Mode. This is definitely possible with NAT, but I prefer having unique IP addresses for each container.
  • Save the container, and start it up.
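
For reference, here's roughly what those steps boil down to as a plain docker run command. This is only a sketch: Corral drives Docker through its own UI, and the data path inside the container is what I believe the official owncloud image uses.

# Rough CLI equivalent of the UI steps above (a sketch, not what Corral actually runs).
# Bridged mode gives the container its own IP, so no -p port mapping is needed;
# with NAT you'd add something like -p 8080:80.
docker run -d \
  --name owncloud \
  -v /mnt/<Pool>/<Dataset>/data:/var/www/html/data \
  owncloud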

Once Owncloud had started up, I let it run through its first-time setup, letting it host the database itself, etc.

Migrating the Data from the Owncloud Jail

I needed to use the command line for this.

First, stop the container.

Copy the owncloud Jail data to the new Dataset.

My owncloud data was in /mnt/<Pool>/jails/owncloud/media

This location contained the users' directories and the database, but not the config file.

Copy everything in here to your new Dataset, into the “data” folder:

/mnt/<Pool>/<Dataset>/data
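
The copy itself is something like this; cp -a preserves permissions and ownership, and the chown makes sure everything ends up owned by "www" to match the dataset (a sketch, adjust the placeholders to suit):

# Copy the jail's Owncloud data into the new dataset, preserving attributes.
cp -a /mnt/<Pool>/jails/owncloud/media/. /mnt/<Pool>/<Dataset>/data/
# Make sure the "www" user and group own the lot.
chown -R www:www /mnt/<Pool>/<Dataset>/data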

That's the data, but you need to sort out the config file as well. The config file for the new Owncloud install is here:

/mnt/<Pool>/<Dataset>/config/config.php

You’ll need to replace this with the older config file from the plugin, which is here on my setup:

/mnt/<Pool>/jails/owncloud/usr/pbi/owncloud-amd64/www/owncloud/config/config.php

You might need to update the trusted domains to include your new Owncloud Docker IP (or the FreeNAS IP and Docker Port, if you’re using NAT).
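
The relevant bit of config.php looks something like this (the domain and IP here are made-up examples):

'trusted_domains' =>
array (
  0 => 'cloud.example.com',
  1 => '192.168.1.50',
),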

After you’ve done this, you should be able to start the Owncloud Container and log in using your original credentials.

SSL, Let’s Encrypt, Reverse Proxies

This was hard work. The actual solution is not complicated, but there’s no single guide to tell you how to do it.

Nextcloud recommends that you use a reverse proxy to get SSL working for your Nextcloud container. Owncloud is practically the same thing, so it seems like the thing to do in both cases.

I use Let's Encrypt for my certs, and previously I ran certbot from inside the Owncloud Jail. I initially assumed I would do the same thing here, but luckily for me the FreeNAS docker collection includes a Let's Encrypt image that bundles nginx, which would let me set up a reverse proxy as well.

Things to keep in mind before you start:

  • Port forwarding on your router should already be set up for the domain name you want to create the certificate for.
  • You need a writeable area for the certificates. I have a dataset set up with "www" permissions specifically for this.
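
The heart of the reverse proxy is just an nginx server block along these lines. This is a minimal sketch: the certificate paths depend on where your Let's Encrypt container writes them, and the upstream address is your Owncloud container's bridged IP (both invented here):

# Minimal reverse proxy sketch: nginx terminates SSL and forwards
# plain HTTP to the Owncloud container.
server {
    listen 443 ssl;
    server_name cloud.example.com;

    ssl_certificate     /config/keys/letsencrypt/fullchain.pem;
    ssl_certificate_key /config/keys/letsencrypt/privkey.pem;

    location / {
        proxy_pass http://192.168.1.50:80;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}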


Data

Data.

What data do I consider important?
I had a backup scare a while ago, thinking I'd lost several years' worth of photographs. It turned out I did have some level of backup, but not all of the photos could be retrieved.
So I suppose Photographs are one thing I feel I need to backup. What else?

Files?

What kind of files am I even thinking of, I wonder? I haven't tried to write any short stories for years, and the spreadsheets I created with Sarah to do things like plan chores don't really seem backup-worthy.
Personal projects, I suppose would count, so that would be the various little programming projects I’ve started and abandoned in the last decade.
My university work, and most importantly my 3rd year project. I worked hard on that, and even if it's unlikely I'll ever do anything with it again, it would be a shame to lose it.

Music

I've got a lot of music files. A lot of these are ripped from CDs, both mine and my parents'. Some are files I recorded from records, some are things I was given by friends, some of it is Sarah's. Replacing the actual files would be a pain in the arse: re-ripping everything would take several days of solid work, and some of the CDs are in poor enough shape that I'm not sure they'd rip again anyway.

Videos

I've made a few videos, just short personal things mostly. Most of these are on YouTube, since that was the primary mechanism for sharing things with friends. But if I should lose my Google account, or Google went under or something (not that it's likely to happen), they'd be gone.

Paranoia

A long time ago, a member of my family asked me if I still liked Google. This was just after Google had started to really take off with their suite of applications, and Google Mail had become a mainstream player.
At the time, they were poking fun at my anti-Microsoft attitude and my dislike of massive IT companies. They were right to do so; I was probably the kind of person to spell Microsoft with a dollar sign.
But as the years have gone on, it’s difficult to ignore that Google (Alphabet) has become a deeply scary company.
Even if they weren't, it's probably pretty foolish to have so much of my identity tied to a single service, and it's also amazingly foolish to keep any important information in a free online service. I know, since I've been that fool before: when Hotmail addresses were migrated to outlook.com addresses, any "inactive" accounts were purged of all old emails if no one logged in via the web browser within a certain number of days. I'd had that email address for 10 years, and suddenly everything on it was gone. It'd be daft to assume that Google never ever EVER has that kind of thing happen.
So, I suppose I’m a tech version of a prepper. There might already be a word for that, not sure. Hopefully I won’t be suspected of being a new Unabomber or anything.
So, my data.

Backup solution.

The saying goes "3 copies, 2 media, 1 off-site backup". Currently, as of writing, I have one "master" copy on my NAS, but that's only got a single hard drive in it. This means it barely qualifies as a NAS at this point. Sure, it's storage attached to the network, but only a fool would actually host data on it (yeah, I'm a fool).
A 2nd hard drive has been ordered and dispatched from some enormous warehouse, so soon that single hard drive will be mirrored, making me a bit less foolish.
Still, that’s only 1 medium. I’ll need to think a bit about what might constitute a 2nd medium.
The off-site backup is going to be handled in 2 ways. First of all, I've subscribed to CrashPlan. I've not actually uploaded any data to it yet, since it's been a bit tricky to set up on FreeNAS. This will be my cloud backup for now, although I'm open to switching cloud backup providers at any point.
The 2nd off-site backup will be a similar NAS box running at my parents' house. I'll dedicate some time to getting that set up when I visit them at Christmas. It can store their data, and I'll rsync my data to it, and their data to mine, along the lines of the sketch below.
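
The rsync half of that is simple enough; something like this, run nightly from cron, would push my side over SSH (the hostname and paths here are invented for illustration):

# Push my data to the NAS at my parents' house (illustrative hostname/paths).
rsync -az --delete /mnt/<Pool>/photos/ parents-nas:/mnt/backups/photos/

Their data coming the other way is just the same command run from their end, with the paths swapped.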

The Paranoid bit

Well, I don't think it's really paranoia. But I'm not going to be entrusting anything more to Google from now on, and I need to apply that retroactively to their services. This means I will need to pull down from their servers everything I've ever uploaded, and take a copy of every email I've ever sent or received through them. This will all go into my backups as well.
I will also need to remove the DRM from any ebooks, films or music I've ever bought from them. This might be a bit tricky.
Of course, this won't happen overnight. I've already begun migrating my emails away from Google, but I still use them as my primary contact for a lot of online services. Amazon, for example, uses my Gmail login. I suppose I want to get things that are actually irreplaceable away from them, so family and friends can have my non-Gmail account, and all of the faceless companies that want my email address for spurious reasons can have the Gmail one. In this fashion, I will fill my Gmail inbox with SPAM.

FreeNAS Setup

Installing FreeNAS.

I'd read on various blogs and how-to sites that you should install FreeNAS to a USB stick and boot from that. This leaves the hard drives free to be proper network-attached storage, without sacrificing a partition on them just for the OS.

What I ended up doing was using VirtualBox with USB pass-through to install the OS onto a USB thumb drive. The USB stick I tried was an old Sandisk Cruzer Slice I've had since 2010, which has usually been my OS install disk since I don't have a DVD-ROM drive any more.

While booting I was having no end of problems. It took a good 10 minutes to go from POST to GRUB, and after GRUB it would drop to the "mountroot" screen, which looked a bit like this:

[screenshot of the mountroot error prompt]

This led me down a rabbit hole where I started to wonder if maybe there was something wrong with the server I was using.

After several days of head-scratching, I swapped to a different, newer USB stick, which worked fine. I guess I must have worn the Sandisk out over time.

This little cube sits underneath my whisky cupboard, silently plotting to back up files

Share Setup

At the moment, the NAS contains only a single hard drive, which is only 350GB. The plan was to use this to experiment with, and once I’m happy, to replace it with several large capacity drives in some kind of RAID configuration.

Even with this limited capacity, it’s actually still enough to store all of my photos and all of our music, so it’s a good test of the system’s capabilities and a good way to muck about with the various share options.

Music: CIFS read/write, and NFS read.

Films: NFS read/write

Photography: NFS read/write


Music needs to be CIFS read/write because Sarah's laptop is where all digital music is managed, thanks to iTunes (we hates it FOREVER).

It needs NFS read so that Kodi, running on the Raspberry Pi, can access it, and NFS is faster for reads over the network than CIFS. Mounting it on the Pi looks something like the sketch below.
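
Kodi can also browse nfs:// sources directly, but a plain mount works too. A minimal sketch (the NAS address and share path are examples, not my real ones):

# Mount the music share read-only on the Pi (example address and paths).
sudo mount -t nfs -o ro 192.168.1.10:/mnt/<Pool>/Music /mnt/music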


Nixie things

With my missus’s birthday rapidly approaching, I have decided to manufacture her something neat.

To that end…
A NIXIE CLOCK.

"Nixie" is a brand name for a type of cold-cathode glow-discharge device, used for a relatively short period, from the 1950s to the 1970s (before LED technology reached maturity), to display numbers, letters and symbols. They are now produced only by a few die-hard enthusiasts, but ex-Soviet stock is reasonably readily available on the Internet.

They look something like this:

[photo of a Nixie tube]

The purpose of this post, however, is as follows.
As a byproduct of the majority of available Nixies being Soviet-made, their datasheets are written in Russian.

And I have translated the one for an INS-1 ‘indicator’ point (not a true Nixie lamp, but a start).

So here it is, taken from two documents: one that ships with each batch (firing voltages and current), with the physical properties taken from another:

INS-1

The gas-discharge lamp is designed to display information in the form of a point on an information display.
The body is cylindrical glass. Weight: no more than 1.5 g.

The cathode is marked with a dot (NOTE: this contradicts most of what I have read on various forums! However, the datasheet clearly has "анод" (anode) and "катод" (cathode) marked, with the dot on the cathode).


Instructions for Use

Firing voltage: 65 V min, 90 V max

Sustaining voltage: 55 V max

Current: 0.5 mA

Vibration loads:

Frequency: 1-1000 Hz
Acceleration: 98 m/s² (10 g) max

Multiple impacts:

Acceleration: 147 m/s² (15 g) max
Impact duration: 2-15 ms

Single shock:

Acceleration: 1472 m/s² (150 g) max
Impact duration: 13 ms

Ambient temperature: -60 to +85 °C
Relative humidity: 98% max
Maximum air pressure: 294,198 Pa (3 kg/cm²)

(The datasheet prints the impact durations as "m/s", but milliseconds is clearly what's meant.)
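
As a sanity check on those numbers: running one of these from a 90 V supply (the top of the firing range), the series resistor has to drop the difference between the supply and the 55 V sustaining voltage at 0.5 mA, so R = (90 - 55) / 0.0005 = 70 kΩ. The 90 V supply and the back-of-envelope maths are mine, not the datasheet's.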


FreeNAS backups and shares

I've been thinking for a long time about solidifying and formalising my backup procedures. To be honest, my existing system is pretty awful, and it's already caused me problems in the past.

Existing Backup solution

On my desktop, I have a 250GB SSD where my operating system (Xubuntu) lives, and a 360GB HDD called "coffee", which is where I store photographs, music, videos, downloads, etc. MOST of what I download ends up here, but some ends up on the SSD when I forget to move it.

I also have numerous external hard drives, where a collection of media I’ve built up over the last decade lives, in a pretty awful folder structure. Some of this is duplicates, some of it isn’t, some of it is corrupt, some of the hard drives are dead.

I know for a fact that the majority of photographs I took between 2006 and 2010 are gone forever, because the drive I was storing them on died.

Sarah’s files

Sarah’s laptop is slightly better in some ways, and much worse in others. It’s better because File History in Windows 10 is turned on, but worse because:

  • The drive that the file history is on is one of my many external hard drives
  • We never connect the drive
  • Her filing system is non-existent.

*It’s really bad. Her Music is so scattered and so poorly named that it’s going to take me a while to get it sorted out. I’m going to try out a few different tools for automatic tagging and renaming, see where it gets us.*

FreeNAS & HP ProLiant MicroServer

I’ve played with FreeNAS before, and I love how it works and what it can do, but I’ve never had any dedicated hardware to run it on. That’s changed now that I’ve bought one of these:

[photo of an HP ProLiant MicroServer]

I’ve begun setting it up and getting a working FreeNAS install on it, but I’ve got a lot of thinking to do about how it’s going to all work and fit together.

I’ve got a few basic requirements:

  1. Have regular backups from Sarah’s laptop
  2. Have regular backups from my desktop
  3. Have a network share configured on it for all of our pictures
  4. Have a network share for all of our Music
  5. Install the OwnCloud plugin
  6. Our phones must auto-backup images to Owncloud
  7. Owncloud must somehow share the storage space so that our images are visible from Owncloud and our phone images are visible on the network share

Once this is all done, I want to be able to do the following:

  1. Access the music shares from Kodi, on my raspberry pi. The Raspi is hooked up to my amplifier, and I want to control what it’s playing from my phone.
  2. Access Owncloud from the internet. This will be a small project all on its own

I need to make a few decisions about how the shares will work. If I'm already backing up everything from Sarah's laptop, do I want to use Windows File History to make the backups, or something else?

If I use File History, then it’s going to dump all versions of everything in a network share. I don’t really want all the old versions to be visible to every other device (and Owncloud) that is using that particular share or dataset.


Steamleft II

A while ago I discovered Steamleft, and decided to use it to try and track my progress in finishing my backlog of Steam games. It has some major flaws, such as counting games I've already beaten, just not on Steam (e.g. Half-Life: Blue Shift). It also has really, really high estimates for how long it should take me to "complete" Dota 2, a game I got for free and despise (because I suck at it).

Every now and then I run gnuplot over my log file of hours left in Steam. Here's a sample:

[graph of estimated hours remaining over time]

The data points between May 9th and May 16th are probably from when my estimated hours to completion were modified by a free trial, or something.

The sudden drop-off in the back end of April, I've no idea what caused that one.

Here’s a newer graph:

[updated graph of hours remaining over time]


The sudden drop-off is again probably from a game I had for a weekend or something. I have bought a few games since I started this, but not too many.

Steamleft

Somebody sent me to this tool.

If your Steam profile is public, it will show you how long you'd need to play to "complete" all the games in your Steam library.

My own entry is here.

It relies on average game-length statistics from www.howlongtobeat.com, which will obviously be a bit hit-and-miss with games like Kerbal Space Program. For open-ended sandbox games like KSP, what counts as beating it?

Actually, I’ll go and look it up:

[screenshot of the How Long to Beat entry for Kerbal Space Program]


Well.

I suppose KSP does have a Career mode these days, and maybe that’s what’s been submitted for the Main Story stat? Even so, I could play this game for literally the rest of my life and I’d still probably find stuff to do.

So SteamLeft won’t be perfect, by a long shot. I’ve got whole sections of my library dedicated to multiplayer games that I’ll never “beat”, and I’ve got a load of duplicate games as well, for when things have a separate beta branch entry in my games list.

However, I wrote myself a little script to scrape my steamleft entry daily, and log the results to a file.

The script is here:

#!/bin/sh
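# Append today's date and the scraped "continuous hours" figure to the log file.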
 wget -O - http://steamleft.com/span/76561197973314452 | echo $(date +"%d-%m-%Y") $(xmllint --html --xpath "/html/body/main/div/div/section[1]/div[4]/text()" - 2>/dev/null) >> /home/anorak/steamleft/bob

I’ll break down what it’s doing:

wget -O - http://steamleft.com/span/76561197973314452

Wget grabs whatever content exists at the URL you give it. In this case, the URL is the Steamleft page for my Steam account. This information is presumably looked up in real time when you visit the page.

The "-O -" argument sends the downloaded content to standard output rather than writing it to disk. This is useful because we don't need to read a file back from disk afterwards; the next part of the command can read it directly from its standard input.

| echo $(date +"%d-%m-%Y") $(xmllint --html --xpath "/html/body/main/div/div/section[1]/div[4]/text()" - 2>/dev/null) >> /home/anorak/steamleft/bob

The "|" character is a pipe, and it's used for directing the output of the command before it to the input of the command after it. (Strictly speaking, echo ignores its standard input; it's the $(xmllint ...) command substitution later on that actually reads from the pipe.)

echo $(date +"%d-%m-%Y")

This outputs the date in the format “dd-mm-yyyy”.

xmllint --html --xpath "/html/body/main/div/div/section[1]/div[4]/text()"

I had to install xmllint myself; it wasn't part of my standard Ubuntu install. It's a program for parsing XML. The "--html" option allows parsing of HTML (HTML is often not XML-compliant).

The "--xpath" option lets me grab the actual element I want from the page. I found the XPath by looking through the Steamleft page in Google Chrome:

[screenshot of Chrome's developer tools showing the XPath]

$(xmllint --html --xpath "/html/body/main/div/div/section[1]/div[4]/text()" - 2>/dev/null) >> /home/anorak/steamleft/bob

The "-" tells xmllint to read its input from standard in.

The "2>/dev/null" redirects all errors to /dev/null. And there WILL be errors, because it's HTML, and XML parsers do not like HTML very much.

>> /home/anorak/steamleft/bob

This appends the result to the end of the output file.

The output file looks like this:

30-03-2015 1768 continuous hours
31-03-2015 1768 continuous hours

Thrilling.
This script is set to execute once per day. I'll leave it running for a few months, and see how I'm doing at beating my library. I'll hook it up to gnuplot at some point too, for shits and giggles; a rough sketch of that is below.
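
For when I do, a minimal gnuplot script over that log format would look something like this (the styling and output path are guesses, not something I've actually run):

#!/bin/sh
# Plot hours-left (column 2) against the date (column 1) from the log.
gnuplot <<'EOF'
set xdata time
set timefmt "%d-%m-%Y"
set format x "%d %b"
set ylabel "Continuous hours left"
set terminal png size 800,480
set output "steamleft.png"
plot "/home/anorak/steamleft/bob" using 1:2 with linespoints notitle
EOF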

I've been fairly good at not buying new games, with the intention of beating my back-catalog. This should give me an indication of how I'm doing. And it gave me an interesting little exercise. In fact, it took me longer to write up how I did it than it took to actually do it.


Half-Life 2: Point Insertion Part II

#ABANDON POINTLESS PROJECT. I lost the screenshots I was taking for this, and then found this post in my drafts folder. I cannot be arsed to continue this.

Christ on a bike, it's been 2 years since I last thought about doing anything with this.

[screenshot of a glitched, partially melted NPC: his flesh is jelly!]

We last left our hero GORDON FREEMAN standing in front of a turnstile. Gordon felt threatened by the irregular gate, and remained motionless for some time.

As Gordon watched, a man walked through the gate. And when I say through the gate, I don’t mean that the turnstile turned around him, as is proper.

I mean he walked THROUGH the gate. He warped through it. He phased through it. He ignored everyday physical laws.

Gordon, being a theoretical physicist, calms down and stops worrying so much. He realises that the man in question is in possession of alien technology. Or he is possessed.

On the other side of the gate is the main station arrivals/departures area. It's got a few people in it to interact with, all of whom inform the setting, and all of whom I'll talk about in a moment.

Looking at this room 9 years later is strange. These days, you'd generally expect it to hold a dozen people or more, to make it seem busier and make the world seem more real. However, this can be a detriment in some ways: if you populate a room with that many people, you'd want each one to have at least SOMETHING to say, and it becomes harder and harder to avoid problems such as NPCs standing around doing nothing. Not interacting, not talking, not going anywhere. Even here, with these few NPCs, that problem is not totally avoided.

Each of the NPCs has a specific task, an action they must perform. They are mere puppets, condemned to play out the same stage directions over and over again, with no variations.

Paranoid Guy

Paranoid Guy looks a little older than most of the people we talk to here. His line delivery is twitchy and furtive; he doesn't want to be seen talking to you, but still feels obliged to warn you.

According to him, They have been putting something in the water to make everyone forget (sapping our precious bodily fluids, no doubt). If this is true, and not just the paranoid delusions of an old man, we never get a hint of it. For all Gordon knows, he could have been talking about stuff in the water even before the Combine invasion, and he probably believed in chemtrails and the Illuminati.

However, it wouldn't really be out of character for the Combine to be drugging the population into submissiveness and compliance. If people can't easily recall a time before the Combine, then they'd be less likely to rebel.

Of course, history (and 1984) shows us that there are much subtler ways to achieve this. Mass propaganda and fear can accomplish much the same thing.

Muttering Guy

Muttering Guy paces back and forth in front of the departures/arrivals board.

Petulant Guy

Guys Soon To be Dragged Away By Secret Police and Shot