Building the Debian Handbook

What follows are instructions for creating a local HTML copy of the Debian Administrator’s Handbook (which is a very useful source of information for anyone working with any Debian derivative, including Ubuntu and Raspbian). All work related to this project was done on a Raspberry Pi Zero running Raspbian, so I suspect it will work on anything running any Debian derivative (although Ubuntu 16.04 is the only other system I’ve tested this on so far).

Open up a terminal, and issue the following commands to get hold of the source code:

sudo apt install git
sudo git clone git://anonscm.debian.org/debian-handbook/debian-handbook.git

Install the packages required for building:

sudo apt install publican publican-debian

Build the HTML files:

cd debian-handbook/
sudo ./build/build-html

It might take a while to build, especially on the sort of hardware I’ve been using. This might be the point to make a cup of tea.

Copy the HTML files into the root of your web server:

sudo cp -R publish/en-US/Debian/8/html/debian-handbook/ /var/www/html/

At this point you should be able to browse to the home page of the directory by navigating to the hostname or IP address of your web server.
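
If you don’t already have a web server installed, Apache is enough for this (any web server that serves files from /var/www/html will do):

sudo apt install apache2

The handbook should then be available at http://<hostname or IP>/debian-handbook/.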

Getting up and running with a CHIP

Tonight I finally received two CHIP boards (sort of a cross between a Raspberry Pi and a Pi Zero). I’d kickstarted these about a year ago and totally forgotten about it, so it was a nice surprise. Whenever I get my hands on something like this the first challenge is to power it up, boot an operating system, and see what it will do.

What follows is one way to get one of these devices powered up, connected to a wifi network, and with access to a graphical desktop. These instructions will work on macOS and Linux; for Windows you may need to consult the manual to get the relevant type of terminal access.

The only thing you’ll need (apart from the CHIP itself) is a microUSB cable. As an avid Raspberry Pi enthusiast I have quite a few of these lying about so there was no additional expense. Plug the small end of the cable into the relevant slot on the CHIP and the other end into a spare USB port on your computer. You’ll then need to see what device name your computer has assigned your CHIP by issuing the following command in a terminal window:

ls /dev/tty*

Find the output that looks something like /dev/tty.usbmodemFD1223 and make a note of it. Then issue the following command (replacing my device name with whatever yours is):

screen /dev/tty.usbmodemFD1223 115200

At that point you should get a login prompt. Log in as user chip with password chip (yes, I know), and you should find yourself logged into a fairly minimal Debian installation.

As yet there is no network, but as the CHIP has wifi we can set this up fairly easily. In the logged-in terminal session enter the following:

sudo nmcli device wifi connect '(your wifi network name/SSID)' password '(your wifi password)' ifname wlan0

The output should be something like:

Connection with UUID 'e9e45ce8-9961-4116-a7eb-d526e60af3ee' created and activated on device 'wlan0'

At this point you should have a network connection. Test it by doing some software updates:

sudo apt-get update && sudo apt-get upgrade

When you’re done (it might take a while) install xrdp to allow you to initiate remote desktop connections to the CHIP:

sudo apt-get install xrdp

Once that is done, create a new RDP connection using your client of choice. Find out the IP address using ifconfig or just use the name chip.local, enter the username and password, and you should see a graphical desktop with an application menu and a fair few applications.
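
For example, to check the IP address from the serial session:

ifconfig wlan0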

I’ve also had some success plugging an ethernet adaptor into the CHIP’s USB port and connecting via ssh, but on most occasions the device powered down before I could do anything useful with it. This is the same setup I use with my Raspberry Pi Zero, so I know it theoretically works, but I need to investigate how much power the adaptor is drawing as it looks like the device is struggling to power it.

An updated guide to using Pandoc for document conversion

I wrote about Pandoc last year, but I’m using it more and more and I’ve found myself editing the original post a fair few times. This is the updated 2016 version that gathers together useful commands I’ve learned so far.

Last year I found myself needing to do a lot of document conversion, and maintaining documentation that needs to be available in a variety of formats (HTML, Word documents, Markdown and PDF). My tool of choice for this sort of thing is Pandoc, which is available for Windows, Mac OS X and Linux, although most of my usage so far has been on Linux and Mac OS X (it’s a command line package that can output to Dropbox, so it doesn’t matter where it runs really).

There are instructions for installing Pandoc on quite a few platforms. I’ve found that following these is generally enough, although it’s worth installing the latest version of the .deb packages rather than the one in the repositories.

On Debian/Ubuntu I also add the texlive-latex-extra package, but that’s largely because it gives me a specific Beamer theme I like to use.

If you’re using Pandoc on Mac OS X there is one more command you’ll need to issue prior to the first time you want to create a PDF file:

sudo ln -s /Library/TeX/texbin/pdflatex /usr/local/bin/

This will ensure Pandoc knows where to find pdflatex. If this step isn’t followed then you’ll likely get an error message along the lines of pandoc: pdflatex not found. pdflatex is needed for pdf output.
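
A quick way to check that the link worked is:

which pdflatex

If that prints a path (such as /usr/local/bin/pdflatex) then Pandoc will be able to find it.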

Pandoc works for me because I write everything in markdown, and Pandoc is great at taking markdown and converting it into almost anything else. It’s also good if you need to create a PDF, a Word document and a slide show from the same document. The syntax is fairly simple for most document types:

For example:

pandoc input.md -s -o output.docx
pandoc input.md -s -o output.html
pandoc input.md -s -o output.epub

Conversion to PDF works the same, although I’m not a fan of wide margins, so I tweak it slightly:

pandoc -V geometry:margin=1in input.md -s -o output.pdf

For a Beamer slide show you’ll need something like:

pandoc -t beamer input.md -V theme:metropolis -o output.pdf

Pandoc does a lot more, but the documentation is great, and the commands above should be enough to get you started. If you want to try out the functionality in a web browser then http://pandoc.org/try/ should be able to handle most types of conversions.
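
The same pattern also works in the other direction; for example, something like this should turn a Word document back into markdown:

pandoc input.docx -s -t markdown -o output.md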

Setting up new Ubuntu machines

I’ve had to set up a few Ubuntu desktop machines recently (for my own use), and I thought it was worth documenting what I install on each one, and how I automate those installations as much as possible.

Add a script to make updating software easier

Create a new file called updateall containing the following:

#!/bin/bash
sudo apt-get update
sudo apt-get upgrade -y
sudo apt-get dist-upgrade -y
sudo apt-get clean -y
sudo apt-get autoclean -y
sudo apt-get autoremove -y
sudo purge-old-kernels -y

Move it to /usr/local/bin/ then make it executable with sudo chmod +x /usr/local/bin/updateall.
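
In full, that looks something like this:

sudo mv updateall /usr/local/bin/
sudo chmod +x /usr/local/bin/updateall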

Add some software from the Ubuntu repositories

updateall
sudo apt-get install git gimp byobu vlc ubuntu-restricted-extras build-essential hexchat openssh-server unity-tweak-tool youtube-dl

Install Spotify

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys BBEBDCB318AD50EC6865090613B00F1FD2C19886
echo deb http://repository.spotify.com stable non-free | sudo tee /etc/apt/sources.list.d/spotify.list
sudo apt-get update
sudo apt-get install spotify-client

Install Atom

cd Downloads
wget https://github.com/atom/atom/releases/download/v1.7.3/atom-amd64.deb
sudo dpkg -i atom-amd64.deb

The version number may be different – get the latest version from https://github.com/atom/atom/releases/.

Install Dropbox

wget -O dropbox_2015.10.28_amd64.deb "https://www.dropbox.com/download?dl=packages/ubuntu/dropbox_2015.10.28_amd64.deb"
sudo dpkg -i dropbox_2015.10.28_amd64.deb

The version number will be different, but hit tab after typing dropbox and it should autocomplete. If that doesn’t work, download the latest version from https://www.dropbox.com/install?os=lnx.

Install tails-installer

wget --continue http://dl.amnesia.boum.org/tails/stable/tails-i386-2.3/tails-i386-2.3.iso
sudo add-apt-repository ppa:tails-team/tails-installer
sudo apt update
sudo apt install tails-installer   

Install Pandoc

wget https://github.com/jgm/pandoc/releases/download/1.17.0.2/pandoc-1.17.0.2-1-amd64.deb
sudo dpkg -i pandoc-1.17.0.2-1-amd64.deb
sudo apt-get install texlive

See my updated guide to using Pandoc for more on how I configure and use it, and also for a fix for a Mac OS X related bug to do with rendering PDFs.

Cosmetic tweaks

Go to System Settings --> Appearance
Change theme from Ambiance --> Radiance
Reduce Launcher size to 24
Change desktop wallpaper
Enable workspaces

Open Unity Tweak Tool and configure so that hot corners work, with the top left and right corner doing a window spread (largely because that’s how my Macs are set up, and also how Gnome 3 works).

Go to System Settings --> Security and Privacy and turn off all “phone home” functionality.

Building a media centre with a Raspberry Pi and OpenELEC

My project for the Easter vacation has been to build a media player using a Raspberry Pi and OpenELEC. Setup was fairly straightforward, but I thought it was worth writing up anyway – especially as I’m probably going to make further changes to the setup as I find new features to add.

Hardware

I went with the new Raspberry Pi 3, which is plenty powerful enough for this project. I also used a 16GB SD card (the largest unused one I currently have), and a case that looked like it would handle being jostled around in my bag. The device also requires power and HDMI cables (which I already had), and a keyboard/mouse/monitor/ethernet cable for setup.

Software

OpenELEC is one of the installation options on the NOOBS image, so I simply downloaded that, copied it to the SD card, and installed it from there. It requires a network connection to install, but is a lot easier than having to copy the image using dd. I went with the default options in all cases, although it’s worth noting that if you enable ssh access it’s not possible to change the root password at all, so you’ll need to disable it after setup (not that setup, or anything else, requires shell access).
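
For comparison, writing an image with dd looks something like this (the image filename and /dev/sdX are placeholders; check the real device name with lsblk first, as writing to the wrong device will destroy data):

sudo dd if=OpenELEC.img of=/dev/sdX bs=4M status=progress
sync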

Once installation had finished the device booted into the default Kodi interface. A web-based remote could be accessed by browsing to the device’s IP address, and it could be accessed as network based storage from all of my computers. Then it was simply a case of dropping some media files (movies and music) into the respective folders and testing that content could be played. I copied across some MP3, MP4 and AVI files, all of which played fine.

Addons

The original plan for this project was that I’d end up with something that could play movies and music on my TV, and that could handle storing a small amount of content locally so that when I end up in a hotel room with a few hours to kill I have something interesting to watch. The solution I’ve built ticks all those boxes, but I was curious to explore what else OpenELEC could handle.

After exploring the interface and available software for a little while I found channels for Last.fm scrobbles, BBC iPlayer and TED talks. All of these installed and worked fine. Adding iPlayer started me thinking about other free to view TV channels, and at this point I remembered that the last edition of Linux Format had an article on using a Pi Mini for a similar project, and that there were instructions for adding a whole host of other services. Their instructions for adding ITV player were as follows:

Navigate to System > File Manager. Select ‘Add Source’ followed by ‘<None>’, enter http://www.xunitytalk.me/xfinity and select ‘Done’ followed by ‘OK’. Hit Esc then choose System > Settings > Add-ons > Install from ZIP file. Select xfinity from the list of locations, select ‘XunityTalk_Repository.zip’, hit Enter and wait for it to be installed. Now select ‘Install from repository’ followed by .XunityTalk Repository > Video add-ons, scroll down and select ITV. Choose Install and it should quickly download, install and enable itself.

This worked fine, and also gave me access to a lot of other channels that I could add.

There are a lot of things I’ve not explored on this device yet, but at the time of writing I’ve got BBC and ITV channels (live and catchup), TED talks, and a variety of locally stored media. Music I play scrobbles to Last.fm, and I can drop new media onto the device from my computer. I figure that all I’ll need to travel with is a power cable, a HDMI cable and a small mouse (all of which I already have), and I should be sorted. I also tested a trick I’ve used before which involves sharing a wifi connection via ethernet on my laptop to get the two devices to talk to each other long enough to add/remove media, which might also prove useful.

Updating Ubuntu

I’ve been using Ubuntu a fair bit over the last few weeks, both at work and at home. I have a number of projects on the go, and I’ve found myself needing to maintain a few different machines (both LTS and current) to run experiments on, and to build live servers and services.

One thing I’m very big on is keeping software up to date, and I thought it was worth sharing the script I use to update my Ubuntu machines (and then to delete anything that is no longer required, like old kernels).

#!/bin/bash
sudo apt-get update
sudo apt-get upgrade -y
sudo apt-get dist-upgrade -y
sudo apt-get clean -y
sudo apt-get autoclean -y
sudo apt-get autoremove -y

I create this on each machine (in a file called updateall), move it to /usr/local/bin/ and then make it executable with sudo chmod +x /usr/local/bin/updateall. I then run it when I want to update software (or even better, add it as a scheduled job so it runs once a day, as in the example below).
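
One way to set up the scheduled job is a cron entry; the time here is just an example, and it assumes the script lives at /usr/local/bin/updateall as above:

echo '0 4 * * * root /usr/local/bin/updateall' | sudo tee /etc/cron.d/updateall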

If I end up with many more machines I’ll need to find something more elegant, but for now this will suffice.

Building a budget computer

I’ve been meaning to set up a low-powered Linux machine for a while, but developing a new Ubuntu-based service at work made me realise that having something at home to experiment with would be useful. I wanted something with real hardware, but also something that wouldn’t use too much power or cost me too much money.

After a bit of research, I settled on a Gigabyte Brix BXBT-2807, which is a bare bones solution that requires a hard drive, memory, and an OS to complete. Amazon says that this model now costs £94.98, although with my Prime Now discount and another voucher it cost me just over £60. I chose this model because it’s got a USB3 port (as well as 2 USB 2 ports), and it outputs to both HDMI and VGA meaning I can use it with both my existing monitor and my TV. Size-wise it’s a roughly square black box that doesn’t look big enough to be a real computer, and which takes up about the same space as a Mac Mini (being much narrower but slightly taller).

I decided to make this machine as powerful as it could be, just in case I ever needed to use it to do anything more taxing than web development and a little light browsing. I already had a 128GB SSD (which would have added about £40 to the cost), but neither of the sticks of memory I had were suitable (one was too high a voltage, the other was only 2GB and I wanted more than that). I ended up buying an 8GB stick for around £30, which maxes out this particular case as it only has one memory slot.

Assembly was straightforward, and just required a Phillips screwdriver. Once I’d fitted the hard drive and memory I connected the computer to my existing monitor, plugged in a keyboard and mouse and booted it from an Ubuntu installation USB. It booted from the USB fine, and installation didn’t take too long at all. I went with 14.04 LTS because it’s what the machine at work is running, and I do enough software updates on my other machines without having something else that was on the bleeding edge.

All in all this machine is working well (and very well for the price). I needed to add a bluetooth adaptor to get my solar powered keyboard working (but I carry a couple of these with me anyway), and this computer seems incapable of connecting to a 5GHz wireless network, but these are the only two things that are sub-optimal, and are easily fixed with a bluetooth adaptor and an ethernet cable. I’m also very impressed with how fast this machine is, and even how quickly it will perform processor-intensive tasks like ripping DVDs.

So far I’ve set up a minimal Plex server on it, plus a LAMP development environment and the tools for making Ubuntu live USB installers. I’ve also used it for a couple of days for email/web browsing, and didn’t really notice that I was on a much less powerful machine.

I’m very pleased with how quick this was to set up, and it’s good to see that it’s possible to have a fully functional computer for under £150.

A few words about IT literacy

When I’m recruiting new IT staff one of the things I always look for is how computer literate they are. It’s a hard thing to work out, as it’s usually a mixture of what they know and what they have done in the past, but also how they think and how inspired they are by technology. I also try and think back 10 years, to when I was the person on the other side of the desk who was trying to blag that a whole load of dabbling with things at home was enough experience to allow me to support some fairly important systems in a large University.

I sometimes get asked what advice I’d give someone wanting to get into an entry level IT role when they don’t have any experience. I sometimes think that’s the wrong question, because everyone has IT experience, and also the opportunity to gain experience without leaving the comfort of their own bedroom. I thought it might be worth expanding on what I mean by that, and what sort of things would impress me if I saw them on an application form or heard them in an interview. I’d also say that this list is probably a good starting point for anyone who wants to learn more about IT in general.

Use more than one operating system (Windows, Mac OS X, Linux) and learn the skills common to all of them

There is a viewpoint that Microsoft have won the OS war, and that IT professionals should concentrate on familiarising themselves with Windows and MS Office because that is what everyone uses. I don’t share that viewpoint, but I do think that it’s important to use the software that other people are using, because if you want to be able to support that software then you need to know how it works. I think it’s essential to have an overview of all the main operating systems, and I’m particularly interested in people who run more than one, or who have changed their primary operating system and can articulate their reasons why. It suggests they have thought about what they want their computer to do, and that they have considered the financial, ethical and functional criteria that contribute to the decision as to what OS to use.

For instance, my current main OS is Mac OS X. I started to switch from Ubuntu at the end of 2010 in order to better understand an OS I was being asked to implement and be an advocate for in my workplace. The switch took a few months, but by mid 2011 all of my regularly used machines were Macs. I do however maintain machines running Ubuntu and Debian, and am now doing more Linux based work which may warrant a partial switch back at some point. I like using Macs because of the quality of the hardware and software, and the fact that everything generally just works. I dislike them because of the lack of freedom, and the number of decisions about how I use my computer that seem to have been taken away from me. I like using Linux because I can customise my computer to do exactly what I need it to do at no cost to myself or my employer, but I dislike the fact it requires a lot of maintenance, and also that I can’t use some software I require to do my job and therefore need to also maintain a Windows machine or a Mac anyway. I also still maintain that the 11″ Macbook Air is the best computer ever made, and until I find something better I want to continue using one.

What I find about using multiple operating systems (and I’d include Windows in this) is that once you use more than one, you realise they all have things in common, and once you start to spot those patterns then it makes it easier to deal with unfamiliar operating systems. Windows 8 doesn’t faze me in the slightest because I remember the Mac OS9 –> OS X shift, and also the move from Gnome 2 to Gnome 3 (and Unity, and a load of other desktops). The key for me is getting to a point where the desktop doesn’t get in the way of being productive, and that comes through regular use.

As an aside, I’ve switched my main OS a few times, and also maintained two in parallel for quite a while. I was a (classic) Mac OS user until my Mac became too old, and then had a brief (maybe a year) period of using mainly Windows. I switched to Debian in late 2004, and then Ubuntu from 2005. I got another (refurbished) Mac in 2006 and maintained OS X and Ubuntu in parallel until 2009, when I found I was doing everything in Ubuntu and hardly ever turned my Mac on (to be fair, it was very old at this point). I then switched back to OS X in 2011 as detailed above.

What I’ve noticed is that people who have only ever used one OS are often scared of all the other ones, and the easiest way to get over that is to experiment with them. Linux is free, and will install on almost anything, and if you’re in the UK then you can pick up a decent refurbished Thinkpad for around £200 from http://www.refreshedbyus.com/, or a budget desktop without an OS from http://www.ebuyer.com/ for around the same price. Windows machines are also coming down in price every year, and it’s now affordable to maintain more than one machine in ways that it wasn’t 10 or even 5 years ago. And of course virtualisation is now easier than ever (but I’ll mention more about this later on).

Use more than one version of each operating system (or at least know how to use them)

Something else I’ve noticed (especially with people who grew up with Windows XP) is that it’s not just trying another OS that is scary, but moving to a new version of the same OS. It’s certainly worth being familiar with the last couple of versions of anything you’re using and supporting, and having an overview of what the upgrade path would be for someone using something obsolete and unsupported like Windows XP.

I also think that if you’re running (or experimenting with) Linux, then it’s worth trying out at least a couple of desktop environments to see what works for you (and for your computer). I’ve got machines running Gnome 3 (Debian), XFCE (Debian & Xubuntu) and Unity (Ubuntu). None of them are perfect, but all of them allow me to understand the similarities and differences of modern desktop operating systems.

If you use several different operating systems it becomes really easy to see how the user interfaces and features of one will influence another. And once you start to make those connections then it’s fairly straightforward to approach a new operating system or desktop environment and make it work well enough for you to help someone who is having difficulties with it.

Use at least two browsers

If you’re supporting software, then you’re likely supporting browser-based software, and knowing how that software behaves in all of the main web browsers is something you need to be up to speed with. I find the best way to do that is to use at least three browsers regularly, and for me that means Firefox and/or Chrome on my computers, and Safari for my iOS devices. I test everything on all three, and on other browsers as well (although if I’m asked to test things it’s usually because they have only been tested on Windows and someone wants the non-Windows perspective).

As with operating systems, if you use multiple browsers then you are unlikely to be surprised or significantly slowed down when a new browser grabs a decent slice of the market share like Chrome did a few years back. It also makes it easier to switch your main browser if the one you’re using starts to get slow and bloated, or no longer includes features that you really need.

Install a virtualisation tool and set up a new VM

I said I’d come back to this one, because I think it deserves a section to itself. Virtualisation software has been such a game-changer for me, because it has allowed me to continue using multiple operating systems without having to maintain a physical computer for each one. By using software such as https://www.virtualbox.org/ it’s possible to run multiple operating systems on the same machine, and also to set up virtual web servers to experiment with blogging software, wikis, and other CMS related things. I’m currently doing a lot of this sort of thing at work, and it’s great to be able to have virtual servers that are backed up and snapshotted so I can roll them back to the point just before I broke something. Once you’ve developed like this then you’ll never go back, and it will teach you all sorts of skills that are directly applicable to sysadmin work, as well as development and IT support.
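
As a flavour of the snapshot workflow from the command line (VirtualBox exposes the same thing through its GUI, and the VM name here is just an example):

# take a snapshot before trying something risky
VBoxManage snapshot "web-test" take "before-upgrade"
# power off and roll back if it all goes wrong
VBoxManage controlvm "web-test" poweroff
VBoxManage snapshot "web-test" restore "before-upgrade"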

Virtualisation is also great for those situations where you can do 90% of your work in one OS, but need to switch to another one for one or two specific tasks. The guest machine is only using resources when it’s on, and you may find that most of the time you don’t even need to boot it.

Know how to back up your data, and where all copies of your data are

I blog quite regularly about how I back up data, but it’s always worth writing about, as I find that things change as I stumble upon new products. My current plan is based on the 3/2/1 rule, with three copies of everything, on two types of media, with (at least) one remote copy. I use Time Machine, Crashplan and Carbon Copy Cloner to back up copies of my whole computer, and Dropbox, Google Drive and iCloud to ensure that files I use regularly are available on any computer I use.

How it generally works is that any machine that stays in one place (or mainly stays in one place like my heaviest laptop) backs up nightly (via Carbon Copy Cloner) to an external hard drive. I also have a portable hard drive that I back up to weekly with a bootable copy of the two machines where I regularly create data (as opposed to consume it). When I’m not backing up to it, this drive is kept in a different physical location to the machines it is backing up. Additionally, all my music is in iTunes Match, my photos are on two different NAS drives, all my portable computers back up to another machine via Crashplan and/or Time Machine, and everything text based I’m currently working on will exist in either Evernote, Dropbox or Google Drive, depending on what it is and who else needs to access it.

I’ve also started running some experiments with BitTorrent Sync – maintaining a small directory of emergency music and freely available ebooks which I sync between all of my machines, and I also carry around an encrypted USB drive on my keyring which contains a lot of the same sort of stuff, as well as the installer for the latest version of Mac OS X, plus recent disk images of Ubuntu and Debian.

I test my backups monthly (sometimes more than monthly), including booting all the full disk clones to make sure they actually boot. I think this is important. I also try and replace my backup drives every couple of years to ensure that I’m not backing up to something that is likely to fail soon.

Know how to upgrade/replace key parts of your computer

This is something I think is so important, but it seems to be a dying art. Not that I’m surprised though, because Apple (and to a lesser extent other manufacturers) seem to be moving towards a world where individual parts of a computer are not upgradeable, and instead you just buy a new computer when it wears out or gets slow. So many older computers could benefit from a solid state hard drive (SSD) or some more memory, and both of these upgrades will make an old computer feel like a new one. There are plenty of people who will fit parts for you, but this will cost you, and often these are upgrades you can do yourself. Since I’ve been working with technology I’ve upgraded most of my machines (even my Macbook Air), but I do worry that the next computer I buy is likely to be less upgradable than the last.

I learned about computer hardware through buying an old machine from eBay and experimenting with it. I replaced the memory, the power supply and the hard drive, and I’ve still got it sat in the shed 10 years later. There are still plenty of machines out there that you can replace pretty much everything in, and building a PC from scratch is still very much a rite of passage for anyone who is interested in hardware.

Know how to reinstall the OS on your computer

Long gone are the days where operating systems would not be upgraded for years. We’re now in a world where things change at least every 6-12 months, and it’s important that the operating system on your computer is up to date and receiving security updates regularly. Updating software is relatively straightforward on any computer, and we do seem to be moving towards the concept of an app store, where the OS is just another app to be upgraded when a new version comes out. Whatever you’re running, it’s a good idea to know how to upgrade the software on your computer, and also how to reinstall it from scratch. These are things that you can pay someone to do, but you never know where and when computer faults will happen, and the night before a deadline or while you’re overseas are not good times to learn about reinstalling operating systems.

Use more than one office suite, and learn the skills common to all of them

A big part of IT support is knowing about what the people you support are actually using. Arcane terminal commands and knowledge of compiling software will get you nowhere if you are supporting people who largely work with documents, spreadsheets and presentations. Particularly in a corporate or academic environment, knowing about a variety of office suites will serve you well, and it’s important to stay up to date so that you’re not surprised by changes to user interfaces. This is one area of IT that can be tricky to stay up to date with if you don’t use this software yourself, and as someone who writes in a text editor, and only really uses Word for specific work-based tasks, I’m probably not the best person to advise on it. Although the fact that I use Keynote for presentations and Excel for serious data manipulation does suggest I can at least use parts of more than one office suite. I also like LibreOffice a lot, and think it’s one of the most underrated pieces of software out there.

As with operating systems and browsers, there is so much feature-bleed with office software that once you have used a couple of different versions then you start to see how they all do roughly the same thing under the hood. This is also a class of software where manufacturers love to change the UI radically between versions, so be prepared to relearn menus over and over again. Of course, if you use keyboard shortcuts then there should be less learning to do.

Which brings me nicely on to keyboard shortcuts.

Learn keyboard shortcuts

On my main desktop computer I have a solar powered keyboard, which means that even in the cloudy climate of the UK I can pretty much guarantee that it will work. The same can’t be said of my wireless mouse, which is always running out of power and needing newly charged batteries. That doesn’t bother me as much as it might do though, as I’m fairly keyboard-shortcut-literate, and can do most of what I need to do without picking up the mouse. Not only can knowing these get you out of a fix if your mouse or trackpad stops working, but it’s a lot quicker to open or save a file using the keyboard when your hands are already touching the keyboard to type. It’s also a lot better on your wrists, and will make you look like you know what you are doing with your computer. It’s one thing I always look for when I’m trying to judge how computer-literate someone is, and it is usually a very good indicator.

A list of keyboard shortcuts for Mac OS X can be found at https://support.apple.com/en-gb/HT201236. Some of these will work on other operating systems, but I’m sure there are similar lists elsewhere (Ubuntu even has one on the screen the first time you launch the Unity desktop).

Host a website

In the days of Facebook, Twitter, Tumblr and a thousand other readily available web-based content sites, it’s rare to find someone who doesn’t have some sort of web presence. When I started out with computers it was harder to get content online, and I had to learn a fair bit of HTML just to have a simple home page, whereas now I can just create an account online in a few minutes. Despite the fact that it’s so easy, I still think it’s valuable to know how the nuts and bolts work, and how to set up your own website that you host and control yourself. My first site was hand crafted HTML, and my current website is a self-hosted WordPress blog (cloud hosted now, but originally hosted on a server under the desk in my office).

I think it’s still valuable to know how to configure a web server (I use LAMP – Linux, Apache, MySQL and PHP), and install a CMS like WordPress on it. Even if you don’t use it for your main blog it is something you might be asked to do one day, and it’s a skill set that I’ve found myself using over and over again (and is in fact something I’m working on professionally right now).
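
On Ubuntu or Debian the basics of a LAMP stack are only one command away (these are the Ubuntu 16.04 package names; older releases use the php5 equivalents):

sudo apt-get install apache2 mysql-server php libapache2-mod-php php-mysql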

Learn a programming language (or two)

I’m not a programmer, but I do know a little bit of HTML, CSS and PHP. Programming languages are not required for IT support, but as programming is largely about problem solving there are a lot of transferable skills. Programming is also useful for solving in-house problems that your support tools can’t handle (like writing a password generator or something to convert proprietary mailbox formats to something more open – both requirements I’ve come across in my own team).
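
As a trivial sketch of the sort of in-house tool I mean, a password generator can be a few lines of shell script:

#!/bin/bash
# print a random alphanumeric password, length given as the first argument (default 16)
LENGTH=${1:-16}
tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "$LENGTH"
echo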

Learning some basic scripting is also a good idea, and a familiarity with shell scripting and Windows PowerShell scripts is never going to be wasted time and effort.

Know what you can do and what you can’t do

And finally, this. It’s all very well to look and sound impressive by stretching your IT skills and knowledge to the extreme, but it’s important to be honest with yourself about what you can and can’t do, and which of your theories are backed up by practical experience. Experiments are all very well in the comfort of your own home, but when you’re dealing with other people’s computers and data then ensure you know what you are doing and when to ask for help and guidance.

10 things beginning with S

There is a meme going round where people are asked to name 10 things beginning with a certain letter that mean something to them. I was given the letter S, and will attempt to come up with 10 things that are not people I know (I could probably add another 10 if I include people, and I’m always wary of missing people out).

1. Spotify – It’s revolutionised the way I listen to music, and also the way I discover music to a certain extent. Before Spotify I always had to own things I like, and now I find I am perfectly content with streaming music. I also like the fact that I can save songs to my phone/iPad for offline listening. Not that I’m ever offline for long, but the functionality is very useful for those times when I am.

2. Sennheiser – Both sets of headphones I use at the moment were made by Sennheiser, and I suppose I have a certain amount of brand loyalty. I particularly like times when I’m able to wear my huge over-ear headphones, but am increasingly finding that they are too bulky for anything but listening to music in the house.

3. Substance – Two great compilation albums from the 80s – one by New Order and one by Joy Division. Between them they contain some of the best music I’ve ever heard, and tell the story of one of the most interesting (and tragic) musical transitions.

4. Sunrise – I like the sun. I like it being light and being sunny. The sun energises me, and the rising of the sun (which I see all too often) is something that I really like to watch and occasionally photograph. Sunrise is also the name of the calendar app I use on my Mac and iPad to keep my life organised, as well as being the name of a New Order song I really like.

5. Slackware – Not the first Linux distro I installed myself (that honour goes to Debian), but the one that led to me learning a lot about how computers work when I was cramming IT knowledge into my brain about 10 years ago.

6. Sonic Youth – A band that have been with me since I was at school. They were probably my introduction to American guitar music, but also to the avant-garde. I could probably also say the same about Swans, who I discovered around the same time.

7. Summer – My favourite season, and the season I was born in.

8. Sideways – I’m astrologically Cancerian, which is supposed to mean I’m good at moving forwards whilst appearing to move sideways. I can relate to that, and very much see it as a viable way to move through life.

9. Study – I like to study new things, and have really never stopped learning new things since I left University. I also have a physical study now – it’s the smallest front bedroom in our house, and it allows me to have somewhere to read, write, listen to music, and otherwise interact with things in an environment that I can totally control.

10. Sleep – I’m one of those strange people who doesn’t really enjoy sleep. It’s not that I sleep badly, or have particularly horrible dreams, I would just rather be awake, upright, and doing something.

Backing up Gmail with Gmvault

This weekend I have been experimenting with Gmvault (http://gmvault.org/) in order to back up my various Gmail and Google Apps accounts to my computer. I’m using Mac OS X, but almost all of this will work with Linux too.

Firstly I downloaded the software, extracted it, and ran the following command once to download all of the mail in each account:

./gmvault sync example1@gmail.com

I then wrote a script to automate the process:

#!/bin/bash
#change to the correct directory
cd /Applications/gmvault/bin/
#run a quick sync on all my gmail and google apps accounts
./gmvault sync -m -t quick example1@teknostatik.org
./gmvault sync -m -t quick example1@gmail.com
./gmvault sync -m -t quick example2@gmail.com
./gmvault sync -m -t quick example3@gmail.com

Once that was working, I automated it with cron to run a few times a day.
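
The crontab entries are nothing fancy; assuming the script above is saved somewhere like /Applications/gmvault/bin/sync-mail.sh and made executable, something like this runs it three times a day:

0 8,14,20 * * * /Applications/gmvault/bin/sync-mail.sh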

Restoring the email to another Gmail account is a slow process, and you should probably only do it a few times a month (and always overnight). Again I’ve scripted this bit, but have commented everything out until I actually need it. Having done the initial upload, and because I now have two local copies of everything, I’ll probably only run this once a month.

#back up all downloaded email to a dedicated gmail account in the cloud
#./gmvault restore backup_account@gmail.com
#or just for the last month
#./gmvault restore -t quick backup_account@gmail.com
