Software updates are a critical part of using any kind of electronic device nowadays, particularly if it's internet connected, and even more so if it has any security functionality. If you have any kind of computer online, whether it be a phone, laptop or a server, you should really be keeping it up to date at a sensible rate.
As is a surprise to no one reading this, I have quite a few electronic devices like this, particularly servers. At the time of writing, I have 14 servers (including VMs and LXCs), which vary from running a few applications, to being a routing gateway to my home server, to running DNS and DHCP for my home, and every security use case in between. When I started out self-hosting, I wasn't very proactive about updating. Nowadays, I'm hosting a lot more stuff, and a lot more sensitive information, not to mention that the threat of attack is much higher than it was when I started out.
So, a few years ago, I started a routine: On Mondays, I update servers.
I update my servers manually, one at a time: I look at my Ansible inventory, SSH into each host, and run the relevant package manager. 14 servers is a bit of a middle ground. On the one hand, 14 is a lot of servers to run something on manually, but on the other, it's not like it takes that long.
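The loop is simple enough to sketch in shell. This is illustrative rather than my actual tooling: it assumes an INI-style inventory file called `inventory.ini` and that every host uses `apt`, when in reality the package manager varies per host.

```shell
#!/bin/sh
# parse_inventory: list hosts from a simple INI-style Ansible inventory,
# skipping section headers (like "[web]") and blank lines.
parse_inventory() {
  grep -v -e '^\[' -e '^$' "$1"
}

# update_all: SSH into each host in turn and update packages interactively,
# so you can watch each server's progress before moving to the next.
update_all() {
  parse_inventory "$1" | while read -r host; do
    echo "=== Updating ${host} ==="
    ssh -t "$host" 'sudo apt update && sudo apt upgrade'
  done
}
```

Running the hosts one at a time (rather than in parallel) keeps the "watch the progress bars" aspect intact.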
Updating system packages in bulk is easy - `apt update && apt upgrade` is very simple to run. However, the same doesn't exist for Docker. Tools like Watchtower exist, which I used for a while, but I've been burned by it too many times, and its main selling point is that it's automated, which I don't want.
To get around this, I wrote my own script to loop through my `docker-compose.yml` files, check them for updates, and restart the ones which need it. I originally wrote a custom tool to handle this, but it turns out `docker-compose up -d` will only restart containers if they need it - including base image changes, environment variables etc - even if the base update was `pull`-ed by a previous container update (which happens quite a lot).
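The update loop can be sketched like this. The single parent directory for all stacks is an assumption for the example, not my actual layout:

```shell
#!/bin/sh
# list_compose_dirs: print every directory under $1 containing a
# docker-compose.yml.
list_compose_dirs() {
  for dir in "$1"/*/; do
    if [ -f "${dir}docker-compose.yml" ]; then
      echo "$dir"
    fi
  done
}

# update_stacks: pull newer images for each stack, then let
# `docker-compose up -d` recreate only the containers that changed.
update_stacks() {
  list_compose_dirs "$1" | while read -r dir; do
    echo "=== ${dir} ==="
    (cd "$dir" && docker-compose pull && docker-compose up -d)
  done
}
```

Because `up -d` is effectively a no-op for unchanged containers, re-running the whole loop is harmless.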
But that only covers updates where the tag stays the same. The `docker-compose.yml` files for projects I care about pin specific versions. If a project rebuilds the container but doesn't update the application (eg LSIO), I get those updates without updating the application. To update the application itself, I use Renovate against my infrastructure repo. Renovate looks at my containers, finds the version numbers, and monitors their upstream for updates, opening PRs as it finds them.
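As an illustration (the service and version here are made up, not from my actual stack), a pinned service looks something like this - Renovate sees the explicit tag and opens a PR when a newer one is released:

```yaml
services:
  whoami:
    # Pinning an explicit tag (rather than `latest`) means `docker-compose pull`
    # only picks up rebuilds of this exact version; Renovate handles version
    # bumps by editing this line in a PR.
    image: traefik/whoami:v1.10.1
    restart: unless-stopped
```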
Sure, I could automate it with something like Ansible, but there's something quite therapeutic and interesting about watching the packages update on each server, seeing what's changed, and enjoying watching progress bars move. I need that kind of excitement on a Monday morning!
Alongside automating updates, I could schedule them too. To some (eg Windows users), installing updates is considered a laborious chore, and they're not wrong. Some people would much rather updates "just happen" in the background, and when they shut down their machine, it updates. This would probably work fine for an OS platform like Windows or macOS, but it's not quite comparable here. My servers are made up of many different packages, installed in a few different ways, very few of which are guaranteed to work together. In the same way as with a desktop OS, I want to spend my waking time making use of the server, not trying to fix it (unless it was me who broke it intentionally).
I used to run at least my Docker updates on a schedule with Watchtower, but not any more. The biggest problem with this is being around and available should something go wrong. Nothing I run is remotely critical - the most critical service I run is probably this website or my Nextcloud - but if something did go down or have an issue, I'd like to be online and looking at a screen, as opposed to it just happening in the middle of a meeting, or worse, the middle of the night.
The problem with updates is you need to remember to do them. I always do updates on a Monday morning, before I start work. When I talk about how I do updates, I'm often asked how I remember so consistently, and honestly I don't have an answer - I just do. I'm a creature of habit, and I've been able to form this habit well enough that I remember. It might be that I enjoy knowing my software is up-to-date and watching progress bars run across the screen, but whatever it is, it's clearly worked for me. In the beginning, there were a few times I forgot, but it didn't take long for the habit to stick.
# Why Monday?

Well, why not?
You might have been expecting a smarter answer - perhaps about the chances of a package update landing over the weekend, since many maintainers do the work at weekends around their day jobs - but it's just not that smart.
Monday mornings are usually pretty quiet for me. It's rare I need to start something early at work, so the time is consistently free. On a Monday, I'm still tired from having to get up early after the weekend and don't want to do anything too mentally taxing, so running some scripts matches my energy levels nicely. On other weekdays, I'm more likely to be distracted by something else going on, or whatever I was doing the evening before. Weekend schedules also vary massively, so committing to a fixed day and time gets quite difficult (damn pesky social life!).
2023-12-24: Tomorrow is Monday, but as it's Christmas, I did the updates early, for what I hope are obvious reasons.
# How's it working out?
For me, it's working just fine. Today, just before posting this article, I updated my servers, and they're clearly all still here! I update regularly and consistently, and that's the goal.
Updating only on Mondays isn't always sensible. If there's a major security update, I'll go through the relevant services and update them, but that's quite rare. Similarly, I frequently tinker with my servers, which sometimes includes more-involved package updates - and I definitely don't limit myself to only tinkering on Mondays!
It's a routine which works for me, and I think it's here to stay.
PS. On Wednesdays I don't wear pink