Everything tagged homeautomation (3 posts)

Avoiding Catastrophe by Automating OPNSense Backups

tl;dr: a Backups API exists for OPNSense. opnsense-autobackup uses it to make daily backups for you.

A few months ago I set up OPNSense on my home network, to act as a firewall and router. So far it's been great, with a ton of benefits over the eero mesh system I was replacing - static DHCP assignments, pretty local host names via Unbound DNS, greatly increased visibility and monitoring possibilities, and of course manifold security options.

However, it's also become a victim of its own success. It's now so central to the network that if it were to fail, most of the network would go down with it. The firewall rules, VLAN configurations, DNS setup, DHCP assignments and so on are all very useful and very deeply embedded - if they go away, most of my network services go down with them: internet access, home automation, NAS, cameras and more.

OPNSense lets you download a backup via the UI; sometimes I remember to do that before making a sketchy change, but I did once wipe out the box without a recent backup, and ended up spending several hours getting things back up again. That was before really embracing things like local DNS and static DHCP assignments, which I now have a bunch of automation and configuration relying on.

OPNSense has a built-in way to automatically create backups and upload them to a Google Drive folder. Per the docs it does this on a daily basis, uploading a new backup to Google Drive if something changed. If you want to use Google Drive for your backup storage, this is probably the right option for you, but if you want to customize how this works - either the schedule on which backups are made, or where they're sent - there are ways to do that too.

OPNSense Google Drive backups configuration
Use the built-in Google Drive backup feature if that makes more sense for you

Using the OPNSense API to create backups

OPNSense provides a simple API that allows you to download the current configuration as an XML file. It gives you the same XML file that you get when you click the "Download configuration" button manually in the OPNSense UI. It's worth downloading it manually once and just skimming through the file in your editor - it's nicely organized and interesting to peruse.

Once you've done that, though, you'll probably want to automate the process so you don't have to remember. That's fairly straightforward:

Setting up OPNSense for API backups

We need to set up a way to access the OPNSense backups API, ideally not using our root user - or indeed any user with more access privileges than necessary to create backups. To accomplish this we'll set up a new Group called backups - create the Group via the OPNSense UI, then edit it to assign the Diagnostics: Configuration History privilege. This grants access to the /api/core/backup/ APIs.

OPNSense Assign Backups privilege

Then, create a new User called backup, and add it to the new backups Group. Your Group configuration will end up looking something like this:

OPNSense Add Backups Group

Now that you have a new backup User, which has access only to configuration/backups APIs, you need to generate an API Key and Secret. Do this in the UI (your actual key will be a long random string):

OPNSense Create User Key & Secret

Creating an API Key for the user will automatically initiate a download in your browser of a text file containing 2 lines - the key itself and a secret. This is the one and only time you will be able to gain access to the secret, so save it somewhere. An encrypted version of it will be kept in OPNSense, but you'll never be able to get hold of the non-encrypted version again if you lose it. Here's what the text file will look like:

key=SUPER+TOP+SECRET+KEY
secret=alongstringofrandomlettersandnumbers

Downloading a backup via the API

Let's test out our new user with a curl command to download the current configuration. The -k tells curl to disregard the fact that OPNSense is likely to respond with an SSL certificate curl doesn't recognize (for your home network you are unlikely to care too much about this). The -u sends our new user's API Key and Secret using HTTP Basic auth:

$ curl -k -u "SUPER+TOP+SECRET+KEY":"alongstringofrandomlettersandnumbers" \
https://firewall.local/api/core/backup/download/this > backup

$ ls -lh
total 120
-rw-r--r-- 1 ed staff 56K May 24 09:33 backup

Cool - we have a 56KB file called backup, which ends up looking something like this:

<?xml version="1.0"?>
<opnsense>
  <theme>opnsense</theme>
  <sysctl>
    <item>
      <descr>Increase UFS read-ahead speeds to match the state of hard drives and NCQ.</descr>
      <tunable>vfs.read_max</tunable>
      <value>default</value>
    </item>
    <item>
      <descr>Set the ephemeral port range to be lower.</descr>
      <tunable>net.inet.ip.portrange.first</tunable>
      <value>default</value>
    </item>
    <item>
      <descr>Drop packets to closed TCP ports without returning a RST</descr>
      <tunable>net.inet.tcp.blackhole</tunable>
      <value>default</value>

... 1000 more lines of this ...

</opnsense>

In my case I have a couple of thousand lines of this stuff - you may have more or less. Obviously, we wouldn't usually want to do this via a curl command, especially not one that resulted in our access credentials finding their way into our command line history, so let's make this a little bit better.
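
In the meantime, if you do run the occasional manual backup, one way to keep the key and secret out of your shell history is to source them from a file first (the file name and variable names here are just placeholders):

$ cat ~/.opnsense-backup.env
export OPNSENSE_API_KEY="SUPER+TOP+SECRET+KEY"
export OPNSENSE_API_SECRET="alongstringofrandomlettersandnumbers"

$ source ~/.opnsense-backup.env
$ curl -k -u "$OPNSENSE_API_KEY:$OPNSENSE_API_SECRET" \
https://firewall.local/api/core/backup/download/this > backup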

Automating it all

There are a variety of options here, on 2 main axes:

  • Where to send your backups
  • How often to make a backup

In my case I want to put the file into a git repository, along with other network configuration files. OPNSense does have a built-in way to back up files to a git repo, but I want to be able to put more than just OPNSense config files in this repo, so I went for a more extensible approach.

Daily backups seem reasonable here, as well as the option to create them ad-hoc. Ideally one would just run a single script and a timestamped backup would appear in a backups repo. As I recently set up TrueNAS SCALE on my local network, this seemed a great place to host a schedulable Docker image, so that's what I did.

The Docker image in question handles downloading the backups and pushing them to a GitHub repository. This approach lets us easily schedule and manage our backups using TrueNAS SCALE, or from anywhere else on the network that can run a Docker container. It's published as edspencer/opnsense-autobackup on Docker Hub, and the source code is up at https://github.com/edspencer/opnsense-autobackup.

OPNSense autobackup logo
Behold the generative AI logo. Don't look too closely at the letters

Setting Up the Docker Container on TrueNAS SCALE

Here’s a quick walkthrough on how to set up the Docker container on TrueNAS SCALE and configure it to automate your OPNSense backups.

OPNSense Auto Backup docker image running on TrueNAS Scale
We can afford the 172kb of memory used to run opnsense-autobackup

Prerequisites

  1. Docker Installed on TrueNAS SCALE: Ensure that Docker is installed and running on your TrueNAS SCALE system.
  2. GitHub Repository: Create a GitHub repository to store your backups.
  3. GitHub Personal Access Token: Generate a GitHub personal access token with repo read/write permissions to allow the Docker container to push to your repository.

Generate a GitHub Personal Access Token

  1. Go to GitHub Settings.
  2. Click on Generate new token.
  3. Give your token a descriptive name and grant it read and write permissions for your new backups repository.
  4. Click Generate token.
  5. Copy the token and save it securely. You will need it to configure the Docker container.

Set Up the Docker Container on TrueNAS SCALE

Navigate to the Apps screen on your TrueNAS SCALE instance, then click Discover Apps followed by Custom App. Give your app a name and set it to use the edspencer/opnsense-autobackup Docker image, using the latest tag.

You'll need to provide the following environment variables, so configure those now in the Container Environment Variables section:

Name            Value
API_KEY         your_opnsense_api_key
API_SECRET      your_opnsense_api_secret
HOSTNAME        firewall.local
GIT_REPO_URL    https://github.com/your_username/your_repo.git
GIT_USERNAME    your_git_username
GIT_EMAIL       your_git_email
GIT_TOKEN       your_git_token
CRON_SCHEDULE   0 0 * * *

Set the CRON_SCHEDULE to anything you like - this one will make it run every day at midnight UTC. Click Install to finish, and you should see the app up and running. So long as you have created your GitHub repo and PAT, you should already see your first backup files in your repo. Depending on what you set for your CRON_SCHEDULE, you'll see new files automatically appearing as long as the image is running.

OPNSense backups in the GitHub repo
A screenshot of my own OPNSense backups repo, with backups populating automatically

And you should see some Docker log output like this:

2024-05-25 09:58:05.362503-07:00 CRON_SCHEDULE provided: 0 * * * *. Setting up cron job...
2024-05-25 09:58:07.707058-07:00 Starting cron service...
2024-05-25 09:58:07.707137-07:00 Starting backup process...
2024-05-25 09:58:07.708367-07:00 Cloning the repository...
2024-05-25 09:58:07.710068-07:00 Cloning into '/repo'...
2024-05-25 09:58:08.339297-07:00 Downloading backup...
2024-05-25 09:58:08.343397-07:00 % Total % Received % Xferd Average Speed Time Time Time Current
2024-05-25 09:58:08.343461-07:00 Dload Upload Total Spent Left Speed
2024-05-25 09:58:08.379857-07:00 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 57117 100 57117 0 0 1521k 0 --:--:-- --:--:-- --:--:-- 1549k
2024-05-25 09:58:08.381179-07:00 Saving backup as latest.xml and opnsense_2024-05-25_16-58.xml...
2024-05-25 09:58:08.391197-07:00 [main 7922900] Backups generated 2024-05-25_16-58
2024-05-25 09:58:08.391785-07:00 1 file changed, 1650 insertions(+)
2024-05-25 09:58:08.391814-07:00 create mode 100644 opnsense_2024-05-25_16-58.xml
2024-05-25 09:58:09.087436-07:00 To https://github.com/edspencer/opnsense-backups.git
2024-05-25 09:58:09.087476-07:00 bce0d8a..7922900 main -> main
2024-05-25 09:58:09.090436-07:00 Backup process completed.

Conclusions and Improvements

I feel much safer knowing that OPNSense is now being continually backed up. There are a bunch of other heavily-configured devices on my network that I would like centralized daily backups for - Home Assistant and my managed switch configs being the obvious ones. More to come on those.

Obviously you could run this anywhere, not just in TrueNAS, but I like the simplicity, observability and resource reuse of using the TrueNAS installation I already set up. So far that's working out well, though it could use some monitoring and alerting in case it stops working.
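
For example, if you'd rather run it on any other Docker host, a plain docker run with the same environment variables should behave the same way (the values below are placeholders - fill in your own):

docker run -d --name opnsense-autobackup \
  -e API_KEY="your_opnsense_api_key" \
  -e API_SECRET="your_opnsense_api_secret" \
  -e HOSTNAME="firewall.local" \
  -e GIT_REPO_URL="https://github.com/your_username/your_repo.git" \
  -e GIT_USERNAME="your_git_username" \
  -e GIT_EMAIL="your_git_email" \
  -e GIT_TOKEN="your_git_token" \
  -e CRON_SCHEDULE="0 0 * * *" \
  edspencer/opnsense-autobackup:latest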

For a detailed guide on setting up the Docker container and automating your backups, visit the GitHub repository. The script that actually gets run is super simple, and easily adaptable to your own needs.
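
To give a sense of the shape of it, the whole thing boils down to roughly this (a simplified sketch, not the actual script - token handling and edge cases are glossed over):

#!/bin/sh
# Simplified sketch of the backup loop - see the opnsense-autobackup repo for the real script
set -e

STAMP=$(date -u +%Y-%m-%d_%H-%M)

# Clone the backups repo and set the committer identity
git clone "$GIT_REPO_URL" /repo
cd /repo
git config user.name "$GIT_USERNAME"
git config user.email "$GIT_EMAIL"

# Download the current config from the OPNSense backups API
curl -k -u "${API_KEY}:${API_SECRET}" \
  "https://${HOSTNAME}/api/core/backup/download/this" -o "opnsense_${STAMP}.xml"
cp "opnsense_${STAMP}.xml" latest.xml

# Commit and push; don't fail if nothing changed
git add .
git commit -m "Backups generated ${STAMP}" || true
git push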

Continue reading

Automating a Home Theater with Home Assistant

Theater of Screams
Disclaimer: Theater enjoyment may be diminished by using it to watch Manchester United

I built out a home theater in my house a couple of years ago, in what used to be a bedroom. From the moment it became functional, we started spending most evenings in there, and got into the rhythm of how to turn it all on:

  • Turn on the receiver, make sure it's on the right channel
  • Turn on the ceiling fan at the right setting
  • Turn on the lights just right
  • Turn on the projector, but not too soon or it will have issues
  • Turn on the Apple TV, which we consume most of our content on

Not crazy difficult, but there is a little dance to perform here. Turning on the projector too soon will make it never be able to talk to the receiver for some reason (probably to prevent me from downloading a car), so it has to be delayed by the right number of seconds, otherwise you have to go through a lengthy power cycle to get it to work.

It also involves locating no fewer than 5 remote controls, 3 of which use infrared. The receiver is hidden away in a closet, so you have to go in there to turn it on, remote control or no. Let's see if we can automate this so you can turn the whole thing on with a single button.

IR is not a good solution

The first things I tried were these IR Repeaters, which I figured would allow me to keep the receiver remote in the theater and not have to go into the closet. I tried a few different models but they were all super weak for some reason, despite being plugged in, to the extent that you need to position the re-emitter within inches of the device's IR sensor. I couldn't achieve that in a way that wasn't ugly with wires hanging everywhere, so I gave up on that idea.

Then I tried these Bestcon IR Blaster things, which in theory allow you to record remote control buttons and repeat them. The IR Blaster can join your network, which means it can be automated using Home Assistant, which I use extensively around the house already. I planned to place one of these in the theater itself (for the projector) and another in the closet (for the receiver).

This kinda worked, but it was a bit of a pain to program and they just weren't reliably triggering the devices. Significantly more than zero percent of the time the signal didn't get through, and as with the IR repeaters, you end up with more wires hanging around as it still relies on line-of-sight to the IR sensor. It's also another moving part, something to go wrong, and another random device on your network, so this approach seems to have more downsides than upsides.

Finally, the coup de grâce was that the IR Blaster doesn't know what state your devices are in (bad if you're trying to turn on a device that's already on - now you've just toggled it off), nor does it know if the command it tried to send was received or needs to be re-tried. There must be a better way...

TCP/IP to the rescue

It turns out all of the devices I wanted to control were network-connectable. In the case of the receiver (a Denon AVR-X6700H) and the projector (a Sony VPL-VW325ES), there's an ethernet port that lets you plug it directly into your network.

Sony Projector Web UI
The Sony projector web UI is simple but informative, with some basic setup functions

Both of these devices actually expose a little HTTP server hosting a basic web app, which lets you power the device on and off and do things like change inputs. The receiver, pleasingly enough, publishes a pretty complete API, which allows you to do basically anything you could do with the remote in your hand, including advanced configuration. Awesome.

Denon Receiver Web UI
The Denon web UI is rich - you can do a lot with it. Nice to have for sure.

The documentation is extensive, though rather dense. Contained therein is the fact that we can send commands like ZMON and SIBD to the receiver, which will turn it on and switch input to the Blu-ray Disc input respectively. As well as the web UI, the receiver exposes a way to send those odd little commands over HTTP - in this case we can just send a GET request to http://receiver.local:8080/goform/formiPhoneAppDirect.xml?ZMON, which will turn on the receiver's main zone. Swap ZMON for whatever command you want to run. There's no actual iPhone app involved here, but I guess from this URL that one exists.
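
That means anything that can make an HTTP request can drive the receiver. From a shell, assuming the receiver answers at receiver.local, the two commands above look like this:

# ZMON: turn on the receiver's main zone
curl "http://receiver.local:8080/goform/formiPhoneAppDirect.xml?ZMON"

# SIBD: switch input to the Blu-ray Disc input
curl "http://receiver.local:8080/goform/formiPhoneAppDirect.xml?SIBD"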

As we'll see in a moment, that's all we need to know to get Home Assistant able to control both of these devices.

Home Assistant Preparation

Home Assistant already has built-in support for controlling the Sony Projector, so now that it has an IP on our network we can just tell Home Assistant where to find the projector. As per the docs, this requires a manual edit to configuration.yaml, which is unusual but easy enough.

There are several ways to edit that file, the easiest probably being to use the File Editor addon. Again per the docs, this just means adding a few lines to your configuration.yaml file - lines 12-15 in the screenshot below (replace projector.local with the IP of the projector if you don't have fancy local DNS wizardry running):

Home Assistant Projector Switch
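
For reference, the added block looks something like this - a minimal sketch based on the sony_projector switch platform, shown here appended from a shell (merge it with any existing switch: section rather than blindly appending, or just paste the same YAML in via the File Editor addon):

# Add a sony_projector switch to Home Assistant's configuration.yaml
# (path assumes the standard /config location; adjust the host to suit)
cat >> /config/configuration.yaml <<'EOF'
switch:
  - platform: sony_projector
    host: projector.local
EOF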

Either restart Home Assistant or reload the YAML so it can pick that up. Now your projector shows up as a persistent switch in Home Assistant, so you can turn it on/off at will, either via the Home Assistant UI or via scripts and other automations.

To get Home Assistant able to talk to the receiver, I had to install the Denon AVR integration. That's pretty easy and gives you a fairly basic device page for the receiver, where you can turn it on/off but not much else:

Home Assistant Receiver Device

But it also gives you 3 additional services you can call in your automations, one of which is the all-important Denon AVR Network Receivers: Get command.

The Script

At this point the script is pretty easy. In order, we:

  • Use that iPhoneAppDirect.xml path to send the ZMON command (Zone Main On) to the receiver
  • Turn on the fan (using the Bond integration we fixed last time)
  • Set the correct lighting scene (all Philips Hue fixtures and LED strips in this case)
  • Wait the right number of seconds so the projector can talk to the receiver properly
  • Call the receiver again to switch to the Xbox input (SIBD)
  • Call the Switch: turn on service on the Projector entity that we added to configuration.yaml

Home Assistant Theater On Script

We switch to the Xbox input first in case we're going to watch a Blu-ray; otherwise we just press any button on the Apple TV remote to wake that device up and the receiver automatically switches to it. There is also a Theater Off script, which basically does the opposite of the above.

An occasionally useful feature is that we can now turn the theater on and off remotely. As the projector does take a couple of minutes to warm up it can be nice to turn it all on with one button on my phone and then waddle over there a couple of minutes later to find everything ready.

Triggering with a light switch

As I had switched all of the lights in the room to be various Philips Hue fixtures and light strips, the 2-gang light switch box by the entry door suddenly became redundant as all of the lights were permanently powered. This gave a delightful opportunity to install on and off switches in their stead:

Physical Switches to turn the theater on and off
The fact these things are a little beat up is just proof of how useful they are...

These two switches are not connected to any power source, but to a Philips Hue Wall Switch Module, which is just a simple battery-powered device that detects when you flip the switch and exposes that event to the rest of the Hue ecosystem. Because Hue integrates well with Home Assistant, that means we can trivially use it as a trigger for our automations.

The Hue wall module approach works well for this, even though it's not really what it's designed for. All it does is track when a switch is flipped - it doesn't know whether it's on or off, doesn't stop you from flipping it several times (though Home Assistant can dedupe that if necessary), and some day the battery will need to be replaced, but it's served as an excellent solution for us. It also means guests don't have to figure out how to turn everything on/off correctly - just flip the switch.

Possible Extensions & Limitations

Home Assistant can also integrate more deeply with Xbox and Apple TV. In the case of Xbox, this requires you to switch it into a much more power-hungry standby mode, which would have the device consuming 30 watts in standby. That's a huge amount of power to spend to basically just enable HA to turn it on and off, so I passed on that.

Similarly, HA can integrate more deeply with Apple TV - loading content as well as just turning it on/off. But, as we use a variety of different apps, the integration wouldn't have much of a chance of knowing which one we're going to choose, so while there's no real power consumption downside, it just wouldn't be useful in our case.

Continue reading

How to make Bond fans work better with Home Assistant

I have a bunch of these nice Minka Aire fans in my house:

They're nice to look at and, crucially, silent when running (so long as the screws are nice and tight). They also have some smart home capabilities using the Smart by Bond stack. This gives us a way to integrate our fans with things like Alexa, Google Home and, in my case, Home Assistant.

Connecting Bond with Home Assistant

In order to connect anything to these fans you need a Bond WiFi bridge, which acts as the bridge between your fans and your network. Once you've got it set up and connected to your WiFi network, you'll need to figure out what IP address it's on. You can then send a curl request to the device to get the Access Token that you'll need in order to add it to Home Assistant:

curl command

If you get an access denied error, it's probably because the Bond bridge needs a proof of ownership signal. The easiest way to do that is to just power off the bridge and power it on again - run the curl again within 10 minutes of the bridge coming up and you'll get your token.
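
The request itself is just a GET to the bridge's local API token endpoint - something along these lines, assuming the bridge is at 192.168.1.50 (substitute the IP address your bridge actually got):

# Ask the Bond bridge for its local API token
# (only returned while unlocked, i.e. within ~10 minutes of the bridge powering up)
curl http://192.168.1.50/v2/token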

Integrating Bond with Home Assistant is then pretty easy - search for the Bond integration at http://homeassistant.local:8123/config/integrations/dashboard (substitute for your Home Assistant domain if different) and install, providing the IP and Token you have for your Bond bridge:

add Bond integration

It will populate your fans - here's an example, the fan in my home theater:

Bond fan device in Home Assistant

The top 2 controls there in theory control the fan and the fan's light.

The Annoying Light Toggle bug

Sometimes the light on the fan gets turned on and is impossible to turn off. Whether you use the remote control, the Bond app or Home Assistant, no force in the known universe will turn the fan light off. It's really annoying when it happens. The only way to fix it is to turn the fan off and on again at the breaker, after which it will start responding to commands again.

It also seems to be implemented as a memory-less toggle in some contexts, and a dimmable light in others, and Bond/Home Assistant don't necessarily know the current true state of the light. The Bond app even has a settings page called "Fix Tracked State", where you can go to manually override what Bond thinks is the current light state, assuming it has wandered out of sync. However, even after toggling this in all the ways I could, the bug persisted and it still needed a visit to the breaker box.

Bond app fix state

One annoying way this bug manifests itself is that the fan lights will turn on when I run my "All Lights Off" script on Home Assistant - this script calls the light.turn_off service on all of the lights set up in the following areas of the house. Curiously, this turns ON the fan lights. I guess that's just because Bond doesn't know if the light is on or off, so it just tries to toggle.

Bond app fix state

Given that one of these fans is in the bedroom and I press a button that runs the script when we're ready to sleep, it's a little unfortunate that the "All Lights Off" script ends up turning on a bright fan light. Doubly so when I have to walk to the garage to power cycle the breaker to be able to turn the light off again. We need a solution here.

Home Assistant - disable the entity

In my case, as I'm using Home Assistant for basically all of the automations in the house, and there is never a time when I want to turn the fan light on, I just disabled it in Home Assistant. There are a few ways to do this, but one is to use the Entities view in Home Assistant and search for "fan". Click the light for the fan in question (the row with the lightbulb icon), then the cog in the top right of the modal window and uncheck the "Enabled" flag:

Home Assistant entity settings with the Enabled flag unchecked

Now back on your device page the control for the fan light will have disappeared and it'll tell you that an entity is not being shown. Now, calls to light.turn_off won't target the fan's light, and therefore won't turn it on when you don't want it to. Your scripts can still turn the fan itself on/off and set the speed though.

Fan device page with the light entity hidden

Although we lose the ability to control the fan light by doing this, that's not why I have the fans so I don't really care. We have other lighting in the rooms with those fans, so the fan light is never used. Their value is in their silence and prettiness. It's awesome that they integrate with things like Home Assistant. Hopefully this helps out others who have run into similar problems.

Continue reading