Automating and sending speedtest.net data to web services

Recently I got frustrated by a series of broadband service failures. I realized they were difficult to diagnose, both for me and for my service provider (who, by the way, was very helpful), because it was hard to determine exactly when they occurred and whether the issue was with the broadband connection or my wireless router. This weekend, inspired by this Make: Magazine feature, I hooked up a Raspberry Pi to my broadband router and set it up to periodically query speedtest.net (using speedtest-cli) and log the results.

I’m not a particular fan of IFTTT, which I find too linear and limiting (not to mention somewhat arrogant towards third-party content providers), so I looked for alternative ways to post my speedtest results somewhere online where I could obsessively check them whenever I’m out of the house. I liked this post describing how to use the same speedtest-cli with Loggly instead of IFTTT. But of course I wasn’t satisfied with hacking together a bunch of Perl one-liners, so instead I found this script to manipulate speedtest-cli output and modified it so it could log results to a CSV file, or post them to IFTTT, Loggly or any URL that accepts JSON, such as Zapier:

./speedtest-extras.sh [-d] [-c] [-h] [-i secret-key] [-l]
    -d: Debugging mode (reuses the previously logged speedtest result instead of querying speedtest - faster)
    -c: CSV mode
    -h: Print CSV header (only if used together with the -c flag)
    -i: IFTTT mode. Takes an IFTTT Maker Channel secret key as argument (required)
    -l: Loggly mode. Takes a Loggly Customer Token as argument (required)
    -j: JSON mode. Posts the result as a JSON document to any URL passed as argument (required)
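
For example (the Loggly token below is a made-up placeholder, not a real credential), one might run it in CSV mode with a header line, or post a single result to Loggly:

$ ./speedtest-extras.sh -c -h
$ ./speedtest-extras.sh -l abcd1234-ef56-7890-abcd-ef1234567890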

My modified command-line interface to speedtest.net is available on GitHub, where I’ve also posted a few usage examples. Here, I will concentrate on how to use it to post to Zapier.

How to automatically send speedtest results to Zapier

First, take care of dependencies. My script makes use of speedtest-cli, which in turn is written in Python. Assuming you’ve got a working install of Python, you can use your favourite package manager to get hold of speedtest-cli:

$ pip install speedtest-cli
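
Before going further, you can check that speedtest-cli itself works on its own, for instance with its --simple flag:

$ speedtest-cli --simple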

Then download my code, either as a ZIP archive or by using git:

$ git clone https://github.com/timtomch/speedtest-cli-extras.git

Once you have downloaded my repository, navigate to the bin folder1 that’s inside it:

$ cd speedtest-cli-extras
$ cd bin

Then you can try running my script in CSV mode to make sure everything is working properly:

$ ./speedtest-extras.sh -c
2016-03-29 02:33:38 UTC;2016-03-29 02:34:19 UTC;Start Communications;XXX.XXX.XX.XXX;SoftLayer Technologies, Inc. (Toronto, ON);8.53 km;17.794 ms;23.97 Mbit/s;1.95 Mbit/s;http://www.speedtest.net/result/XXXXXXXX.png

Depending on the speed of your Internet connection, it should take about a minute to run the test. If you see output similar to the above, things are working.

It is now time to set up Zapier to receive your data. If you haven’t got an account yet, go ahead and create one (the free plan should work just fine). Then click the bright red “Make a Zap” button to get started.

Using the search box, choose “Webhooks by Zapier” as your trigger, then select the “Catch Hook” option. Leave the next screen (options) empty and click Next until you reach a screen that should look like this:

Screenshot of the Zapier interface showing which URL to send JSON data to.
Setting up a Webhook on Zapier.

Zapier will issue a custom webhook URL to trigger your events. Copy that URL to the clipboard.

Now run

$ ./speedtest-extras.sh -j <PASTE YOUR ZAPIER URL HERE>

and wait again for the prompt to reappear. If nothing else shows up in your Terminal, it’s a good sign. Go back to your browser and click the blue “OK, I did this” button. After a short while, Zapier should display a nice green message saying the test was successful. Go ahead and click on the “view your hook” link to check what data was sent to Zapier. You should see something like this:

Screenshot of the Zapier interface, showing data submitted via a JSON Webhook.
Testing the Zapier Webhook to ensure the JSON data was properly received.

Then you can decide what to do with that data. I chose to have each event add a new line to a Google Spreadsheet:

Screenshot of the Zapier interface, showing options to set up a Google Spreadsheets app.
Setting up Zapier to add rows to a Google Spreadsheet.

Go ahead and test your setup, then save your Zap once you are happy with the results. Don’t forget to turn on your Zap.

Now, every time you fire

$ ./speedtest-extras.sh -j <PASTE YOUR ZAPIER URL HERE>

Zapier will execute the operation you specified (adding a row to a Google Spreadsheet in my example). Of course, if you had to run the script manually to get each measurement, that would defeat the whole purpose, so the last step is to add a cron job that runs the script automatically:

$ crontab -e

This lets you edit your crontab. To run a speed test every hour, add the following line to it:

0 * * * * /absolute/path/to/speedtest-extras.sh -j <YOUR ZAPIER URL>

Note that you need to specify the whole path to the speedtest-extras.sh script in your crontab for it to work.
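
If you also want a local trace (handy when debugging cron issues), you could redirect the script’s output to a log file in the same crontab entry. A hypothetical example, assuming the repository was cloned under /home/pi on a Raspberry Pi:

# run hourly, keep a local log of anything the script prints
0 * * * * /home/pi/speedtest-cli-extras/bin/speedtest-extras.sh -j <YOUR ZAPIER URL> >> /home/pi/speedtest.log 2>&1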

Now watch the data slowly pile up, and start drafting that email to your broadband provider.

Next step: full Raspberry Pi tutorial?

A recent conversation with a friend facing the same issue made me think I could also write up a short tutorial on how to replicate my Raspberry Pi speed tester setup from scratch. Anything to avoid working on more useful things, like getting ahead on my MLIS research or freshening up my resume for this position I’m considering applying to…

  1. This directory structure is not entirely necessary but is a leftover from the original speedtest-cli-extras which I forked.


Generating TLS/SSL certificates with Let’s Encrypt on hosted websites

At the OLA Super Conference Hackfest last week, I chose to work on trying to enable TLS/SSL encryption on library websites using Let’s Encrypt. Digital privacy has been a hot topic of discussion in libraries lately, most prominently around the efforts of the Library Freedom Project. Enabling TLS/SSL (HTTPS) encryption on library websites and online catalogues is among the first steps recommended by the LITA Patron Privacy Technologies Interest Group to protect the privacy of patrons.

Unfortunately, setting up TLS/SSL encryption is a fairly complex (and costly) process. While large institutions often manage their own web servers and have complete control over them, many small public libraries resort to web hosting services. Some of these services will arrange for HTTPS certificates at a premium, but the process is often not straightforward. Enter Let’s Encrypt, which opened as a public beta service last December. As a “free, automated, and open certificate authority (CA)”, Let’s Encrypt aims to enable anyone who owns a domain name to obtain a trusted certificate at no cost, and more easily than through commercial certificate authorities.

With the helpful support of Dan Scott, we spent the morning trying to install and run Let’s Encrypt, generate certificates and enable them on our own personal websites and, for one brave librarian, on her live public library page.

Installing

The first step was to download and run Let’s Encrypt. We all had Mac laptops, so the following may only apply to Mac OS X. The source code for generating certificates is distributed on GitHub, and the steps below, based on this tutorial, were required to make it work.

  1. Install git. We all had different setups on our laptops. Using Homebrew to download and install git as well as the other required packages worked well. As for me, I found out that recently upgrading the OS to El Capitan had broken my Xcode toolchain, including git. This fix found on Stack Exchange worked well to restore the Xcode command-line tools without having to go to the trouble of downloading the whole Xcode package:
    $ xcode-select --install
  2. Download (git clone) Let’s Encrypt from GitHub:
    $ git clone https://github.com/letsencrypt/letsencrypt
    $ cd letsencrypt
  3. Install any remaining dependencies. It turns out that letsencrypt-auto checks for missing dependencies every time it is run, so running the following will install everything that is needed using Homebrew (and will install Homebrew itself if it is not already present).
    $ ./letsencrypt-auto --help

    Since it checks dependencies on every run and may want to install missing ones in /usr/local and /usr/bin, Let’s Encrypt will request root access every time it is run. This is a bit unexpected and unsettling, as root access should not be required to generate certificates that will not be applied locally, and it also causes problems when retrieving the generated certificates, as we will see below.

Generating certificates

Once Let’s Encrypt was up and running on our laptops, the next step was to generate the certificates. By default, letsencrypt-auto assumes it is run on the same server that is hosting the website, but in our case we wanted to set up encryption on hosted domains. For this, we need to tell Let’s Encrypt to only generate the certificates, which we would then upload to our hosting providers. This is done using the certonly option.

The syntax we used was

./letsencrypt-auto certonly -a manual --rsa-key-size 4096 -d domain.com -d www.domain.com

Note that specifying both domains means Let’s Encrypt will validate domain.com (without www) and www.domain.com separately, but will issue a single certificate covering both (see below). Again, this will ask for root access, as discussed above.

The console turns into a quaint pseudo-interactive mode to ask for an email address, request acceptance of the terms of use and warn that the IP address of the machine requesting the certificate will be logged:

letsencrypt-auto dialog box

The next step is important to ensure that the user requesting a new certificate for a particular domain has a legitimate claim to that domain. To this end, Let’s Encrypt will generate a hash string that needs to be copied to a specific file on the server that hosts the domain in question. The hash, the name of the file and the path are provided:

Make sure your web server displays the following content at
http://domain.com/.well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS before continuing:

weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS.Rso39djaklj3sdlkjckxmsne3a

If you don't have HTTP server configured, you can run the following
command on the target server (as root):

mkdir -p /tmp/letsencrypt/public_html/.well-known/acme-challenge
cd /tmp/letsencrypt/public_html
printf "%s" weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS.Rso39djaklj3sdlkjckxmsne3a > .well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS

This can be copied over to the web server using FTP, a web-based file manager, or an SSH connection to the server. If the SSH route is chosen, the mkdir, cd and printf commands provided above can be used to generate the challenge file directly on the server. Note that the hash and filename shown above are examples; use the actual text provided by Let’s Encrypt.
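
If your host allows SSH access, here is a rough sketch of doing the same thing remotely in one go. The hostname user@example-host and the public_html document root are placeholders (your host’s layout may well differ), and the hash is the example value from above:

$ ssh user@example-host 'mkdir -p ~/public_html/.well-known/acme-challenge'
$ printf "%s" "weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS.Rso39djaklj3sdlkjckxmsne3a" | ssh user@example-host 'cat > ~/public_html/.well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS'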

Screenshot of the cpanel file manager.

Screenshot of the cpanel file editor.
Uploading the challenge file on the server using the cpanel file editor.
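
Before hitting enter in the next step, it doesn’t hurt to check that the challenge file is actually reachable from outside; curl should print back the full challenge string (the domain and hash below are the placeholder values from the example above):

$ curl http://domain.com/.well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS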

Once the file is live on the website, hit enter to finish the process. Let’s Encrypt will visit the newly created page on our website, check that it’s ours and generate the certificates. If more than one domain was specified (e.g. with and without www), the challenge process is repeated for each domain.

The generated key and certificate files are saved in /etc/letsencrypt/live/domain.com. However, since Let’s Encrypt was run as root (see above), these files belong to the root user, and trying to display them returns a Permission denied error. To display them, use sudo:

$ sudo ls -l /etc/letsencrypt/live/domain.com
total 32
lrwxr-xr-x  1 root  wheel  33 Feb  5 22:03 cert.pem -> ../../archive/domain.com/cert1.pem
lrwxr-xr-x  1 root  wheel  34 Feb  5 22:03 chain.pem -> ../../archive/domain.com/chain1.pem
lrwxr-xr-x  1 root  wheel  38 Feb  5 22:03 fullchain.pem -> ../../archive/domain.com/fullchain1.pem
lrwxr-xr-x  1 root  wheel  36 Feb  5 22:03 privkey.pem -> ../../archive/domain.com/privkey1.pem

Note that if you specified more than one domain when running letsencrypt-auto, Let’s Encrypt will generate a single certificate covering all specified domains. It will appear in /etc/letsencrypt/live under the name of the first domain you specified.

As can be seen, the files stored in /etc/letsencrypt/live are actually symlinks to files stored in /etc/letsencrypt/archive. Using sudo every time we want to access those files is bothersome, so we can use chown to change their owner back to us:

$ sudo chown -R username /etc/letsencrypt/

Update: Enabling TLS/SSL on hosted websites

The next and final step is to copy the contents of the generated keys into the SSL setup interface of our web hosts. Unfortunately, it turned out that none of us had this functionality enabled by our service providers, and we ended up having to write to them to request that SSL be enabled.

I asked my provider to enable SSL on my domain, and after a week or so they wrote back saying that they had not only made the necessary configuration changes to allow me to upload my own certificates, but had also implemented a cpanel extension allowing their customers to generate their own certificates without going through the hassle described above! This is excellent news, and I shall soon try it out for my other domains, but for now I wanted to try loading the certificates I had already generated. Here’s how I did it (my hosting admin interface uses cpanel 54.0.15).

The previous version of this post linked to this post (in German), which provides a screenshot of a different configuration interface.

Under Security -> SSL/TLS, I chose the “Install and Manage SSL for your site (HTTPS)” option, which installs the certificate and key in a single step. I then selected my main domain name in the drop-down menu, and copied the contents of cert.pem into the field labeled CRT and the contents of privkey.pem into the KEY field. I left the CABUNDLE field blank and hit “Install Certificate”.

On Mac OS X, piping anything to pbcopy will place it on the clipboard, ready to be pasted anywhere. This is how I copied the contents of the certificate file before pasting it into the cpanel form:

$ cat cert1.pem | pbcopy
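
The same trick works for the private key (using the same placeholder domain as above, and assuming the chown described earlier made the files readable):

$ pbcopy < /etc/letsencrypt/live/domain.com/privkey.pem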

And that was it! I got a nice confirmation message that the certificate was installed, detailing all the domain names it covered. It also helpfully listed other domain names I have pointing to my site that are not covered by the certificate, warning me that using them would cause browsers to raise a security warning.

I’m using WordPress to manage this blog, and it needed to be reconfigured to serve all inserted images over HTTPS and thus avoid mixed content issues: I set the main URL of the WordPress install to HTTPS.
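
For what it’s worth, if your host happens to provide wp-cli, the same change can hypothetically be made from the command line (domain.com is a placeholder for your own URL):

$ wp option update home 'https://domain.com'
$ wp option update siteurl 'https://domain.com'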

Once this was done, my browser started rewarding me with a nice padlock icon on all my pages, confirming that I had successfully enabled HTTPS on my domain! I also ran it through an SSL checker for good measure.
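
For a command-line alternative to a web-based SSL checker, openssl can confirm which certificate is being served and when it expires (using my own domain as an example; substitute yours):

$ openssl s_client -connect timtom.ch:443 -servername timtom.ch < /dev/null 2>/dev/null | openssl x509 -noout -subject -dates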

Screenshot of the Chrome navigator security message, confirming timtom.ch is secure.
The security details as displayed by Chrome. Yay green lock!

Conclusion

The availability of Let’s Encrypt as a free and open alternative to commercial certificate authorities is an important step towards a more secure Internet. However, the current beta version of Let’s Encrypt still requires some familiarity with command-line interfaces, web development tools and an understanding of how TLS/SSL works. Better documentation and a more user-friendly interface will certainly go a long way in making the process easier. The necessity to run the client as root is another barrier that will hopefully be lifted as the software evolves. Finally, even though generating certificates is now freely accessible, setting them up on hosted websites still requires the service provider to activate this option.

There is an extension for cpanel that claims to allow end users to easily set up Let’s Encrypt certificates on their websites. Maybe as demand grows, hosting providers will begin enabling this extension for their customers, and HTTPS will then truly become an easy option for everyone. Since my hosting provider recently enabled this option, I plan to try it out soon and will report back if I do.

Thanks a lot to Dan and the OLA Super Conference Hackathon organisers and facilitators, as well as the other attendees with whom I worked on this project! I certainly learned a lot.

Link dump 2016/1: Modernist libraries, fiction publishing, podcasts and Noah Webster

While I keep working on the drafts of my first actual blog posts, let’s see if I can also use this space to keep track of what I recently enjoyed reading:

This week, I also learned that most of American English spelling can be traced to Noah Webster. He axed the extra u’s in colour and neighbour, changed offence to offense and cheque to businesslike check. He’s the one who insisted the letter “z” be pronounced “zee” instead of “zed” (he also wanted “y” to be called “yi” and “w” to become “we”). All this, and much more, from the first chapter of Mary Norris’ Between You & Me, which is a true delight to read1.

  1. Nonrestrictive clause