- Another item to file under “obsolete or unfashionable technologies I hold on to”: Is the Relational Database Doomed? Here I am teaching the gospel of SQL when all the cool kids swear by NoSQL.
- Speaking of being a grumpy old guy, why is it that when I look at the amazing images taken by Opportunity during its 12 years on Mars, all I see are track marks and litter on a once pristine planet? This reminds me strongly of Mariner 9 by Kelly Richardson, which I saw displayed at the National Gallery last year.
- My friend Xavier sent me the coolest belated Christmas gift ever, a tablecloth printed with the beautiful 1:50,000 swisstopo map of Western Switzerland. He used a service called Spoonflower to custom-print the fabric. Of course I immediately started thinking of all the cool things I could do, notably a wallpaper with contour lines, e.g. of the Niagara escarpment…
- This is our 4th winter in Toronto and so far it’s been much milder than the previous years. It snowed again last night, but nothing like the snows of yesteryear.
- Keeping with the polar theme, negatives from Shackleton’s second Antarctic expedition have been found and were finally developed after more than a century. Instagram, 1914 style.
- Is the age of the hydroelectric dam over? First, this fascinating 99% invisible episode about fish cannons is full of offhand comments that dams are being routinely dismantled in the US and how it’s a good thing for salmon and humans. Then I learn in the New Yorker about crumbling African dams that could soon become “the dam industry’s Chernobyl.” I do hope we have good alternatives for replacing all the lost power generation and off-peak storage. The answer to the latter could lie in part a few kilometers from me in Lake Ontario, where energy storage using underwater compressed air is being tested as I write this.
- On this topic, I keep going back to the superb What can a technologist do about climate change piece published by Bret Victor last November. He lists a few alternative solutions for energy storage, including the slightly crazy idea of using surplus energy to drive a train uphill, then letting it roll back downhill to harvest that energy back.
- Keepalive is a PirateBox hidden in a boulder in northern Germany. Lighting a fire on its side will generate enough power to bring the server to life and share PDF survival guides over WiFi. I just hope there is a USB port on the boulder to power up lost wanderers’ devices (and that they remembered how to start a fire without needing a survival guide)…
- Ever wondered why businesses ask how likely you would be to recommend a product or service to a friend? The answer feeds the Net Promoter Score, a metric used to measure customer satisfaction. Publishers are using it too!
- This nice visualisation tool for place names motivated me to start playing with geo data again. The Overpass API allows bulk download of OSM nodes, and Overpass Turbo has a nice interface to try out Overpass queries.
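The uphill-train storage idea mentioned above is easy to sanity-check with the formula for gravitational potential energy, E = mgh. Here is a back-of-envelope sketch in Python; the mass, height and round-trip efficiency are my own illustrative assumptions, not figures from Victor’s piece:

```python
def stored_energy_mwh(mass_tonnes, height_m, efficiency=0.8):
    """Energy recoverable from rolling a train back downhill.

    Uses E = m * g * h, scaled by an assumed round-trip efficiency,
    then converts joules to megawatt-hours (1 MWh = 3.6e9 J).
    """
    g = 9.81  # m/s^2
    joules = mass_tonnes * 1000 * g * height_m * efficiency
    return joules / 3.6e9

# A hypothetical 1000-tonne train driven 500 m uphill:
print(round(stored_energy_mwh(1000, 500), 2))  # about 1 MWh
```

Roughly a megawatt-hour per heavy train, which suggests you need a fleet of them and a tall hill before this competes with a dam’s reservoir.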
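As for the Net Promoter Score mentioned above, the arithmetic behind it is simple: respondents answering 9 or 10 on the 0–10 scale count as promoters, 0 through 6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score from 0-10 survey responses:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# 3 promoters and 2 detractors out of 7 respondents:
print(nps([10, 9, 9, 8, 7, 5, 2]))
```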
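To make the last item concrete, here is a minimal Overpass QL query (a hypothetical example of my own, not one taken from the visualisation tool) that fetches village place-name nodes inside a bounding box; it can be pasted directly into Overpass Turbo:

```python
# Overpass QL is its own little language; it is simply held in a
# Python string here so it can be POSTed to the Overpass API.
# Bounding box order is (south, west, north, east).
query = """
[out:json][timeout:25];
node["place"="village"](46.0,6.0,47.0,7.0);
out body;
"""

# Running it against the public endpoint requires network access:
# import urllib.request
# result = urllib.request.urlopen(
#     "https://overpass-api.de/api/interpreter", query.encode()).read()
print(query.strip())
```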
At the OLA Super Conference Hackfest last week, I chose to work on enabling TLS/SSL encryption on library websites using Let’s Encrypt. Digital privacy has been a hot topic of discussion in libraries lately, most prominently around the efforts of the Library Freedom Project. Enabling TLS/SSL (HTTPS) encryption on library websites and online catalogues is among the first steps recommended by the LITA Patron Privacy Technologies Interest Group to protect the privacy of patrons.
Unfortunately, setting up TLS/SSL encryption is a fairly complex (and costly) process. While large institutions often manage their own web servers and have complete control over them, many small public libraries resort to web hosting services. Some of these services will arrange for HTTPS certificates at a premium, but the process is often not straightforward. Enter Let’s Encrypt, which opened as a public beta service last December. As a “free, automated, and open certificate authority (CA)”, Let’s Encrypt aims to enable anyone who owns a domain name to obtain a trusted certificate at no cost, and more easily than through commercial certificate authorities.
With the helpful support of Dan Scott, we spent the morning trying to install and run Let’s Encrypt, generate certificates and enable them on our own personal websites and, for one brave librarian, on her live public library page.
The first step was to download and run Let’s Encrypt. We all had Mac laptops, so the following may only apply to Mac OS X. The source code for generating certificates is distributed on GitHub, and the steps below, adapted from this tutorial, were required to make it work.
- Install git. We all had different setups on our laptops. Using Homebrew to download and install git as well as the other required packages worked well. As for me, I found out that recently upgrading the OS to El Capitan had broken my Xcode toolchain, including git. This fix found on Stack Exchange worked well to restore the Xcode command-line tools without having to go to the trouble of downloading the whole Xcode package:
$ xcode-select --install
- Download (git clone) Let’s Encrypt from GitHub:
$ git clone https://github.com/letsencrypt/letsencrypt
$ cd letsencrypt
- Install any remaining dependencies. It turns out that letsencrypt-auto checks for missing dependencies every time it is run, so running the following will install everything that is needed using Homebrew (and install Homebrew itself if not already present):
$ ./letsencrypt-auto --help
Since it checks dependencies at every run and will potentially want to install missing ones in /usr/bin, Let’s Encrypt will request root access every time it is run. This is not only a bit unexpected and unsettling (root access should not be required to generate certificates that will not be applied locally), but it also causes problems when retrieving the generated certificates, as we will see below.
Once Let’s Encrypt was up and running on our laptops, the next step was to generate the certificates. By default, letsencrypt-auto assumes it is run on the same server that hosts the website, but in our case we wanted to set up encryption on hosted domains. For this, we needed to tell Let’s Encrypt to only generate the certificates, which we would then upload to our hosting providers. This is done using the certonly subcommand with the manual plugin. The syntax we used was:
$ ./letsencrypt-auto certonly -a manual --rsa-key-size 4096 -d domain.com -d www.domain.com
Note that this requests a single certificate covering both domain.com (without www) and www.domain.com. Again, this will ask for the root password, as discussed above.
The next step is important: it ensures that the user requesting a new certificate for a particular domain has a legitimate claim to that domain. To this end, Let’s Encrypt generates a hash string that needs to be copied to a specific file on the server that hosts the domain in question. The hash, the name of the file and the path are all provided:
Make sure your web server displays the following content at
http://domain.com/.well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS
before continuing:

weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS.Rso39djaklj3sdlkjckxmsne3a

If you don't have HTTP server configured, you can run the following
command on the target server (as root):

mkdir -p /tmp/letsencrypt/public_html/.well-known/acme-challenge
cd /tmp/letsencrypt/public_html
printf "%s" weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS.Rso39djaklj3sdlkjckxmsne3a > .well-known/acme-challenge/weoEFKS-aasksdSKCKEIFIXCNKSKQwa3d35ds30_sDKIS
This can be copied over to the web server using FTP, a web-based file manager, or an SSH connection to the server. If the SSH route is chosen, the mkdir, cd and printf commands provided above can be used to generate the challenge file. Note that the hash and filename shown here are examples; use the actual text provided by Let’s Encrypt.
Once the file is live on the website, hit Enter to finish the process. Let’s Encrypt will visit the newly created page on the website, check that it’s ours and generate the certificates. If more than one domain was specified (e.g. with and without www), the process is repeated for each domain.
The generated key and certificate files are saved in /etc/letsencrypt/live/domain.com. However, since Let’s Encrypt was run as root (see above), these files belong to the root user and trying to display them returns a Permission denied error. To list them, use:
$ sudo ls -l /etc/letsencrypt/live/domain.com
total 32
lrwxr-xr-x 1 root wheel 33 Feb 5 22:03 cert.pem -> ../../archive/domain.com/cert1.pem
lrwxr-xr-x 1 root wheel 34 Feb 5 22:03 chain.pem -> ../../archive/domain.com/chain1.pem
lrwxr-xr-x 1 root wheel 38 Feb 5 22:03 fullchain.pem -> ../../archive/domain.com/fullchain1.pem
lrwxr-xr-x 1 root wheel 36 Feb 5 22:03 privkey.pem -> ../../archive/domain.com/privkey1.pem
Note that if you specified more than one domain when running letsencrypt-auto, Let’s Encrypt will generate a single certificate covering all specified domains. It will appear in /etc/letsencrypt/live under the name of the first domain you specified.
As can be seen, the files stored in /etc/letsencrypt/live are actually symlinks to the files stored in /etc/letsencrypt/archive. Typing sudo every time we want to access those files is bothersome, so we can use chown to change their owner back to us:
$ sudo chown -R username /etc/letsencrypt/
Update: Enabling TLS/SSL on hosted websites
The next and final step is to copy the contents of the generated key and certificate into the SSL setup interface of our web hosts. Unfortunately, it turned out that none of us had this functionality enabled by our service providers, and we ended up having to write to them to request that SSL be enabled.
I asked my provider to enable SSL on my domain, and after a week or so they wrote back saying that they had not only made the necessary configuration changes to allow me to upload my own certificates, but had also implemented a cpanel extension allowing their customers to generate their own certificates without going through the hassle described above! This is excellent news, and I shall soon try it out for my other domains, but for now I wanted to try loading the certificates I had generated. Here’s how I did it (my hosting admin interface uses cpanel 54.0.15).
The previous version of this post linked to this post (in German), which provides a screenshot of a different configuration interface.
Under Security -> SSL/TLS, I chose the “Install and Manage SSL for your site (HTTPS)” option, which installs the certificate and key in one single step. I then selected my main domain name in the drop-down menu, copied the contents of cert.pem into the field labeled CRT and the contents of privkey.pem into the KEY field. I left the CABUNDLE field blank and hit “Install Certificate”.
On Mac OS X, piping anything to pbcopy will place it in the clipboard, ready to be pasted anywhere. This is how I copied the contents of the certificate file before pasting it into the cpanel form:
$ cat cert.pem | pbcopy
And that was it! I got a nice confirmation message that the certificate was installed, detailing all the domain names it covered. It also helpfully listed other domain names I have pointing to my site but not covered by the certificate, warning me that using them will cause browsers to raise a security warning.
I’m using WordPress to manage this blog, and it needed to be reconfigured to serve all inserted images over HTTPS, thus avoiding mixed-content issues.
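In practice, this boils down to making sure every embedded resource URL uses https://. As a rough sketch of the kind of rewrite involved (the function and domain are hypothetical illustrations; in reality a WordPress plugin or a database search-and-replace does this work):

```python
import re

def https_ify(html, domain="domain.com"):
    """Rewrite http:// image sources pointing at our own domain to
    https://, so the page carries no mixed content."""
    pattern = r'src="http://(www\.)?' + re.escape(domain)
    return re.sub(pattern,
                  lambda m: m.group(0).replace("http://", "https://"),
                  html)

print(https_ify('<img src="http://domain.com/photo.jpg">'))
# -> <img src="https://domain.com/photo.jpg">
```

Off-site images are left untouched, which is why third-party embeds can still trigger mixed-content warnings.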
Once this was done, my browser started rewarding me with a nice padlock icon on all my pages, confirming that I had successfully enabled HTTPS on my domain! I also ran it through an SSL checker for good measure.
The availability of Let’s Encrypt as a free and open alternative to commercial certificate authorities is an important step towards a more secure Internet. However, the current beta version of Let’s Encrypt still requires some familiarity with command-line interfaces and web development tools, and an understanding of how TLS/SSL works. Better documentation and a more user-friendly interface would go a long way towards making the process easier. The need to run the client as root is another barrier that will hopefully be lifted as the software evolves. Finally, even though generating certificates is now freely accessible, setting them up on hosted websites still requires service providers to activate this option.
There is an extension for cpanel that claims to allow end users to easily set up Let’s Encrypt certificates on their websites. Maybe as demand grows, hosting providers will begin enabling this extension for their customers, and HTTPS will then truly become an easy option for everyone to use. Since my hosting provider recently enabled this option, I plan to try it out soon and will report back if I do.
Thanks a lot to Dan and the OLA Super Conference Hackfest organisers and facilitators, as well as the other attendees with whom I worked on this project! I certainly learned a lot.
- Several takes on the #DeleteAcademiaEdu thing. First off, let’s not forget that academia.edu is a (for-profit) social network, not a repository. In the academic rat race, every means of getting one’s work out there is justified, so deleting profiles might be a luxury only established folks can afford. Also, not all researchers have access to an institutional repository. Personally, I kept my (few) publications on my former institution’s IR and only linked to them from my academia.edu profile. This is the approach I recommend when asked.
- Thanks to this post on the Duke Library blog, which also advocates a balanced approach to the academia.edu debate, I learned of the existence of an open-source alternative network called VIVO, which has been implemented at many schools and allows for integration with IRs.
- The Last Days of Target (Canadian Business). What brought down the giant in its ill-fated Canada venture? Messy data.
- So yes, a new keyboard, that’s just what we need to save French spelling. I’m rushing to mention it before it becomes necessary to obtain permission from rights holders to insert a hyperlink. Slow clap.
- This fascinating Pinterest board collects photos, floor plans and other artifacts of Toronto’s Carnegie libraries, most of which are still in service today, although their fireplaces are no longer in use.
- Back to the frozen future with these two vehicles designed to handle snow. The steam-powered Xrot 9213 is still able to free up the Bernina line in eastern Switzerland. Meanwhile, we’re not quite sure where the massive Antarctic Snow Cruiser is.
- I’m not done with trains. The NYT has a great piece on the Baikal-Amur Mainline (BAM) branch of the Trans-Siberian.
- Speaking of Siberia, thanks to an amazingly cool weather trick, this Finnish village briefly saw itself reflected in the night sky. Better than an X-Files episode.
- These photos are why I’m trapped in Tokyo forever now. A grittier version of the future emerges from subtle animated GIFs.
- Le 3e lieu m’a tuer: once again, Mlle Salt delivers a precisely aimed rant. Meanwhile, in Seine-et-Marne, the libraries are empty (and for good reason).
- This knee-jerk response to a WSJ op-ed daring to question the relevance of library schools is a bit beside the point, however.
- Don’t buy $12 chocolate bars from bearded hipsters.
- binder makes IPython notebooks hosted on GitHub interactive. Looks awesome.
In the midst of endless report-writing, I was faced with an interesting challenge at work this week. We are trying to aggregate e-book usage data for the members of our consortium, and we were interested in figuring out how well the French-language content is faring compared to the English titles that make up the bulk of the collection.
Unfortunately, one of our vendors does not include language data in either their title lists or the usage reports. Before trying to reconcile the usage reports with the full e-book metadata I could get from the MARC records, I tried running the title list through the guess_language library by way of a simple Python script:
from guess_language import guess_language
import csv

with open('2015-01_ProQuest_titles.csv', 'rb') as csvfile:
    PQreader = csv.DictReader(csvfile)
    for row in PQreader:
        title = row['Title']
        language = guess_language(title.decode('utf-8'))
        print language, title
The results were a disaster:
pt How to Dotcom : A Step by Step Guide to E-Commerce
en My Numbers, My Friends : Popular Lectures on Number Theory
en Foundations of Differential Calculus
en Language and the Internet
en Hollywood & Anti-Semitism : A Cultural History, 1880-1941
de Agape, Eros, Gender : Towards a Pauline Sexual Ethic
la International Law in Antiquity
fr Delinquent-Prone Communities
en Modernist Writing & Reactionary Politics
guess_language works by identifying trigrams, combinations of three characters that are more prevalent in one language than another. While it works reasonably well on whole sentences and short text snippets, the particular construction of a book title seems to throw the method entirely off-kilter.
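To see how little material a book title gives the detector, here is a simplified sketch of trigram extraction (an illustration of the general technique, not guess_language’s actual implementation):

```python
def trigrams(text):
    """All overlapping three-character sequences in a string."""
    text = text.lower()
    return [text[i:i + 3] for i in range(len(text) - 2)]

# A 25-character title yields only 23 trigrams, several of which span
# word boundaries -- thin evidence for any language model.
print(trigrams("Language and the Internet")[:5])
```

With so few trigrams per title, a single unusual word (“Dotcom”, “Agape”) can dominate the statistics, which matches the misclassifications above.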
As I was pondering the next steps, I came to realize that I could also filter titles based on language directly in the vendor database and then export to a CSV file… which solved my issue in seconds but wasn’t half as fun as playing around with computational linguistics. Back to writing reports, I guess.
While I keep working on the draft of my first actual blog post, let’s see if I can also use this space to keep track of what I recently enjoyed reading:
- The Tale of Two Modernist Libraries (Architect Magazine, Dec. 16, 2015) on the ongoing transformation of Philip Johnson’s Boston Public Library and Mies van der Rohe’s MLK library in Washington DC. I’m not convinced by the metal cladding on the BPL building. Mecanoo’s intervention on MLK looks better, although losing that midcentury lobby will be a shame (somebody save those chairs!). They’re repeating the rooftop garden trick that seems to have worked well in Birmingham, why not, although the rounded curves of the roof extension are out of character for a Mies building.
- Huh. Kodak unveiled a new Super-8 camera. Also its CEO has pretty cool looking business cards. They seem to be doing everything they can to save colour film, but I’m not sure it will be worth the hassle.
- iOS apps for coding, transmitting, displaying and dashboarding your work (Finer Things in Tech).
- Tor.com has a fascinating post about the process of fiction publishing, taking the latest George R.R. Martin title as an example. This infographic sums it up nicely.
- The End of the Dark Ages of Podcasting. Just because everyone knows about Serial (whose second season is kind of disappointing, I must say) doesn’t mean podcasts are mainstream yet, at least not until discovery improves.
This week, I also learned that most of American English spelling can be traced to Noah Webster. He axed the extra u’s in colour and neighbour, changed offence to offense and cheque to businesslike check. He’s the one who insisted the letter “z” be pronounced “zee” instead of “zed” (he also wanted “y” to be called “yi” and “w” to become “we”). All this, and much more, from the first chapter of Mary Norris’s Between You & Me, which is a true delight to read1.
- Nonrestrictive clause ↩
So this is it. After a 10-year hiatus, it looks like I’m ready to start writing on my website again. For a long time, timtom.ch has simply redirected visitors to my Flickr page, while other things, like our family blog (still private, sorry) and my notebook were only accessible to those who knew the URL to get there.
Well, hopefully this will change. I have recently felt the need for a blog of sorts again. A place to post tutorials, talk a little bit about my projects, maybe revive some of my photography, discuss random thoughts. But also maybe, just maybe, this Hello World post will remain here alone and unloved. Who knows.