California Suitcase

My daughter recently developed an interest in rocks and minerals, and through exploration, museum shop visits, mail orders and charming the staff of a minerals workshop into selling her a rare chunk of real labradorite from Labrador, her collection has quickly grown past the series of plastic boxes it started with.

48 Onyx

I had been meaning to build her a display case when I stumbled upon an antique movable type drawer at a reclaimed wood store last fall. As I was waiting for the bus with my recent acquisition in hand, a fellow rider asked me what I was going to do with the "California suitcase" I was carrying. As it turned out, he used to work for a Toronto printer, and that was the name he had learned for this particular piece of furniture.

Tinting

Over the following weeks, we cleaned it up, brought the brass fittings to a shine using Bar Keepers Friend and a toothbrush, and then stained the wood to enhance the grain and increase the contrast with the brass.

Rocks on a shelf
We're both very pleased with the result!

Sorting

Impressions from the Artist Project 2017

I like going to art fairs. Even though a future in which I can waltz through such an event with a chequebook and pick up original art for the walls of my perfectly decorated lodgings will probably remain a fiction, it is a fantasy I like indulging in. And sometimes I do end up buying a print, and I've never regretted it. Here are some of my favourite artists from this year's Artist Project Toronto.

Duncan C. McLean – LOW-RES / HIGH-RISE

By reducing the resolution of his photographs of Toronto high-rises, McLean created a series of abstract images. The strict geometry of pixels meets that of the glass facades, while only slight variations in colour hint at the reflections of other buildings. Some look like Sim City blow-ups, others verge on the abstract. Simple, yet clever.

Low-resolution (pixellated) image of a high-rise building, at sunset.
LOW-RES / HIGH-RISE by Duncan McLean

Hugo Cantin – Mini-Cinema

While visually similar to McLean’s pixellated high-rises, Cantin’s film stock collages swap the digital for the analog. And it’s by zooming in, not out, that the abstract becomes real.

1955 Human Skeleton Classroom Documentary – 16mm Film Collage by Hugo Cantin

Emanuel Pavao – Tape Art

Pavao’s medium of choice also comes in rolls. His Toronto street scenes are entirely made of pieces of tape and often capture the grittier, stickier aspects of the city.

Toronto street scene depicting a truck covered in graffiti, made with coloured tape.
Bold As Love – Tape Art by Emanuel Pavao

Marina Malvada

From the streets of Toronto to the realm of Canada's singing astronaut Chris Hadfield, who reportedly owns some of Malvada's otherworldly creations. Her acrylic planetary bodies straddle the border between hyperrealism and the blurriness of imagination. Also, they would make a great cover for the next La Planète Bleue album.

Acrylic painting representing a blue planet in space, with an astronaut floating above it in the lower left corner.
On Top of the World (2014), 24 x 36, acrylic on wood, by Marina Malvada

Jordan Nahmias – The New (Old)

Back to photography, I enjoyed Nahmias’ moody series of shuttered motels and deserted desert towns.

Photograph of the sky and a mountaintop behind an old motel, with a vintage sign advertising
6 & 40 by Jordan Nahmias

Justin Blayney – Figurative artwork

Full circle: back to reducing artwork to its constituent pixels.

A pattern of dots and ovals in grayscale displays the face of a woman.
Sierra by Justin Blayney

Automating and sending speedtest.net data to web services

Recently I got frustrated by a series of broadband service failures. They were hard to diagnose, both for me and for my service provider (who, by the way, was very helpful), because it was difficult to determine exactly when they occurred and whether the issue lay with the broadband connection or with my wireless router. This weekend, inspired by this Make: Magazine feature, I hooked up a Raspberry Pi to my broadband router and set it up to periodically query speedtest.net (using speedtest-cli) and log the results.

I'm not a particular fan of IFTTT, which I find too linear and limiting (not to mention its somewhat arrogant attitude towards third-party content providers), so I looked for alternative ways to post my speedtest results to an online place where I could obsessively check them whenever I'm out of the house. I liked this post describing how to use the same speedtest-cli with Loggly instead of IFTTT. But of course I wasn't satisfied with cobbling together a bunch of Perl one-liners, so I found this script to manipulate speedtest-cli output and modified it so it could log results to a CSV file, or post them to IFTTT, Loggly or any URL that accepts JSON, such as Zapier:

./speedtest-extras.sh [-d] [-c] [-h] [-i secret-key] [-l]
    -d: debugging mode (reuses the previously logged speedtest result instead of querying speedtest - faster)
    -c: CSV mode
    -h: Print CSV header (only if used together with the -c flag)
    -i: IFTTT mode. Takes an IFTTT Maker Channel secret key as argument (required)
    -l: Loggly mode. Takes a Loggly Customer Token as argument (required)
    -j: JSON mode. Posts the result as a JSON document to any URL passed as argument (required)
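
To give a sense of what the JSON mode does, here is a minimal, hypothetical sketch (not the actual script): it runs speedtest-cli, extracts a few numbers and POSTs them as JSON to a webhook with curl. The field names are purely illustrative, not necessarily the keys my script sends.

#!/bin/bash
# Illustrative sketch only - field names and JSON structure are assumptions.
WEBHOOK_URL="$1"                         # e.g. your Zapier webhook URL
RESULT=$(speedtest-cli --simple)         # prints Ping/Download/Upload lines
PING=$(echo "$RESULT" | awk '/Ping/ {print $2}')
DOWN=$(echo "$RESULT" | awk '/Download/ {print $2}')
UP=$(echo "$RESULT" | awk '/Upload/ {print $2}')
curl --silent --request POST \
     --header "Content-Type: application/json" \
     --data "{\"ping_ms\": $PING, \"download_mbps\": $DOWN, \"upload_mbps\": $UP}" \
     "$WEBHOOK_URL"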

My modified command-line interface to speedtest.net is available on GitHub, where I’ve also posted a few usage examples. Here, I will concentrate on how to use it to post to Zapier.

How to automatically send speedtest results to Zapier

First, take care of dependencies. My script makes use of speedtest-cli, which in turn is written in Python. Assuming you’ve got a working install of Python, you can use your favourite package manager to get hold of speedtest-cli:

$ pip install speedtest-cli

Then download my code, either as a ZIP archive or by using git:

$ git clone https://github.com/timtomch/speedtest-cli-extras.git

Once you have downloaded my repository, navigate to the bin folder1 that’s inside it:

$ cd speedtest-cli-extras
$ cd bin

Then you can try running my script in CSV mode to make sure everything is working properly:

$ ./speedtest-extras.sh -c
2016-03-29 02:33:38 UTC;2016-03-29 02:34:19 UTC;Start Communications;XXX.XXX.XX.XXX;SoftLayer Technologies, Inc. (Toronto, ON);8.53 km;17.794 ms;23.97 Mbit/s;1.95 Mbit/s;http://www.speedtest.net/result/XXXXXXXX.png

Depending on the speed of your Internet connection, it should take about a minute to run the test. If you see output similar to the above, things are working.

It is now time to set up Zapier to receive your data. If you haven't got an account yet, go ahead and create one (the free plan should work just fine). Then click the bright red "Make a Zap" button to get started.

Using the search box, choose “Webhooks by Zapier” as your trigger, then select the “Catch Hook” option. Leave the next screen (options) empty and click Next until you reach a screen that should look like this:

Screenshot of the Zapier interface showing which URL to send JSON data to.
Setting up a Webhook on Zapier.

Zapier will issue a custom webhook URL to trigger your events. Copy that URL to the clipboard.

Now run

$ ./speedtest-extras.sh -j <PASTE YOUR ZAPIER URL HERE>

and wait again for the prompt to reappear. If nothing else shows up in your terminal, that's a good sign. Go back to your browser and click the blue "OK, I did this" button. After a short while, Zapier should display a nice green message saying the test was successful. Go ahead and click the "view your hook" link to check what data was sent to Zapier. You should see something like this:

Screenshot of the Zapier interface, showing data submitted via a JSON Webhook.
Testing the Zapier Webhook to ensure the JSON data was properly received.

Then you can decide what to do with that data. I chose to have each event add a new line to a Google Spreadsheet:

Screenshot of the Zapier interface, showing options to set up a Google Spreadsheets app.
Setting up Zapier to add rows to a Google Spreadsheet.

Go ahead and test your setup, then save your Zap once you are happy with the results. Don’t forget to turn on your Zap.

Now, every time you fire

$ ./speedtest-extras.sh -j <PASTE YOUR ZAPIER URL HERE>

Zapier will execute the operation you specified (add a row to a Google Spreadsheet in my example). Now, if you had to manually run the script to get a measurement, that would defeat the whole purpose, so the last step is to add a cron job so the script is run automatically:

$ crontab -e

This lets you edit your crontab. To run a speed test every hour, add the following line to it:

0 * * * * ./absolute/path/to/speedtest-extras.sh -j <YOUR ZAPIER URL>

Note that you need to specify the whole path to the speedtest-extras.sh script in your crontab for it to work.
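
For example, on a Raspberry Pi where the repository was cloned under /home/pi (an assumed location; adjust to wherever you put it), the full crontab entry might look like this, with output appended to a log file so you can tell whether the hourly runs succeed:

0 * * * * /home/pi/speedtest-cli-extras/bin/speedtest-extras.sh -j <YOUR ZAPIER URL> >> /home/pi/speedtest-cron.log 2>&1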

Now watch the data slowly pile up, and start drafting that email to your broadband provider.

Next step: full Raspberry Pi tutorial?

A recent conversation with a friend facing the same issue made me think I could also write up a short tutorial on how to replicate my Raspberry Pi speed tester setup from scratch. Anything to avoid working on more useful things, like getting ahead on my MLIS research or freshening up my resume for this position I’m considering applying to…

  1. This directory structure is not entirely necessary but is a leftover from the original speedtest-cli-extras which I forked.

Link dump 2016/8: Open science, books on a ship, waves in the Alps, maker projects

Writing power tools

As I'm in the early stages of my final research project for my studies in Library Science, I'm looking at different ways to organise my thoughts and materials, and taking it as an opportunity to try some of the tools that are defining the current trend towards open and reproducible research. Things like version control can, however, quickly become complex and might scare away even the bravest. It is certainly one of the most challenging topics I've had to teach during Software Carpentry workshops. And I'm far from understanding all of it. That's why this Plain Person's Guide to Plain Text Social Science looks to be a fantastic resource, laying out a complete workflow using open formats. As for writing the actual paper, there is still no tool that will replace me. Although that might soon change, as a novel written by a computer almost won a literary prize in Japan. The wind-up bird got creative.

Words in transit

I like it when a subway station is being refurbished and traces of the past are briefly brought to light again while the walls are being resurfaced. This happened recently at the Paris Métro Trinité station. Together with a glimpse of swanky typefaces and yellowing memories, one learns in passing that this operation is called décarrossage in French.

The Royal Geographical Society recently digitized a series of photographs documenting Shackleton’s voyage on the Endurance, including this view of his travelling library:

Black and white view of Shackleton's library on board the ship Endurance.
Sir Ernest Shackleton’s library on the Endurance. Source: bbc.com

Of course they couldn’t resist trying to identify the contents of the library. And it is only a matter of time before Shackleton’s collection is dutifully catalogued on LibraryThing.

Earlier this month, the same Royal Geographical Society was also hosting my friend, the land artist Sylvain Meyer, for the annual conference of the Society of Garden Designers. I'm very happy that Sylvain is getting recognized for his fantastic work! I also miss my print of his early piece Ondulation, which I loaned to another friend when I left Switzerland.

Photo of a land-art installation in the Swiss Alps. The earth has been manipulated to represent a set of concentric circles.
Ondulation by Sylvain Meyer

Making extravaganza

I'm currently clearing some backlog in my feed reader. Last week I went through the map folder; today it's the one on making. A bunch of posts were about the insane Wintergatan musical marble machine that took the Internet by storm a couple of weeks ago. Here are some other projects that jumped out at me:

The other night I lost an hour of my life making an origami Darth Vader by following this instructional video by Tadashi Mori. Here’s an origami X-Wing fighter to go with it.


Finally, I like a good project timelapse as much as the next guy, but this one is particularly entertaining.

Link dump 2016/7: Maps

I spent some time this week updating the Maps section of my feed reader, and I was glad to learn that Jonathan Crowe’s Map Room was back on the, ahem, map. Some posts I found interesting: Redrawing the London Tube Map, Mapping Swiss German Dialects1, the Ordnance Survey map of Mars, the 1936 Japanese Rail Network, a map of Paris’ pneumatic tube network and a mention of Around Switzerland in 80 Maps, from my friends at Helvetiq!

Ordnance Survey map of Mars
Ordnance Survey map of Mars [on Flickr]
Crowe also links to Why Children Still Need to Read (and Draw) Maps [PBS], which reminded me I still need to find a good atlas I can share with my daughter. I gave her a map of Ontario for a camping trip we did last summer so she could keep track of our journey. I agree that learning to use (and appreciate) maps is still an important life skill.

Further down in my inbox was this very interesting map of country TLDs, scaled by popularity, by way of the Strange Maps blog:

Map of the countries of the world, scaled according to the number of websites registered with their top level domains.
Map of the online world. Source: nominet.uk

Another good feed I subscribe to is Maps Mania, which unfortunately uses Feedburner to syndicate its content, with the result that only the briefest of snippets gets displayed in feed readers. This is not great for capturing the reader's attention. I'm working on a way to use Huginn to create a nicer-looking feed, but I haven't cracked the Feedburner nut yet. Anyways, here are a few posts that I found interesting: global flight connections map2, Rorschmaps and other Google Maps API hacks, mapping the world's most boring (or interesting) roads by calculating how curvy they are, US Census name explorer and the UK surname map3.

All direct flight connections from Toronto's Pearson Airport.
All direct flight connections from Toronto’s Pearson Airport. Source: flightconnections.com

This heat map of public transit use in Toronto ties in nicely with the everlasting debate on the city's deficient transit infrastructure and the rampant cronyism that shapes it. It is encouraging to see some areas reach a healthy 50% of the population relying on public transit (not bad for a North American city). At the same time, one can't help but see it as a map of income disparity, with well-connected wealthy neighbourhoods sticking to their cars, while many of those relying on public transit live nowhere near a mass transit line and probably don't have much of a choice… Related: the Geography of Car Ownership in England and Wales and the visualization of rail station use in the UK.

Zipscribble maps aim to visualize how countries assign postal/zip codes. From the same source, the Travelling Presidential Candidate Map is a variant of the classic travelling salesman problem, computing the shortest path through all US ZIP codes.

Map of the USA showing the shortest line through all ZIP codes.
The Travelling Presidential Candidate Map. Source: eagereyes.org

Also among my favourite visualizations from last year: the Isochrone maps of Europe by train. And let's not forget the Ultimate Crowdsourced Map of Punny Businesses in America.

Update: How could I have missed the awesome Super Mario style TTC map:

Map of the Toronto subway and RT system, drawn in the style of the Super Mario video game.
Toronto TTC Subway/RT map, Super Mario style. Source: davesgeekyideas.com/

Note that this map already includes the Spadina line's extension, scheduled to open at the end of 2017. Isn't it interesting that many such fan-fiction versions of the TTC map4 include future or imaginary lines? Cartographic wishful thinking…

Update 2: And here’s a map of the Toronto subway with the approximate walking time between stations5:

Map of the Toronto subway, with walking times between stations.
Approximate walking times between TTC stations. Creator: Pavlo Kalyta.
  1. This reminds me of the Chuchichästli-Orakel, which places visitors on a map of Switzerland based on how they pronounce 10 words, with uncanny precision.
  2. A good candidate for a possible post on travel planning tools, I think.
  3. Reminding me that I’m still looking for a data source for my idea of trying to map Switzerland’s patronyms by popularity…
  4. Listing my favourites here would be another idea for a post.
  5. Ditto

Migrating to a WordPress Network

I use this website to host a bunch of (mostly unrelated) services: wikis, my feed reader, and a couple of blogs for family members that I like to keep separate. Those blogs each used to have their own WordPress install, which was not only a pain to keep up to date, but also eventually ate up all the SQL databases and subdomains I was allocated as part of my hosting plan. Setting up my wife's new portfolio was an excellent excuse to find a better solution than firing up yet another CMS instance. I decided to migrate the whole mess to a WordPress Network (previously called WordPress Multi-User), which turned out to be much easier than I thought. Here's how I did it and what I learned along the way.

1. Start fresh

I started with an (almost) fresh install of WordPress, the one that had been powering this blog since November. Since I had used Softaculous to install it, I was able to set up automatic backups and updates while I was at it. I decided to move it to a subdirectory first, to clean things up a bit in my home directory. According to the documentation, this would prevent me from creating subdomain sites (e.g. things like this.timtom.ch1 and that.timtom.ch), but I found a way around this limitation using the WP subdomain plugin; more on that later.

After moving WordPress to a new subdirectory, I checked that everything was still working on the main site. Since I already had a few posts live on that WordPress install, I backed everything up for good measure before I started the process.

2. Enable the Network feature

This is as easy as adding a single line to wp-config.php and clicking through a few options in the admin interface. Since I was now running WordPress in its own directory, I knew running my Network under the subdomain model (this.timtom.ch) would not be straightforward, so I chose to run it under the subdirectory model instead (timtom.ch/that).
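
For reference, the single line in question is WordPress's standard multisite switch, added to wp-config.php above the "stop editing" comment; once it's there, a Network Setup item appears under Tools, and that screen generates a few more constants for you to paste in:

// Enables the Tools > Network Setup screen in the WordPress admin.
define( 'WP_ALLOW_MULTISITE', true );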

Once my embryonic network was set up, I verified that the main site (this blog) was still working fine.

3. Import the other blogs into the new network

For each of the standalone WP instances I wanted to replace, I exported all content using the Tools > Export function.

Back in my Network admin interface, I then created a new site for each of them. I didn't worry too much about naming the new sites, knowing I would fix their addresses later on. I chose unique names that I knew would not conflict with any pages I might want to create on this blog in the future. I named them something like timtom.ch/sub-familynews, etc.

Before importing the WXR files I had exported out of the old sites, I needed to install the WordPress Importer plugin. Despite being an official WP plugin, it unfortunately has a pretty bad reputation, and justifiably so, because of its poor interface and error management. It basically gives no feedback during the import process, which is unnerving and problematic if anything goes wrong. Fortunately, nothing bad happened to me. I imported each blog into the new sites I had just created, making sure to reallocate posts to the users that already had an account on my network, or to allow WordPress to create new accounts for those that didn’t. I chose to import all media, which is important since it will make a new local copy of all images and files that were referenced in the old blogs. Since I planned to delete the old blogs once the process was over, copying media was essential. I then armed myself with patience and a cup of herbal tea while the import plugin did its unnerving thing.
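
As an aside, for those who would rather have some feedback during the import, a possible alternative (one I did not use myself) is WP-CLI's import command, which drives the same WordPress Importer plugin but reports progress in the terminal. The file name and site path below are placeholders:

$ wp import familynews-export.xml --authors=create --url=timtom.ch/sub-familynews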

Once each import was done, I visited the new sites and made sure everything was in order.

4. Create subdomain redirects

I now had a suite of sites (e.g. timtom.ch/sub-familynews, timtom.ch/sub-projectx, etc.) mirroring the old independent WordPress installs that were all living in subdomains (e.g. familynews.timtom.ch). Since I wanted all the old URLs to continue working, I had to map the old URL structure to the new sites.

I started by renaming each of the old blogs: 1. doing a full backup (or maybe I didn't, but if you're following along, you should), 2. changing their URL in Settings > General (the blog will instantly stop working, but not to worry) and 3. renaming the subdomain they operate in accordingly, e.g. in cpanel. I ended up with all my old blogs living at addresses such as projectx-old.timtom.ch. I will likely keep them around for a short while to make sure all is well with the new sites, before deleting them and freeing up some badly needed database space.

Then it was time for magic. I started by installing the WordPress MU Domain Mapping plugin, setting it up (a file with the slightly worrying name of sunrise.php notably has to be copied out of the plugin directory into wp-content) and network-activating it.

I then went back to cpanel and created a new "alias" (also known as a "parked domain") for each of the subdomains I needed for my sites. Yes, even though they were all subdomains of timtom.ch (e.g. projectx.timtom.ch), I still needed to treat them as aliases for this to work:

Creating a new alias in cpanel
Creating a new alias in cpanel.

All the aliases I thus created point to the main timtom.ch directory. At first I thought I had to point them to the subdirectory in which my main WordPress install lives, but that turned out to be wrong. All subdomain aliases have to point to the home directory of your site for this to work.

As an aside, I found that I was able to make this work only by creating "aliases" using the procedure above. Merely adding a type "A" record in my host's DNS using cpanel's "Advanced Zone Editor" didn't work, probably because the IP address my site uses is shared with other customers. The "alias" function presumably adds the extra settings required so that the DNS entries point to the virtual server allocated to me.

Back in WordPress, I then assigned the new subdomains to each of my new sites. The interface to do so is in My Sites > Network Admin > Settings > Domains. Unhelpfully, WordPress MU Domain Mapping’s interface asks for the “site ID” of each site to set this up, which isn’t that obvious to find out. One way I found to identify which ID corresponds to each site is to navigate to the Sites list panel of WordPress Network admin and hover over each site name. The ID will be visible in the URL for each site:

Screenshot of the WordPress Site admin, showing the mouse cursor hovering over a site URL to reveal its ID.
How to identify the site ID of a WordPress Network Site.
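
If WP-CLI happens to be available on your host (an assumption; it was not part of my setup above), listing the network's sites with their IDs from the command line is another quick way to get the same information:

$ wp site list --fields=blog_id,url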

Once the domains were assigned, the last step was to set each site's main URL to the new subdomain; this is done in the Settings > General tab of each site.

Then, for a short while, none of the new sites worked, which was normal, as the new DNS information hadn't had time to propagate through the Internet yet. This can take up to an hour, depending on settings, so it was a good time to do something else, like starting to work on this blog post!

Once the DNS information was fully propagated, I verified that each of my sites was working, each in its own subdomain! I also checked that the permalink structure matched the one I had been using on each of the old sites, so that URLs to existing posts and pages kept working. Migration complete!

I am now the proud owner of a WordPress Network, and I can create a new site in a few minutes. All I need to do once I've created the site is go to cpanel, register a new subdomain using the "Alias" function and then assign it to the new site.

5. Fixing HTTPS

There was one extra step in my case, since I'm using HTTPS encryption on this website (and you should too) and wanted it to work across all my subdomain WordPress sites as well. The certificate I had for timtom.ch did not cover the subdomains I had just created, so my browser raised a security warning when I tried to navigate to my new subdomains over HTTPS. Since I'm now using the Let's Encrypt cpanel module to handle encryption, the only way to alter the certificate to include my new subdomains was to delete the old certificate and immediately create a new one. I made sure to include all the new subdomains when creating the new certificate, and bingo, instant HTTPS across all my sites.

There were a few remaining caveats, however. Since the blogs I had just imported were not using HTTPS in the past, all the images I had embedded from Flickr were using HTTP in their <img> tags and thus triggering mixed-content errors. I therefore had to go through all the affected posts and make sure all <img> tags were using HTTPS.
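
If going through the posts by hand gets tedious, a possible shortcut (again assuming WP-CLI is available, and after taking a fresh database backup) is a bulk search-replace across the network. The Flickr hostname below is only an example pattern, and the --dry-run pass shows what would change before anything is written:

$ wp search-replace 'http://farm' 'https://farm' --network --dry-run
$ wp search-replace 'http://farm' 'https://farm' --network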

  1. N.B. all the URLs and directories mentioned here are examples and not actual URLs to anything on this site

Link dump 2016/5: Facebook colonialism, fake minimalism, SciHub and Foucault’s Pendulum

Migrate from Aperture to Lightroom

Edit (January 2019): It took me a long time to fully import my Aperture libraries into Lightroom! I started this process in February 2016 and migrated my last pictures in January 2019. A few things have changed since then, so I've updated this post accordingly.

A TL;DR version of this post is available below, summarizing the steps in my migration workflow.

The release of Aperture in 2005 aligned well with my growing interest in photography at the time. Like many Apple products before it, Aperture offered a unique combination of powerful features, ease of use, and good integration with other Apple products and workflows. After years of trying out various tools to (first) organize and (as I started researching photographic techniques) retouch my photos, Aperture combined both in a unique way. I was convinced and soon bought a copy of the software. And again, like many Apple products before it, a competitor soon released a very similar product. Adobe Lightroom came out in early 2007 and quickly became the market leader.

Even as I watched many of my photography friends progressively moving over to Lightroom, correctly identifying the winning horse early on, I refused to see the writing on the wall and stuck with Aperture. My behaviour in that respect is pretty acutely described in Alex Allen's essay "Why we treat tech companies like religions" (mentioned on Spark #295):

And when you’re that tied in to an ecosystem, you have a motive to want to defend and validate the choice you’ve made

But of course all my photography friends were right to choose Lightroom, as demonstrated by Apple finally announcing they were pulling the plug in 2014. We are now in 2016, and I believe my period of mourning and denial has lasted long enough. It is time for me to migrate to Lightroom. I spent some time thinking about the best way to do this and experimenting, and I've decided to document the process here in case someone finds it useful.

1. Upgrade to Aperture 3.6

Lightroom has a handy feature for importing Aperture libraries (Adobe being all too happy to welcome the Aperture castaways…). After briefly considering starting from scratch, I decided to use it to at least import all the metadata I had been carefully adding to my pictures (geolocation, tags, etc.). However, the Lightroom import wizard warned me that my version of Aperture wasn't the latest and that this could cause some errors while importing, without offering any details on what exactly would be affected. I wasn't able to find more information about that online either. In the end, I decided to upgrade to the latest version of Aperture, which wasn't entirely straightforward with my setup. I have an older Mac Pro which I am loath to run on the latest OS X version because of performance issues. The last version of Aperture (3.6), however, does not run on anything prior to Yosemite. I ended up upgrading my MacBook Air to El Capitan, installing the latest Aperture on it and using it to upgrade my libraries. Convoluted, yes.

Edit (January 2019): I did eventually upgrade my Mac Pro to El Capitan and was able to finish my migration with fewer complications.

To make things even easier, I had bought the software (and any subsequent updates) as a physical DVD (2005, early adopter, remember?). But Apple at some point decided to distribute updates only through their App Store, and updates are only available to users who have previously bought the software there. A problem for me, since I had purchased a boxed version. After scouring the forums for a solution that did not involve downloading a shady torrent file, I found mention of a user in the same situation who got Apple Support to add Aperture to their App Store purchases. I dutifully called the support line (my favourite activity in the universe). Let's just say it wasn't a short call. It required refusing to purchase Aperture again at some point (sic) and being a big enough pain that I finally got redirected to someone who understood the situation and provided me with a one-time code to "purchase" Aperture at no cost on the App Store. I was then able to upgrade to the latest version. Finally.

2. Split my library into manageable chunks

Now armed with the latest (and last) version of Aperture, I tried starting the Aperture import wizard in Lightroom, which asked me to locate an Aperture library and then started "verifying" it before offering me any further options. I waited several hours before eventually aborting the operation, since there was no sign of how much longer it would take. It was evident my Aperture library was too large to be imported in one go; I would have to split it into smaller chunks. Doing it in batches also made me less nervous about missing a step and then having to start all over again, or messing everything up somehow.

I decided to do this by year. I first created a series of smart collections, filtering by date:

Then I exported each as a separate library. I didn’t check the boxes to copy the originals or the previews into the exported library. I wanted to keep the originals in their current location. As for the previews, I knew I was going to have to rebuild them anyways (see next step).

The only drawback of this method is that Aperture doesn't recreate the full collection structure when creating sub-libraries this way:

Original Aperture library
New Aperture library after filtering by date
After import in Lightroom

I only realized this after importing a few years' worth of images into Lightroom. This is annoying but not really a big problem, as I can always filter images by date or keyword and recreate most of these sub-collections in the future if I ever need to.

3. Preserving adjustments – kind of

Sadly, I knew that all non-destructive adjustments made to my photos in Aperture would be lost when importing into Lightroom. However, the Lightroom importer will try to import "high quality previews" if they exist. So, to at least keep a flattened version of the photos I had edited in Aperture, I made sure to generate a full-size preview of all edited photos. I started by making a smart album containing only images to which I had applied adjustments:

I then set the previews to full-size by selecting the largest possible size and quality setting in Preferences:

Finally I forced Aperture to recreate a full-size preview for all those images by selecting all images in my smart collection, and selecting Photos->Generate Previews while holding down the Option key (this will force generate them even if they are up to date). This increased the size of my Aperture libraries significantly (by a factor of 3), which was expected.

Edit (January 2019): I came back several months (OK years) later to finish the import process for my older images and my library wasn’t increasing in size during that step, which I found odd. I checked the location of the referenced files and sure enough, Aperture had lost track of the location of the originals. I fixed it by using File->Locate Referenced Files before reprocessing the previews.

I repeated this process for each of the yearly libraries I had created in the previous step.

When I first tried to import my new smaller Aperture libraries into Lightroom, I realized the importer was taking forever trying to validate the last opened Aperture library before letting me select which one I wanted to import. To stop wasting time on this step, I created a new empty Aperture library and set it as my current library before switching to Lightroom to start the import process.

4. Import into Lightroom

Edit (January 2019): It turns out that the Aperture importer plugin has a tendency to hang after importing 1000 images or so in recent versions of Lightroom Classic. The workaround is to install version 6.0 of Lightroom Classic (from 2015, then called Photoshop Lightroom CC) and use that version to run the importer. Older versions can be installed from the Adobe Creative Cloud app by clicking the little arrow next to the Install/Update buttons and choosing Manage > Other versions:

I decided to import the photos into a blank Lightroom library, to limit interference with existing workflows. After creating a new library, I then ran the Import from Aperture plugin, selecting each of my smaller Aperture libraries in turn to import from, and setting the target path to be consistent with my new Lightroom workflow.

In the options panel, I made sure to check the first checkbox to import the full-size previews for adjusted images I had generated in the previous step. I also opted to leave the referenced files in their current location on my hard drive.

I then took a deep breath and clicked Import. I found the process to be reasonably quick, about 20-30 minutes for each year of photos. After each import, I checked that the images were all properly referenced, checked a few folders and keywords. At first I was alarmed that I was ending up with many more pictures in Lightroom than in the original Aperture libraries, before remembering that the additional images came from the generated previews.

5. Cleanup

Aside from the fact that the adjustments I had made in Aperture couldn't be imported and would have to be replicated one by one if I ever needed to reprocess an image, I found the transition to be reasonably smooth. I only realized later in the process that using smart albums to split my Aperture library into smaller libraries lost some granularity in my collection structure, but that's a relatively minor annoyance. Some metadata cleanup was necessary; notably, I noticed that accented characters in keywords appeared to have been improperly converted. I decided to fix those over time as needed.

The other cleanup operation I had to do concerned some images (notably scanned images, and some coming from older cameras and smartphones) that were not appearing in the right order. It turns out Lightroom relies on the "Capture date" field to sort images chronologically, and that field wasn't properly filled for those images. This is easily fixed by filtering for images with an unknown date and then adjusting the capture time to mirror the file creation date.

Screenshot of the Lightroom Metadata Filter, displaying photos with unknown dates.
Screenshot of the Lightroom fix date setting.

In the end, I realized that what took me the most time (apart from convincing myself to move to Lightroom already) was figuring out each step of this workflow. Once I had the steps in place, it was a matter of replicating a series of operations and letting the importer do most of the work. Since it took me a bit of research to put this workflow together, I figured it might interest someone else, so I documented it here. Hope it helps!

TL;DR – My Aperture to Lightroom migration workflow

  1. Be sure to run the importer in Lightroom version 6.0 (2015). The importer doesn’t work well in newer versions. You can install older versions of Adobe apps by clicking the little arrow next to the Install button in the Adobe Creative Cloud app.
  2. Make sure you have a recent backup of your Aperture library, just in case.
  3. Check if you have images in the Aperture trash. Either empty it before going further, or if you want to keep your trashed images, apply a tag to them, because Lightroom will import them and mix them up with the others. Note that you can’t add a keyword to images when they’re in the trash, so you might need to put them in a temporary collection, add your tag or mark them as “rejected”. Note that if you use smart albums to split your library, the trashed images will not be imported.
  4. Split your Aperture library into smaller ones (e.g. by date) by creating smart albums with the proper limiters and then exporting each album as an Aperture library.
  5. For each of the smaller Aperture libraries:
    1. Make a smart album with the “Adjustments… are applied” rule
    2. Go to Preferences, under Previews select Photo Preview: Don’t limit
    3. Select all images in your adjustment smart album, press the Option key and select Photos->Generate previews (this will force generate them even if they are up to date). This will increase the size of your library significantly. If it doesn’t, something is probably not quite right! Start by checking that Aperture can locate all original files (File->Locate Referenced Files) and fix if necessary before trying again.
  6. BEFORE starting the import process in Lightroom, make a new empty Aperture library and exit Aperture with this library active. This will reduce the time Lightroom will try to “check” Aperture’s library when launching the importer.
  7. Import each of your Aperture libraries by using Lightroom’s Import from Aperture plugin, making sure to set the target location for referenced files to be consistent with your current Lightroom workflow. Click the Options panel and select all checkboxes to extract the full size previews you created in step 5 and import them as separate files, and to keep the referenced files in their current location instead of duplicating them.
  8. Once everything has been imported, clean up your metadata as required, checking for missing capture dates.

Link dump 2016/4: NoSQL, Mars littering, dams in disgrace and the lost pictures of Antarctica