Eastleigh 10K PB

March 24th, 2019

It was the Eastleigh 10K this morning and I managed to clock up a new PB of 35:34 – almost a minute faster than my previous fastest time on the course and just edging my previous 10K PB from 2003 (although admittedly that was on a hilly course at Ryde on the Isle of Wight).

Preparations had gone pretty well: I managed to complete most of a Runner’s World training plan. A couple of overseas work trips threatened to derail things: the early mornings around Cary were not particularly inspiring – the same cannot be said of running along the waterfront in Marbella! The effect on the waistline was likely to be the most problematic consequence!

Conditions on the day were pretty much spot on, with cool temperatures and a light breeze. I started far too quickly but the pace didn’t feel uncomfortable and it wasn’t until around the 6k mark that I settled into something closer to the pace I had been anticipating. I must have judged it about right as, despite my best efforts, I slowed a fraction for the final kilometre.

I’m looking forward to some time off road running now, but the result was sufficiently encouraging to make me wonder whether I could still break 35 minutes…

Exmoor Half-Term Hols

March 24th, 2019

A (very) belated blog post to record our half-term trip to Exmoor. The destination was selected as somewhere not too far away but where we hadn’t spent much time before. What we’d failed to appreciate, as we drove there on Friday evening, was just how slow the roads would be. They got narrower and narrower until, after passing Porlock Weir, we eventually reached a toll gate. I had started reversing back up the single-track road before Christine checked the details, which indicated that we should pass through the arch, then through a tunnel under the coast path, which would take us to our randomly selected cottage (the left half of the building in the picture).

Reading through the bumph left in the cottage, it transpired that it was part of the Ashley Combe estate, once owned by Lord William King, who honeymooned there with his bride Ada Byron (later to become Ada Lovelace), the mathematician and colleague of Charles Babbage. The tunnels were part of an elaborate scheme to keep the tradesmen out of sight!

We woke to beautiful views north across the Bristol Channel to South Wales and east across the bay to Hurlstone Point. Our first day was fairly relaxed. We explored the private path down to the pebble beach and then drove back to Minehead to stock up on food. Christine and I took it in turns to explore, running from the cottage.

On Sunday, we drove to Bossington at the other end of the bay. From there, we walked up the headland to Selworthy Beacon, returning via the National Trust tearooms at Selworthy and an obligatory cream tea.

For the next two days, I was making the most of my remote working and saving a bit of vacation for later in the year. On Monday, the others went to Dunster Castle where they met up with their cousins and on Tuesday they explored on foot from the cottage.

Come Wednesday, I was back in holiday mode and we headed along the coast to Lynmouth. I then drove whilst the others took the cliff railway up to Lynton. From there we spent a few hours exploring the Valley of the Rocks. We returned to the cottage over Exmoor which looked particularly bleak at this time of year.

For our last full day we went south to Tarr Steps, a 17-span ‘clapper’ bridge. Our walk ended up being a little longer than anticipated as all the other crossing points on the river were under water until we reached Withypool (where the tea shop was not due to open for another week but the village shop ran to ice creams!).

Friday was departure day and we decided to stop off at Stourhead on our way home. The house was closed for its winter cleaning so we joined the masses on a lap of the lake as the sun broke through the morning mist.

It was a fun week but, other than perhaps to walk the Coast Path, I don’t think we’ll be rushing back.

Debugging with Telepresence

February 11th, 2019

I’ve spent the last few days trying to debug an issue on Kubernetes with an external plugin that I’ve been writing in Go for Prow. Prow’s hook component is forwarding on a GitHub webhook and the plugin mounts in various pieces of configuration from the cluster (the Prow config, GitHub OAuth token and the webhook HMAC secret). As a consequence, running the plugin standalone in my dev environment is tricky, but just the sort of scenario that Telepresence is designed for.

The following command is all that is needed to perform a whole host of magic:
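
The exact invocation didn’t survive here, so the following is a sketch of what it would have looked like, based on the behaviour described below (the deployment name, port and plugin flags are all illustrative):

```shell
# Hypothetical reconstruction – adjust the deployment name, port and flags
telepresence --swap-deployment my-plugin-deployment \
  --mount /tmp/tp \
  --expose 8888 \
  --run ./my-plugin --config-path /tmp/tp/etc/config/config.yaml
```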

  • It locates the my-plugin-deployment deployment already running in the cluster and scales down the number of replicas to zero.
  • It executes the my-plugin binary locally and creates a replacement deployment in the cluster that routes traffic to the local process on the exposed port.
  • It finds the volumes defined in the deployment and syncs their contents to /tmp/tp using the mount paths also specified in the deployment.
  • Although not needed in this scenario, it also sets up the normal Kubernetes environment variables around the process and routes network traffic back to the cluster.

Now, it was convenient in this case that the binary already exposed command line arguments for the configuration files so that I could direct them to the alternative path. Failing that, you could always use Telepresence in its --docker-run mode and then mount the files onto the container at the expected location.

And the issue I was trying to debug? I had used the refresh plugin as my starting point and this comment turned out to be very misleading. The call to configAgent.Start() does actually set the logrus log level based on the prow configuration (to info by default). As a consequence, everything was actually working as it should and my debug statements just weren’t outputting anything!

Website backup to pCloud

January 30th, 2019

Another SOC website related posting – this time on the subject of backup. The website is backed up by the club’s current hosting provider (Krystal – who, a year in, I can highly recommend) but I was informed that the club had bought a large quantity of cloud storage for the purpose of storing its map archive and, for belt and braces, it made sense to also include backups of the website there.

As it turned out, the cloud storage was courtesy of pCloud, best described as a Dropbox clone, i.e. the expected interaction patterns are via the web UI, the mobile apps, or sync from the desktop app. A quick search turned up rclone, which describes itself as “rsync for cloud storage” and, amongst its list of supported backends, includes pCloud.

Installation on the hosting provider was straightforward. The configuration process is interactive (opening a browser to log in to pCloud) but the docs also cover how to create the configuration on one machine and copy it across to another. A copy is then as simple as:
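
The command itself isn’t shown in the post; assuming a remote configured under the name pcloud, it would be along the lines of:

```shell
# Copy the local backup directory to pCloud (paths and remote name are made up)
rclone copy /home/soc/backups pcloud:website-backups
```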

I started out looking to use drush arb to create a backup but, as the same hosting is used for a WordPress site, it was easiest in the end just to write a script using tar and mysqldump to create an archive of the file system and database tables. This is then triggered nightly by a cron job. Each backup is around 0.5 GB so I wasn’t too concerned about incremental backup and, with 2 TB of storage to play with, it will be a while before the question of cleaning up old backups comes back to haunt me!
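
The script isn’t included in the post, but a minimal sketch of the approach described (all paths, names and credentials here are made up) would be:

```shell
#!/bin/sh
# Nightly backup: archive the web root and the database, then push to pCloud
STAMP=$(date +%Y-%m-%d)
WORK=/tmp/backup-$STAMP
mkdir -p "$WORK"

# Archive the file system (web root path is illustrative)
tar -czf "$WORK/files.tar.gz" /home/soc/public_html

# Dump the database (credentials picked up from ~/.my.cnf)
mysqldump --single-transaction soc_db | gzip > "$WORK/db.sql.gz"

# Ship to the pCloud remote and clean up locally
rclone copy "$WORK" "pcloud:website-backups/$STAMP"
rm -rf "$WORK"
```

A crontab entry such as `0 2 * * * /home/soc/bin/backup.sh` then takes care of the nightly trigger.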

Drupal 8 Migration

January 28th, 2019

For my sins, I have been involved in the management of our orienteering club’s current website for over 10 years now. Back then, we wanted to make it as easy as possible for club officials and members to contribute content and, after evaluating WordPress, Joomla! and Drupal, we went with Drupal as our Content Management System. The extensibility of Drupal makes it immensely powerful but, as with many open source projects, the rich ecosystem of contributed modules can be both a blessing and a curse.

Although the details have been long forgotten, I do remember that the move from Drupal 6 to 7 was a painful one and so, despite it being over three years since Drupal 8 was released, I was in no rush to migrate. In the end, it was a security vulnerability in one of the modules that wasn’t going to be addressed in v7 that precipitated the move.

The major changes in core Drupal have seemingly been too much for many module contributors to make the move. An initial assessment wasn’t particularly promising: of the fifty-five non-core modules the current site had installed, five were no longer needed in Drupal 8, six had GA v8 versions and a further fourteen had beta versions available. A migration estimate site put the effort involved at several weeks’ worth and, in the end, it probably wasn’t far off!

My first task was to slim down the number of modules installed. Many weren’t actively in use any more (e.g. content_access and views_data_export) and others had simple replacements which had easier migration paths (e.g. swapping out timefield for a simple text field). Ironically, the module with the security flaw was one of those that I disabled but, having started down this path, I was determined to complete a migration.

It was then time to start the actual migration. Thankfully, the process now involves setting up a parallel site, as it would still be weeks before I had anything approaching usable. One of the issues was that no private file path was set up during the migration. Another was that the migrated text formats were using a handler that no longer existed; opening and resaving them fixed that problem. Another of the random error messages required manually modifying the database to remove the upload field from entity.definitions.bundle_field_map in the drup_key_value table (go figure).
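
For anyone hitting the same error, the edit can also be made through Drupal’s key/value API rather than raw SQL – a sketch using drush, assuming the offending entry is keyed under node:

```shell
# Remove the stale 'upload' entry from the bundle field map
# (the "node" key is an assumption – check which entity type is affected)
drush php-eval '
  $store = \Drupal::keyValue("entity.definitions.bundle_field_map");
  $map = $store->get("node");
  unset($map["upload"]);
  $store->set("node", $map);
'
```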

The site makes extensive use of custom content types and views which are finally a part of core Drupal. Views are not part of the default migration though, and, in the end, I just recreated them manually. The same was true of all the patterns for pathauto.

At this point, with the styling also re-introduced, the site was ready to go live again, but there were still problems waiting to be found. One was that what used to appear as a date field now appeared as a datetime field in forms. In the end, I decided to test out the new REST capabilities to export the contents of the field and reimport it into a new field with the correct type. The only catch was that there is no querying capability in the REST API, so it was necessary to create a JSON-rendered view listing the required nodes in order to retrieve their IDs so that they could then be processed one by one. The rest was just a short bash script using curl and jq.
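
The script itself isn’t reproduced in the post; its shape, with made-up site URL, credentials, content type and field names, was roughly:

```shell
#!/bin/sh
# Export each node's old date field via REST and patch it into the new field.
# The path /api/nodes-to-fix stands in for the JSON-rendered view of node IDs.
SITE=https://www.example.org
AUTH=admin:password

curl -s "$SITE/api/nodes-to-fix?_format=json" | jq -r '.[].nid' |
while read -r nid; do
  # Pull the old field's value out of the node's JSON representation
  value=$(curl -s -u "$AUTH" "$SITE/node/$nid?_format=json" \
    | jq -c '.field_old_date')
  # Write it back into the replacement field
  curl -s -u "$AUTH" -X PATCH "$SITE/node/$nid?_format=json" \
    -H 'Content-Type: application/json' \
    -d "{\"type\":[{\"target_id\":\"page\"}],\"field_new_date\":$value}"
done
```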

Hopefully, the migration can now be considered complete. The site now uses relatively few custom modules which is, undoubtedly, a good thing for future stability. If the move to Drupal 9 looks anywhere near as painful though, I now know how to extract the entire site content so maybe it will be time to revisit the CMS landscape. I’d hate to think that I’ll still be debugging PHP errors in another ten years’ time!

Another Classic Weekend

November 4th, 2018

It was another two days of racing this weekend. On Saturday, BAOC had an urban race around Winchester based at Peter Symonds College. Christine was resting her knee so it was just the children and myself competing. Their courses were confined to the college grounds with Duncan finishing 5th M12- and Emma 3rd W12- (although they were running the same course and Duncan actually beat Emma).

The navigation wasn’t particularly challenging, with many long legs meaning there was lots of hard running to be done. With a late start, I knew what time I should be aiming for and things became increasingly frantic as I headed into the last five controls. Needless to say, I wasted time on the last two, but still managed to take first place. The time of 43 minutes looks more respectable in the context of the 10k I actually ran rather than the 6.4k quoted for the course length!

Sunday brought the November Classic. We all started today although only because I’d entered Christine by mistake! There was light rain over Hampton Ridge whilst we were out (the picture above was taken later in the day). We met with mixed fortunes. Duncan had a good run, finishing second on M10A. Christine walked round a few controls before returning. Emma was out for over an hour without finding any of her controls. My legs didn’t feel too bad until the last part of the course. My downfall was repeatedly hunting for pits in the bracken which saw me finish in 5th place. Thankfully, no events planned for next weekend!

OMM White

November 4th, 2018

Last weekend it was the OMM in the Black Mountains, South Wales. Christine’s parents had offered to mind the children so Christine and I were running the Medium Score together. There was a biting wind but blue skies as we set off on Saturday morning. There was some early indecision but we soon settled down to a steady mountain marathon pace. As the morning went on, the skies started to look increasingly ominous and, as we crossed one particularly bleak stretch of hillside, the snow began and persisted long enough to paint the mountainside white. We reached the campsite with around twenty minutes to spare – not long enough to have fitted anything else in.

It was a long night in the campsite, made more bearable by being able to chat to Christine’s brother and his wife in the tent next to us. Due to the cold, we both ‘slept’ in all of our clothes, including waterproofs. We were certainly glad to discover that, as third mixed pair, we qualified for the chasing start and had an hour less to spend in the campsite in the morning.

Although we removed a layer, we both kept our waterproofs on for the whole of the second day. Christine’s knee was giving her grief (a likely outcome even before we started the weekend) and, as a consequence, we were setting a pretty stately pace. We reined in our plans as we went round and, although we finished with another 25 minutes to spare, at the speed we were going it still wouldn’t have got us another checkpoint. We were 47th on the second day which brought us down from 13th to 28th over the two days. Still respectable but not what we would have hoped for had we both been fit and healthy. On the plus side, it did mean we could slip away before the prize giving and make it home in reasonable time!

If you watch the promotional video, you’ll catch a brief glimpse of us finishing on the first day around the 1:33 mark. Thanks to Christine’s dad, who purchased the image above of us being reunited with the children at the finish. You can also find our routes from Day 1 and 2 on RouteGadget.

Oracle Code One: Continuous Delivery to Kubernetes with Jenkins and Helm

October 31st, 2018

Last week I was out in San Francisco at Oracle Code One (previously known as JavaOne). I had to wait until Thursday morning to give my session on “Continuous Delivery to Kubernetes with Jenkins and Helm”. This was the same title I presented in almost exactly the same spot back in February at IBM’s Index Conference but there were some significant differences in the content.

Slides: Continuous Delivery to Kubernetes with Jenkins and Helm, David Currie

The first half was much the same. As you can see from the material on SlideShare and GitHub, it covers deploying Jenkins on Kubernetes via Helm and then setting up a pipeline with the Kubernetes plugin to build and deploy an application, again, using Helm. This time, I’d built a custom Jenkins image with the default set of plugins used by the Helm chart pre-installed which improved start-up times in the demo.
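
With Helm 2, as current at the time, standing up Jenkins from the chart is a one-liner; the custom image values here are placeholders (and the chart’s value names have changed between versions):

```shell
# Install the stable Jenkins chart, pointing it at a pre-plugged custom image
helm install --name jenkins stable/jenkins \
  --set Master.Image=myregistry/jenkins-preinstalled \
  --set Master.ImageTag=latest
```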

I had previously mounted in the Docker socket to perform the build but removed that and used kaniko instead. This highlighted one annoyance with the current approach used by the Kubernetes plugin: it uses exec on long-running containers to execute a shell script with the commands defined in the pipeline. The default kaniko image is a scratch image containing just the executor binary – nothing there to keep it alive, nor a shell to execute the script. In his example, Carlos uses the kaniko:debug image which adds a busybox shell but that requires other hoops to be jumped through because the shell is not in the normal location. Instead, I built a kaniko image based on alpine.
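
Such an image only needs the kaniko executor copied on top of a base that provides a shell – a sketch of the Dockerfile (tags and environment variables are illustrative):

```dockerfile
# Start from the official kaniko image purely to grab its files
FROM gcr.io/kaniko-project/executor:latest AS kaniko

# Rebase onto alpine so there is a shell for the Kubernetes plugin to exec
FROM alpine:3.8
COPY --from=kaniko /kaniko /kaniko
ENV PATH=/kaniko:$PATH
ENV SSL_CERT_DIR=/kaniko/ssl/certs
# No ENTRYPOINT: the plugin keeps the container alive and runs the
# pipeline's shell script itself
```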

The biggest difference from earlier in the year was, perhaps unsurprisingly, the inclusion of Jenkins X. I hadn’t really left myself enough time to do it justice. Given the normally terrible conference wifi and the GitHub outage earlier in the week, I had recorded a demo showing initial project creation, promotion, and update. I’ve added a voiceover so you can watch it for yourself below (although you probably want to go full-screen unless you have very good eyesight!).