Yak shaving with Vagrant, Travis-CI and AWS

TL;DR: Don’t use the vagrant package from your distribution if you intend to build plugins.

I’ve just finished setting up a CI pipeline for a personal project. The project has an Ansible playbook that I want to exercise every time there’s a commit or a PR. While completing the task I shaved a yak and narrowly avoided shaving a whole herd. I planned to use Vagrant in my Travis-CI pipeline to start an instance in AWS, run the playbook, look at the result and terminate the instance. Vagrant, Travis-CI and AWS are pretty common tools, so I was surprised at the wrangling involved before I ended up with a solution. I thought I’d document my findings to minimise the chance that others will have the same experience.
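
Before the findings, this is roughly the shape of the setup I was aiming for: a Vagrantfile that uses the vagrant-aws provider to boot an EC2 instance and the Ansible provisioner to run the playbook against it. This is a minimal sketch rather than my actual file - the AMI, region, keypair, key path and playbook name are placeholders.

# Sketch only - placeholder values throughout
Vagrant.configure("2") do |config|
  config.vm.box = "dummy"   # vagrant-aws uses a placeholder box

  config.vm.provider :aws do |aws, override|
    aws.access_key_id     = ENV["AWS_ACCESS_KEY_ID"]
    aws.secret_access_key = ENV["AWS_SECRET_ACCESS_KEY"]
    aws.region            = "ap-southeast-2"
    aws.ami               = "ami-xxxxxxxx"
    aws.instance_type     = "t2.micro"
    aws.keypair_name      = "ci-key"
    override.ssh.username = "ubuntu"
    override.ssh.private_key_path = "~/.ssh/ci-key.pem"
  end

  # Run the playbook once the instance is reachable over SSH
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "playbook.yml"
  end
end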

Finding #1: Travis’ default build agent has an old Vagrant

The default build agent is based on Ubuntu 12.04 LTS Server, which ships with Vagrant 1.0 in its apt repo. The vagrant-aws plugin requires Vagrant 1.2. Fortunately they have an Ubuntu 14.04 LTS Server beta which has a newer Vagrant in its apt repo.

Finding #2: The vagrant-aws plugin won’t build, even with a newer Vagrant, because of missing libraries and tools

$ vagrant plugin install vagrant-aws
Installing the 'vagrant-aws' plugin. This can take a few minutes...
/usr/lib/ruby/1.9.1/rubygems/installer.rb:562:in `rescue in block in build_extensions': ERROR: Failed to build gem native extension. (Gem::Installer::ExtensionBuildError)

        /usr/bin/ruby1.9.1 extconf.rb
/usr/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require': cannot load such file -- mkmf (LoadError)
    from /usr/lib/ruby/1.9.1/rubygems/custom_require.rb:36:in `require'
    from extconf.rb:4:in `<main>'

Searching for the mkmf LoadError in the context of Debian and Ubuntu gives recommendations to install a few development packages, including ruby-dev (some will say a specific version of ruby-dev). It seems that vagrant-aws is a Ruby gem with native extensions, and those are compiled on-box (at least it seems so - I’m a Ruby rookie), so the Ruby development headers and a compiler toolchain are needed. These aren’t installed on the build agent.

$ sudo apt-get install build-essential libxslt-dev libxml2-dev zlib1g-dev ruby-dev

Finding #3: The vagrant-aws plugin needs Ruby version 2.0+

$ vagrant plugin install vagrant-aws
Installing the 'vagrant-aws' plugin. This can take a few minutes...
/usr/lib/ruby/1.9.1/rubygems/installer.rb:388:in `ensure_required_ruby_version_met': json requires Ruby version ~> 2.0. (Gem::InstallError)
    from /usr/lib/ruby/1.9.1/rubygems/installer.rb:156:in `install'
    from /usr/lib/ruby/1.9.1/rubygems/dependency_installer.rb:297:in `block in install'
    from /usr/lib/ruby/1.9.1/rubygems/dependency_installer.rb:270:in `each'
    from /usr/lib/ruby/1.9.1/rubygems/dependency_installer.rb:270:in `each_with_index'

vagrant plugin install wants to use Ruby 1.9.1 to perform the gem build, but that version is too old. Fortunately the build agent has Ruby 2.3.

Finding #4: Ubuntu Vagrant can’t load plugins built with Ruby 2.3

$ gem install --verbose vagrant-aws
...
Successfully installed vagrant-aws-0.7.2
41 gems installed
$ vagrant plugin install /home/travis/.rvm/gems/ruby-2.3.1/cache/vagrant-aws-0.7.2.gem
/usr/lib/ruby/1.9.1/rubygems/format.rb:32:in `from_file_by_path': Cannot load gem at [/home/travis/.rvm/gems/ruby-2.3.1/cache/vagrant-aws-0.7.2.gem] in /home/travis/build/edwinsteele/biblebox-pi (Gem::Exception)
    from /usr/share/vagrant/plugins/commands/plugin/action/install_gem.rb:36:in `call'
    from /usr/lib/ruby/vendor_ruby/vagrant/action/warden.rb:34:in `call'
    from /usr/share/vagrant/plugins/commands/plugin/action/bundler_check.rb:20:in `call'
    from /usr/lib/ruby/vendor_ruby/vagrant/action/warden.rb:34:in `call'
    from /usr/lib/ruby/vendor_ruby/vagrant/action/builder.rb:116:in `call'
...

Now I’m running out of ideas and considering less conventional means, like building on the Travis macOS environment where I can use Homebrew, or moving to another CI hosting provider entirely. Fortunately I stumbled on the answer…

The Answer

From the Vagrant docs:

Beware of system package managers! Some operating system distributions include a vagrant package in their upstream package repos. Please do not install Vagrant in this manner. Typically these packages are missing dependencies or include very outdated versions of Vagrant. If you install via your system’s package manager, it is very likely that you will experience issues. Please use the official installers on the downloads page.

Yeah, I experienced issues. Once I followed the advice it was smooth. Perhaps I should have looked at the official docs sooner!

$ wget -O /tmp/vagrant.deb https://releases.hashicorp.com/vagrant/1.8.7/vagrant_1.8.7_x86_64.deb
$ sudo dpkg -i /tmp/vagrant.deb
$ vagrant plugin install vagrant-aws
$
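
With the official package in place, the Travis side reduces to installing Vagrant and the plugin, bringing the instance up and always tearing it down. This is a sketch of the resulting .travis.yml rather than my exact config - the AWS credentials are assumed to come from encrypted environment variables.

# Sketch of the .travis.yml - AWS credentials come from encrypted env vars
dist: trusty          # the Ubuntu 14.04 environment
sudo: required
language: generic
install:
  - wget -O /tmp/vagrant.deb https://releases.hashicorp.com/vagrant/1.8.7/vagrant_1.8.7_x86_64.deb
  - sudo dpkg -i /tmp/vagrant.deb
  - vagrant plugin install vagrant-aws
script:
  - vagrant up --provider=aws
after_script:
  - vagrant destroy -f   # terminate the instance even if the build failed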

And now I have a CI pipeline running on AWS after each commit. Nice.

Site Security Improvements

I did some security work on this site recently. I was able to get some nice wins without a great investment of time, in part due to the great resources that are available. Here are the areas of work, the resources that I used, and the outcomes:

Content Security Policy (CSP)

A CSP constrains the actions that a web page can take or the actions that can be performed upon it. It allows one to apply the principle of least privilege to a page and site. A CSP allows one to specify constraints like “Only load CSS from these sources”, “Don’t allow this site to be embedded in frames” and “Don’t allow inline JavaScript”. I developed a CSP after reading the HTML5rocks CSP tutorial and Scott Helme’s CSP intro. I validated my policy using Google’s CSP evaluator and Mozilla’s Observatory tool. In order to apply best practices, which include disabling inline JavaScript and CSS, I needed to make a few simple changes to the site. I’ve been conscious to minimise JavaScript and CSS as I’ve developed this site, and it was great to see how that choice made the application of best practices a simple task.
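
For a mostly-static site the policy itself can be quite small. As an illustration (not my exact policy), the nginx directive looks something like this:

# Illustrative only - tighten or loosen each source list to suit the site
add_header Content-Security-Policy "default-src 'none'; script-src 'self'; style-src 'self'; img-src 'self'; font-src 'self'; frame-ancestors 'none'; base-uri 'self'" always;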

Miscellaneous security headers

I implemented X-XSS-Protection, X-Content-Type-Options and X-Frame-Options. While the effect of these headers overlaps a little with CSP, providing them is still a good idea because CSP implementations are inconsistent across browsers and the headers have benefits unrelated to CSP. I learnt about them from Scott Helme’s Response Headers page and Mozilla’s web security guidelines. I validated my setup with the SecurityHeaders validation tool and Mozilla’s Observatory tool.
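
In nginx these are one-liners. The values below are the common restrictive choices, not necessarily the exact ones in my config:

add_header X-Frame-Options "DENY" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-XSS-Protection "1; mode=block" always;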

SSL

I already had a reasonable SSL setup, but while looking at the Mozilla web security guidelines I realised I hadn’t considered that my list of cipher choices would need regular updating (I’d last reviewed them 2 years ago!). Mozilla are good enough to provide nginx config snippets to help with good cipher selection, and their config snippet included an HTTP Strict Transport Security (HSTS) directive. I’d considered HSTS before, but found the SSL certificate renewal process complex enough that I was worried I might accidentally take my site offline around renewal time. Having recently switched my site certificates over to the (awesome) Let’s Encrypt renewal process, I felt comfortable activating HSTS at the same time. I validated my setup with the Qualys SSL Report.
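
The HSTS directive itself is a single header. The max-age below (roughly six months) is only an illustration - pick a value you’re comfortable committing to, since browsers will refuse plain HTTP for that long:

# Illustrative - be sure certificate renewal is solid before choosing a long max-age
add_header Strict-Transport-Security "max-age=15768000" always;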

Outcome

It took about 4 hours to make the changes, and after the changes were applied, this site [1] moved from an A to an A+ on the Qualys SSL Report. The Mozilla Observatory tool gives the site an A+ and the SecurityHeaders.io validator gives it an A. My nginx config is available on GitHub.


  1. Actually, I use Cloudflare as a CDN, so I ran the tests against my origin server.

Travelling through LAX just became a lot easier

I’ve transited Los Angeles Airport for many years on my way to conferences and family visits, and on a recent trip I was thrilled to find that it’s no longer necessary to go through airport security when moving from an international to a domestic flight. While the signposting is pretty bad, one can now move between Tom Bradley International and Terminal 4 (and thus the rest of the domestic terminals) without exiting the terminal and subjecting oneself, and one’s family, to security screening and potential delays.

Hooray.