Ansible Tasks for Installing Single-Binary Software from GitHub Releases
For lots of useful command-line utilities written in languages like Go and Rust that compile down to single binaries, often the easiest way to get the most recent version for your operating system is to simply grab the binary from the project's GitHub Releases page and drop it somewhere in your $PATH. If you are managing a heterogeneous environment, such as servers running different Linux distributions alongside developers on macOS, relying on the native package managers to keep installation, configuration, and versions in sync can range from mildly annoying to wildly impractical. This is even more acute if you need to pin a specific version of a tool. NixOS is a holistic approach to this problem, but if you can't or don't want to go all-in on NixOS, pulling from GitHub releases is one alternative for smaller-scoped use cases.
Here is an example Ansible playbook that can install multiple versions of single-binary software from GitHub releases, using two of my favorite tools, pet and ghq, as examples.
Overview
├── playbook.yml
├── tasks
│   ├── ghq.yml
│   ├── github_release.yml
│   └── pet.yml
├── templates
│   └── pet-config.toml.j2
└── vars
    ├── ghq.yml
    └── pet.yml
playbook.yml
---
- hosts: default
  remote_user: my_user
  vars:
    binary_path_dir: "{{ ansible_env.HOME }}/.local/bin"
    github_release_work_dir: /opt/github
  vars_files:
    - vars/pet.yml
    - vars/ghq.yml
  tasks:
    - name: Ensure necessary directories exist
      ansible.builtin.file:
        path: "{{ item }}"
        state: directory
        owner: my_user
        group: my_user
      become: yes
      become_method: sudo # Presumes my_user has sudo privileges on target machine
      loop:
        - "{{ binary_path_dir }}"
        - "{{ github_release_work_dir }}"
    - name: Install pet
      include_tasks: tasks/pet.yml
    - name: Install ghq
      include_tasks: tasks/ghq.yml
This is the main playbook file, which you would invoke with ansible-playbook. It sets up a workspace in /opt and includes the other task files that install pet and ghq.
vars/
Set variables for configuring the tools here. pet is configured with a TOML file in ~/.config/pet, while ghq can be configured with environment variables.
pet.yml
---
pet:
  general:
    column: 40
    selectcmd: fzf --ansi # Presumes fzf is already installed
    backend: gitlab
    # specify how snippets get sorted (recency (default),
    # -recency, description, -description, command, -command, output, -output)
    sortby: recency
  gitlab:
    file_name: pet-snippet.toml
    access_token: ""
    url: ""
    id: ""
    visibility: private
    auto_sync: "false"
ghq.yml
---
ghq:
  root: "$HOME/projects"
templates/pet-config.toml.j2
And here is the template to generate the TOML config file for pet, using the variables in vars/pet.yml.
[General]
snippetfile = "{{ ansible_env.HOME }}/.config/pet/snippet.toml"
editor = "{{ ansible_env.EDITOR }}"
column = {{ pet.general.column }}
selectcmd = "{{ pet.general.selectcmd }}"
backend = "{{ pet.general.backend }}"
sortby = "{{ pet.general.sortby }}"
{% if pet.github is defined %}
[GitHub]
file_name = "{{ pet.github.file_name }}"
access_token = "{{ pet.github.access_token }}"
gist_id = "{{ pet.github.gist_id }}"
public = "{{ pet.github.public }}"
auto_sync = {{ pet.github.auto_sync }}
{% endif %}
{% if pet.gitlab is defined %}
[GitLab]
file_name = "{{ pet.gitlab.file_name }}"
access_token = "{{ pet.gitlab.access_token }}"
url = "{{ pet.gitlab.url }}"
id = "{{ pet.gitlab.id }}"
visibility = "{{ pet.gitlab.visibility }}"
auto_sync = {{ pet.gitlab.auto_sync }}
{% endif %}
pet can sync your snippets to your GitHub or GitLab account, so the template contains conditional logic to check which one(s) are configured. Note that in vars/pet.yml, auto_sync is set to the string "false" while the template leaves the value unquoted. The rendered value must be a TOML boolean, but Ansible/Jinja would interpret an unquoted YAML false-y value as the Python boolean False and write it as such (with a capital F) into the output, which would be a syntax error for pet's Go TOML parser.
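To make that concrete, here is a minimal sketch (not part of the example repo) of what the unquoted variant would produce:

```yaml
# In vars/pet.yml, an unquoted false is a YAML boolean:
pet:
  gitlab:
    auto_sync: false
# The template line `auto_sync = {{ pet.gitlab.auto_sync }}` would then
# render as:
#   auto_sync = False   <- Python-style capitalization, invalid TOML
# Quoting the value as "false" keeps it a literal string, so the same
# template line renders as valid TOML:
#   auto_sync = false
```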
tasks/github_release.yml
---
- name: Set version dir
  set_fact:
    version_dir: "{{ github_release_work_dir }}/{{ org }}/{{ repo }}/{{ version | default('latest') }}"
- name: Set version data json file
  set_fact:
    version_data_json_file: "{{ github_release_work_dir }}/{{ org }}/{{ repo }}/version_data.json"
- name: Initialize should_update flag to true
  set_fact:
    should_update: true
- name: "Check latest version of {{ org }}/{{ repo }}"
  ansible.builtin.uri:
    url: "https://api.github.com/repos/{{ org }}/{{ repo }}/releases/{{ version | default('latest') }}"
    return_content: yes
  register: version_data
- name: Check if version has been previously downloaded
  ansible.builtin.stat:
    path: "{{ version_data_json_file }}"
  register: previous_download
- name: Check if newer version exists, if latest version requested
  block:
    - name: Read version_data.json
      ansible.builtin.shell: "cat {{ version_data_json_file }}"
      register: result
      ignore_errors: yes
    - name: Set currently installed version JSON data
      set_fact:
        currently_installed_version_json_data: "{{ result.stdout | from_json }}"
    - name: Set github version published_at
      set_fact:
        github_version_published_at: "{{ version_data.json.published_at | to_datetime('%Y-%m-%dT%H:%M:%SZ') }}"
    - name: Set installed version published_at
      set_fact:
        installed_version_published_at: "{{ currently_installed_version_json_data.json.published_at | to_datetime('%Y-%m-%dT%H:%M:%SZ') }}"
    - name: Update should_update flag
      set_fact:
        should_update: false
      when: installed_version_published_at >= github_version_published_at
  when: previous_download.stat.exists and (version is not defined or version == "latest")
- name: Set installed version tag name
  set_fact:
    installed_version_tag_name: "{{ version_data.json.tag_name }}"
- name: "Download {{ repo }} and link to {{ version | default('latest') }} version"
  block:
    - name: Ensure repo download directory exists
      ansible.builtin.file:
        path: "{{ version_dir }}/extract"
        state: directory
    - name: "Install {{ repo }} {{ installed_version_tag_name }}"
      ansible.builtin.unarchive:
        remote_src: yes
        src: "{{ asset.browser_download_url }}"
        dest: "{{ version_dir }}/extract"
      loop: "{{ version_data.json.assets }}"
      loop_control:
        loop_var: asset
      when: "filename_substring|string in asset.name"
    - name: Write version_data to file
      ansible.builtin.copy:
        dest: "{{ version_data_json_file }}"
        content: "{{ version_data }}"
  when: should_update | bool
This is the core task file. It handles fetching a specific version from GitHub and contains the logic to determine whether a newer version should be installed on the target machine. Let's step through it:
- name: Set version dir
  set_fact:
    version_dir: "{{ github_release_work_dir }}/{{ org }}/{{ repo }}/{{ version | default('latest') }}"
- name: Set version data json file
  set_fact:
    version_data_json_file: "{{ github_release_work_dir }}/{{ org }}/{{ repo }}/version_data.json"
- name: Initialize should_update flag to true
  set_fact:
    should_update: true
This sets "facts", essentially Ansible-speak for assigning variables that can be referenced later. The first task sets the workspace path using variables that are meant to be passed in by the task or playbook calling this task file. The second task sets the path where the JSON data returned by the GitHub API will be stored on disk, so the installed version can be compared against newly published versions. The third task is self-explanatory: we assume an update (or a first-time installation if the software has never been installed before) unless we discover that the installed version is already up to date with the latest GitHub release.
- name: "Check latest version of {{ org }}/{{ repo }}"
  ansible.builtin.uri:
    url: "https://api.github.com/repos/{{ org }}/{{ repo }}/releases/{{ version | default('latest') }}"
    return_content: yes
  register: version_data
Releases are included in the GitHub API, so this task uses the API to fetch structured information about the release and stores it in a variable called version_data.
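The API returns JSON; for reference, here is an abbreviated sketch (rendered as YAML, with invented values) of the fields the later tasks rely on. The real payload contains many more fields:

```yaml
tag_name: v1.1.5
published_at: "2021-04-22T00:11:14Z"
assets:
  - name: ghq_linux_amd64.zip
    browser_download_url: https://github.com/x-motemen/ghq/releases/download/v1.1.5/ghq_linux_amd64.zip
```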
- name: Check if version has been previously downloaded
  ansible.builtin.stat:
    path: "{{ version_data_json_file }}"
  register: previous_download
- name: Check if newer version exists, if latest version requested
  block:
    - name: Read version_data.json
      ansible.builtin.shell: "cat {{ version_data_json_file }}"
      register: result
      ignore_errors: yes
    - name: Set currently installed version JSON data
      set_fact:
        currently_installed_version_json_data: "{{ result.stdout | from_json }}"
    - name: Set github version published_at
      set_fact:
        github_version_published_at: "{{ version_data.json.published_at | to_datetime('%Y-%m-%dT%H:%M:%SZ') }}"
    - name: Set installed version published_at
      set_fact:
        installed_version_published_at: "{{ currently_installed_version_json_data.json.published_at | to_datetime('%Y-%m-%dT%H:%M:%SZ') }}"
    - name: Update should_update flag
      set_fact:
        should_update: false
      when: installed_version_published_at >= github_version_published_at
  when: previous_download.stat.exists and (version is not defined or version == "latest")
If the calling playbook task requests the latest version (the default if version is omitted), this block of tasks reads in the version_data.json file and compares the published_at times of the currently installed version and the latest release. It makes use of several Ansible/Jinja2 filters, such as from_json for parsing JSON and to_datetime for parsing datetimes. For some reason, the default to_datetime format is not ISO8601, so I had to pass in the Python format codes explicitly.

If the user requests the latest version and no newer version has been published on GitHub, the should_update flag gets set to false and the remaining tasks will not be run.
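As a standalone illustration of that filter (not part of the playbook), to_datetime's default format is "%Y-%m-%d %H:%M:%S", so the GitHub timestamp format has to be spelled out:

```yaml
- name: Parse an ISO8601 timestamp with explicit format codes
  set_fact:
    # Without the format argument, to_datetime expects
    # "%Y-%m-%d %H:%M:%S" and would fail on the GitHub value
    released_at: "{{ '2021-04-22T00:11:14Z' | to_datetime('%Y-%m-%dT%H:%M:%SZ') }}"
```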
- name: Set installed version tag name
  set_fact:
    installed_version_tag_name: "{{ version_data.json.tag_name }}"
This sets a fact to the latest version's tag name, which may or may not come in handy later. There is no standard format or structure to GitHub releases, and some projects include the tag name in the folders or files that get unpacked; you might need to refer to the path containing that tag name when symlinking the binary.
- name: "Download {{ repo }} and link to {{ version | default('latest') }} version"
  block:
    - name: Ensure repo download directory exists
      ansible.builtin.file:
        path: "{{ version_dir }}/extract"
        state: directory
    - name: "Install {{ repo }} {{ installed_version_tag_name }}"
      ansible.builtin.unarchive:
        remote_src: yes
        src: "{{ asset.browser_download_url }}"
        dest: "{{ version_dir }}/extract"
      loop: "{{ version_data.json.assets }}"
      loop_control:
        loop_var: asset
      when: "filename_substring|string in asset.name"
    - name: Write version_data to file
      ansible.builtin.copy:
        dest: "{{ version_data_json_file }}"
        content: "{{ version_data }}"
Here we actually download the release from GitHub, unpack it into our workspace, and update the version_data.json metadata file. I created a separate containing "extract" directory because if the extracted binary has the same name as the repo it would sometimes cause weird issues, like Ansible seemingly silently failing to extract or discarding the binary when the dest directory had the same basename.

Ansible's unarchive module nicely abstracts away the nitty-gritty, labyrinthine flags and options for unpacking various archive formats. If you pass remote_src: yes and src contains ://, it assumes the source is a remote link and will download and unpack it, all in one step!

We also make use of Ansible loops to iterate over the array of assets from the GitHub API response and find the one that matches the filename_substring variable passed in by the calling playbook or task. A project can have many downloadable assets for the same release version, even for the same operating system and architecture, so be as explicit as possible with the substring: every asset that matches will be unpacked into the same directory. The loop_var is set to asset instead of the default item, just to be on the safe side for a scenario such as this task file being run in an outer loop.
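For instance, a calling playbook could drive this task file from an outer loop that keeps the default item, without colliding with the inner asset loop. A hypothetical sketch, not part of the example repo:

```yaml
- name: Install several tools from GitHub releases
  include_tasks: tasks/github_release.yml
  vars:
    org: "{{ item.org }}"
    repo: "{{ item.repo }}"
    filename_substring: "{{ item.filename_substring }}"
  loop:
    - { org: knqyf263, repo: pet, filename_substring: linux_amd64.tar.gz }
    - { org: x-motemen, repo: ghq, filename_substring: linux_amd64 }
```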
Next are the task files for pet and ghq that utilize this github_release task file.
tasks/pet.yml
---
- name: Fetch pet from GitHub
  include_tasks: github_release.yml
  vars:
    org: knqyf263
    repo: pet
    filename_substring: linux_amd64.tar.gz
- name: "Symlink binary to {{ binary_path_dir }}"
  ansible.builtin.file:
    src: "{{ version_dir }}/extract/pet"
    dest: "{{ binary_path_dir }}/pet"
    state: link
    force: yes
- name: Symlink completions file to /usr/share/zsh/site-functions
  ansible.builtin.file:
    src: "{{ version_dir }}/extract/misc/completions/zsh/_pet"
    dest: /usr/share/zsh/site-functions/_pet
    state: link
    force: yes
    owner: root
    group: root
  become: yes
  become_method: sudo
- name: Ensure config directory exists
  ansible.builtin.file:
    path: "{{ ansible_env.HOME }}/.config/pet"
    state: directory
- name: Generate pet config from template
  ansible.builtin.template:
    src: pet-config.toml.j2
    dest: "{{ ansible_env.HOME }}/.config/pet/config.toml"
The first task invokes the github_release task file and passes in the required variables. You can determine which filename_substring to use by looking at the project's releases page. If you're on macOS, for example, you would set filename_substring to darwin_amd64.

After the github_release task file completes successfully, we should have the extracted binary in our workspace. set_fact makes facts available to subsequent plays and tasks, so we can refer to version_dir to find the binary and symlink it into binary_path_dir.
The pet release also includes shell completions for both bash and zsh. Since I'm a zsh user, I symlink the appropriate completion file from the unpacked workspace to the standard zsh completions directory, /usr/share/zsh/site-functions. Since that location is system-wide, be aware that my_user must have passwordless sudo privileges on the target machine.
Finally, we ensure the configuration directory exists and then generate the TOML file for it from our template.
tasks/ghq.yml
---
- name: Fetch ghq from GitHub
  include_tasks: github_release.yml
  vars:
    org: x-motemen
    repo: ghq
    filename_substring: linux_amd64
- name: "Symlink binary to {{ binary_path_dir }}"
  ansible.builtin.file:
    src: "{{ version_dir }}/extract/ghq_linux_amd64/ghq"
    dest: "{{ binary_path_dir }}/ghq"
    state: link
    force: yes
- name: Symlink completions file to /usr/share/zsh/site-functions
  ansible.builtin.file:
    src: "{{ version_dir }}/extract/ghq_linux_amd64/misc/zsh/_ghq"
    dest: /usr/share/zsh/site-functions/_ghq
    state: link
    force: yes
    owner: root
    group: root
  become: yes
  become_method: sudo
- name: set GHQ_ROOT in zshenv
  ansible.builtin.lineinfile:
    path: "{{ ansible_env.HOME }}/.zshenv"
    search_string: GHQ_ROOT
    line: "export GHQ_ROOT={{ ghq.root }}"
As you can see, it's mostly identical to the pet task file: invoke github_release, then symlink the binary and completions files. Instead of a config file, we set the GHQ_ROOT environment variable in ~/.zshenv (bash users would point the lineinfile path at ~/.bash_profile).
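For bash users, that last task might look something like this sketch (lineinfile's create: yes also handles the case where ~/.bash_profile does not exist yet):

```yaml
- name: set GHQ_ROOT in bash_profile
  ansible.builtin.lineinfile:
    path: "{{ ansible_env.HOME }}/.bash_profile"
    search_string: GHQ_ROOT
    line: "export GHQ_ROOT={{ ghq.root }}"
    create: yes
```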
If you want a specific version of ghq instead of the latest, you can specify version as a var in the task, for example:
- name: Fetch ghq from GitHub
  include_tasks: github_release.yml
  vars:
    org: x-motemen
    repo: ghq
    filename_substring: linux_amd64
    version: v1.1.5
Conclusion
And there you have it: an Ansible-based GitHub release package manager 😜. Obviously this is not going to replace pacman or homebrew anytime soon, but for simple CLI applications in heterogeneous environments I've found this to be a good-enough quick-and-dirty solution.